Resting-state functional connectivity changes following audio-tactile speech training

Bibliographic Details
Main Authors: Katarzyna Cieśla, Tomasz Wolak, Amir Amedi
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-04-01
Series: Frontiers in Neuroscience
ISSN: 1662-453X
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2025.1482828/full
Abstract:
Understanding speech in background noise is a challenging task, especially when the signal is also distorted. In a series of previous studies, we have shown that comprehension can improve if, simultaneously with auditory speech, the person receives speech-extracted low-frequency signals on their fingertips. The effect increases after short audio-tactile speech training. In this study, we used resting-state functional magnetic resonance imaging (rsfMRI) to measure spontaneous low-frequency oscillations in the brain to assess training-induced changes in functional connectivity. We observed enhanced functional connectivity (FC) within a right-hemisphere cluster corresponding to the middle temporal motion area (MT), the extrastriate body area (EBA), and the lateral occipital cortex (LOC), which, before the training, was found to be more connected to the bilateral dorsal anterior insula. Furthermore, early visual areas demonstrated a switch from increased connectivity with the auditory cortex before training to increased connectivity with a sensory/multisensory association parietal hub, contralateral to the palm receiving vibrotactile inputs, after training. In addition, the right sensorimotor cortex, including finger representations, was more connected internally after the training. The results altogether can be interpreted within two main complementary frameworks. The first, speech-specific, framework relates to the pre-existing brain connectivity for audio-visual speech processing, including early visual, motion, and body regions involved in lip-reading and gesture analysis under difficult acoustic conditions, upon which the new audio-tactile speech network might be built. The other framework refers to spatial/body awareness and audio-tactile integration, both of which are necessary for performing the task, including in the revealed parietal and insular regions. It is possible that an extended training period is necessary to directly strengthen functional connections between the auditory and the sensorimotor brain regions for this entirely novel multisensory task. The results contribute to a better understanding of the largely unknown neuronal mechanisms underlying tactile speech benefits for speech comprehension and may be relevant for rehabilitation in the hearing-impaired population.
DOI: 10.3389/fnins.2025.1482828
Author affiliations:
Katarzyna Cieśla: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, Herzliya, Israel; The Ruth and Meir Rosenthal Brain Imaging Center, Reichman University, Herzliya, Israel; World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
Tomasz Wolak: World Hearing Centre, Institute of Physiology and Pathology of Hearing, Warsaw, Poland
Amir Amedi: The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, Herzliya, Israel; The Ruth and Meir Rosenthal Brain Imaging Center, Reichman University, Herzliya, Israel
Subjects: speech comprehension; tactile aid; multisensory training; fMRI; resting-state functional MRI; cochlear implants