Impaired neural encoding of naturalistic audiovisual speech in autism
Visual cues from a speaker’s face can significantly improve speech comprehension in noisy environments through multisensory integration (MSI)—the process by which the brain combines auditory and visual inputs. Individuals with Autism Spectrum Disorder (ASD), however, often show atypical MSI, particu...
| Main Authors: | Theo Vanneau, Michael J. Crosse, John J. Foxe, Sophie Molholm |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-09-01 |
| Series: | NeuroImage |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S1053811925004008 |
Similar Items
- Synchrony perception of audiovisual speech is a reliable, yet individual construct
  by: Liesbeth Gijbels, et al.
  Published: (2025-05-01)
- Automatic understanding of acoustic speech signal pathology
  by: Wiesław WSZOŁEK
  Published: (2014-04-01)
- Selected methods of pathological speech signal analysis
  by: W. WSZOŁEK
  Published: (2014-04-01)
- Vibrotactile speech cues are associated with enhanced auditory processing in middle and superior temporal gyri
  by: Alina Schulte, et al.
  Published: (2025-07-01)
- Advancing automatic speech recognition for low-resource Ghanaian languages: Audio datasets for Akan, Ewe, Dagbani, Dagaare, and Ikposo (Science Data Bank)
  by: Isaac Wiafe, et al.
  Published: (2025-08-01)