dc.contributor.advisor: Reilly, Richard
dc.contributor.author: O'Sullivan, Aisling
dc.date.accessioned: 2021-01-27T20:16:56Z
dc.date.available: 2021-01-27T20:16:56Z
dc.date.issued: 2021
dc.date.submitted: 2021
dc.identifier.citation: O'Sullivan, Aisling, The impact of visual speech on neural processing of auditory speech, Trinity College Dublin. School of Engineering, 2021
dc.identifier.other: Y
dc.description: APPROVED
dc.description.abstract: When we listen to someone speak, seeing their face can help us to understand them better, especially when there is background noise or other people speaking at the same time. Research examining the neural processes underlying this benefit has centred on isolated syllables and words presented with multiple repetitions. While these approaches have provided important insights, they fail to capture the rapid dynamics of natural speech. In this thesis, we use natural speech together with electroencephalography (EEG) recordings from humans, applying recently developed analysis techniques for studying speech processing in more natural settings (Crosse et al., 2016; de Cheveigné et al., 2018), in order to investigate the impact of visual speech on auditory speech processing. Our first study shows that attention to visual speech impacts auditory speech tracking, an effect thought to be driven by enhanced visual cortical processing as well as by multisensory interactions when the visual speech matches the attended auditory speech. We also investigated how visual speech impacts auditory spectrogram and phonetic processing by quantifying the strength of the encoding of those features in the EEG using canonical correlation analysis. We found multisensory interactions at both stages of processing, and these interactions were more pronounced at the level of phonetic processing for speech in noise relative to speech in quiet, indicating that listeners rely more on articulatory details from visual speech in challenging listening conditions. The final study in this thesis tested the effect of blurring the mouth while retaining its overall dynamics. This revealed that phonetic encoding was reduced when the mouth was blurred compared with when it was clear, whereas spectrogram encoding was unaffected, suggesting that the details of the mouth provide visual phonetic information that helps to improve understanding of the speech. Together, the findings from these studies support the notion that the integration of audio and visual speech is a flexible, multistage process that adapts to optimize comprehension based on the current listening conditions and the available visual information.
dc.publisher: Trinity College Dublin. School of Engineering. Discipline of Electronic & Elect. Engineering
dc.rights: Y
dc.subject: speech
dc.subject: multisensory integration
dc.subject: EEG
dc.title: The impact of visual speech on neural processing of auditory speech
dc.type: Thesis
dc.type.supercollection: thesis_dissertations
dc.type.supercollection: refereed_publications
dc.type.qualificationlevel: Doctoral
dc.identifier.peoplefinderurl: https://tcdlocalportal.tcd.ie/pls/EnterApex/f?p=800:71:0::::P71_USERNAME:OSULLIA5
dc.identifier.rssinternalid: 223014
dc.rights.ecaccessrights: openAccess
dc.contributor.sponsor: Science Foundation Ireland (SFI)
dc.identifier.uri: http://hdl.handle.net/2262/94842
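
As an informal illustration of the canonical correlation analysis mentioned in the abstract above (in the spirit of de Cheveigné et al., 2018), the sketch below shows one way to quantify how strongly a stimulus feature, such as the speech spectrogram, is encoded in EEG. This is not the thesis code: the sampling rate, lag range, data shapes, and names (lagged, fs, eeg, spect) are illustrative assumptions, and random arrays stand in for real recordings.

import numpy as np
from sklearn.cross_decomposition import CCA

def lagged(x, lags):
    # Stack time-lagged copies of x (time x features) along the feature axis.
    # Only non-negative lags (the EEG response lagging the stimulus) are handled.
    n = x.shape[0]
    cols = []
    for lag in lags:
        shifted = np.zeros_like(x)
        shifted[lag:] = x[:n - lag] if lag else x
        cols.append(shifted)
    return np.concatenate(cols, axis=1)

rng = np.random.default_rng(0)
fs = 64                                      # assumed sampling rate (Hz)
eeg = rng.standard_normal((fs * 60, 32))     # 1 min of 32-channel EEG (placeholder)
spect = rng.standard_normal((fs * 60, 16))   # 16-band spectrogram (placeholder)

# Let the neural response lag the stimulus by roughly 0-250 ms.
X = lagged(spect, range(0, fs // 4))

# Project both datasets onto maximally correlated component pairs.
cca = CCA(n_components=4)
cca.fit(X, eeg)
Xc, Yc = cca.transform(X, eeg)

# Encoding strength: the correlation of each canonical component pair
# (in practice this would be evaluated on held-out data).
r = [np.corrcoef(Xc[:, k], Yc[:, k])[0, 1] for k in range(Xc.shape[1])]
print("canonical correlations:", np.round(r, 3))

Stronger canonical correlations for a given feature (e.g. phonetic features versus the spectrogram) indicate that the feature is more strongly encoded in the EEG, which is the kind of comparison the abstract describes across audio-only and audiovisual conditions.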

