Summary
Conversations often take place in noisy environments like a crowded coffee shop. In the European community, many of these conversations are among bilinguals and in a second language. Although adverse listening conditions have little impact on comprehension in one’s native language, communication can be severely affected in a bilingual’s second language. The neural systems involved in spoken language and the mechanisms that facilitate comprehension have been extensively investigated in monolinguals. However, much is unknown about the neural architecture that supports language comprehension in bilinguals. Bilinguals have, not one, but two systems that map sound-to-meaning. Here we investigate how the two languages interact with one another and how that affects access to meaning under naturalistic listening conditions. A combination of complementary methodologies — behavior, fMRI, and ERP/EEG — will be used to characterize the nature, locus, and temporal dynamics of mapping sound-to-meaning in bilinguals. The research location also provides a rare opportunity to compare two unique balanced bilingual populations that share one common language, Basque; the two groups of bilinguals differ in the degree to which their first (Spanish or French) languages overlap in their sound structure with Basque. The project aims to bridge bilingualism and speech perception research, two rapidly growing areas of study, to provide a unifying neural framework for bilingual spoken language processing. This work will provide the basis for clinical and technological advances that support language comprehension.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/799554
Start date: 01-07-2019
End date: 30-06-2021
Total budget - Public funding: 158 121,60 Euro - 158 121,00 Euro
Cordis data
Status: CLOSED
Call topic: MSCA-IF-2017
Update date: 28-04-2024
Geographical location(s)