Project C1 - Crossmodal speech representations and their implications in speech learning
PIs: Prof. Dr. Bo Hong, Dr. Guido Nolte
Project C1 will investigate the crossmodal mechanisms underlying human speech perception, recognition, and representation, which arise not only from auditory input but also from the somatosensory signals involved in speech articulation. The experiments will collect high-quality, high-resolution data using electrocorticography (ECoG), which records neural activity directly from the cortical surface, while subjects perform language-learning tasks. The goal is to understand the crossmodal interactions (the neural oscillations and long-range coupling between auditory regions in the temporal lobe and sensorimotor regions in centro-frontal cortex) that occur during speech recognition.