Project C1 - Crossmodal active perception of human speech and its implications for social learning
PIs: Prof. Dr. Dan Zhang, Prof. Dr. Bo Hong, Dr. Guido Nolte
Project C1 aims to investigate the crossmodal electrophysiological signatures of the motor and auditory modalities during active perception of human speech, and to explore their implications for social learning. Our proposal for the second funding phase builds on our phase-one findings and extends them to higher linguistic levels, namely speech prosody and semantics. The experiments will use EEG, MEG, and intracranial EEG to record neural activity during the comprehension of naturalistic speech. Computational language models will be incorporated into the data analysis. Our results will help reveal the mechanisms of crossmodal speech recognition and learning, and provide new models for improved computerized natural-language-processing systems.
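To illustrate one plausible way computational language models can enter such an analysis, the sketch below derives word-level surprisal values from a pretrained language model; regressors of this kind are commonly aligned with EEG/MEG responses to naturalistic speech. The model choice (GPT-2 via the Hugging Face transformers library), the helper name word_surprisals, and the example sentence are illustrative assumptions, not the project's actual pipeline.

```python
# A minimal sketch, assuming GPT-2 as the language model: compute
# per-word surprisal (in bits) for a passage, yielding a regressor
# that could be related to neural responses during listening.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def word_surprisals(text: str) -> list[tuple[str, float]]:
    """Return (word, surprisal-in-bits) pairs for a text passage."""
    enc = tokenizer(text, return_tensors="pt")
    ids = enc.input_ids[0]
    with torch.no_grad():
        logits = model(enc.input_ids).logits[0]
    # Log-probability of each token given its left context;
    # logits at position i predict the token at position i + 1.
    logprobs = torch.log_softmax(logits[:-1], dim=-1)
    token_sp = -logprobs[torch.arange(len(ids) - 1), ids[1:]] / math.log(2.0)
    # Sum sub-word surprisals into word-level values. The first token
    # has no left context, so it contributes zero surprisal here.
    words: list[tuple[str, float]] = []
    current, acc = tokenizer.decode(ids[0]), 0.0
    for tok_id, sp in zip(ids[1:].tolist(), token_sp.tolist()):
        piece = tokenizer.decode([tok_id])
        if piece.startswith(" "):  # a leading space marks a new word in GPT-2's BPE
            words.append((current.strip(), acc))
            current, acc = piece, sp
        else:
            current += piece
            acc += sp
    words.append((current.strip(), acc))
    return words


for word, sp in word_surprisals("The child listened to the story."):
    print(f"{word:>10s}  {sp:6.2f} bits")
```

In a neural analysis, the resulting per-word values would typically be time-aligned to word onsets in the speech stimulus and entered as predictors of the recorded EEG/MEG/intracranial signals, for example in a regression or temporal response function framework.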