How the motor cortex is involved in active inference during speech perception and decision-making is key to understanding human language intelligence, and may inspire the next generation of AI. However, three key questions about the role of the bilateral motor cortices in speech perception remain unanswered. Is the laryngeal motor cortex (LMC) engaged in speech perception as it is in production? How do the bilateral motor cortices cooperate under varying degrees of difficulty? Which specific stages of the perceptual decision-making process are modulated by the bilateral motor cortices?
In a study published in Nature Communications on August 5, a research team led by Dr. DU Yi at the Institute of Psychology of the Chinese Academy of Sciences found that the bilateral LMC in the human brain is causally engaged in multiple stages of speech perception decision-making. This engagement is left-dominant and particularly aids auditory processing under perceptually challenging conditions.
To address these questions, the researchers conducted two experiments. They delivered repetitive transcranial magnetic stimulation (rTMS) in Experiment 1 and theta-burst stimulation (TBS) in Experiment 2 to the left or right motor cortex of healthy adult Mandarin speakers (both the LMC and the tongue motor cortex, TMC, in Experiment 1; the LMC only in Experiment 2). They then tested whether stimulation modulated categorical perceptual decisions on lexical tone (a suprasegmental lexical cue determined by laryngeal gestures) and plosive consonants ([t]/[th], segmental lexical cues determined by both laryngeal voicing and tongue motions), presented with and without background noise. To localize the production-related dorsal LMC (dLMC) and TMC, participants first underwent a functional magnetic resonance imaging (fMRI) pretest in which they performed phonation and tongue-movement tasks.
The researchers applied two independent data-analysis pipelines to characterize how TBS of the dLMC modulated behavioral responses in Experiment 2. To detect changes in perceptual sensitivity, they fitted psychometric curves and examined modulations of the curve slope. To evaluate changes at specific stages of the perceptual decision, they applied hierarchical Bayesian estimation of the drift-diffusion model (HDDM) to disentangle which latent decision processes were altered.
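The psychometric slope analysis can be sketched as follows. This is a minimal illustration, not the authors' code: a logistic function is fitted to the proportion of one categorical response across a hypothetical stimulus continuum, and the slope parameter indexes categorical sensitivity; all data values below are invented for illustration.

```python
# Minimal sketch of a psychometric-curve slope analysis (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Logistic psychometric function: x0 = category boundary, k = slope."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical data: a 7-step tone continuum and response proportions
# from a sham session vs. a cTBS session (values are made up).
steps = np.arange(1, 8)
p_sham = np.array([0.02, 0.05, 0.15, 0.50, 0.85, 0.95, 0.98])
p_ctbs = np.array([0.08, 0.15, 0.30, 0.50, 0.70, 0.85, 0.92])

(x0_s, k_s), _ = curve_fit(logistic, steps, p_sham, p0=[4.0, 1.0])
(x0_c, k_c), _ = curve_fit(logistic, steps, p_ctbs, p0=[4.0, 1.0])

# A shallower slope after cTBS would indicate reduced categorical sensitivity.
print(f"slope sham: {k_s:.2f}, slope cTBS: {k_c:.2f}")
```

In this toy example the cTBS curve is flatter, so its fitted slope comes out smaller, which is the kind of change the slope analysis is designed to detect.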
Psychometric curve slope analyses showed that cTBS over the bilateral dLMC affected both tone and consonant perception: cTBS over the left dLMC impaired tone perception in noise and consonant perception in both quiet and noise, whereas cTBS over the right dLMC impaired consonant perception in noise but not in quiet and did not affect tone perception. HDDM analyses showed that, in all conditions except tone perception in quiet under left dLMC stimulation, cTBS significantly widened the decision boundary (a); cTBS over the left dLMC altered the evidence accumulation rate (v), whereas right dLMC stimulation did not; and cTBS over both the left and right dLMC altered the response bias (z) for consonant perception in noise.
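To make the three HDDM parameters concrete, the generative model they describe can be simulated directly. The sketch below, with invented parameter values, shows the standard drift-diffusion process: evidence drifts at rate v between two boundaries separated by a, starting from a biased point z; widening a (as cTBS did) lengthens the walk and slows responses.

```python
# Illustrative simulation of the drift-diffusion model (DDM) whose parameters
# the HDDM analysis estimates: a = boundary separation, v = drift (evidence
# accumulation) rate, z = relative starting point (response bias).
import numpy as np

def simulate_ddm(v, a, z, dt=0.001, sigma=1.0, rng=None, max_t=5.0):
    """Noisy evidence accumulation between 0 and a; returns (choice, RT)."""
    rng = rng or np.random.default_rng()
    x, t = z * a, 0.0
    while 0.0 < x < a and t < max_t:
        x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x >= a else 0), t

rng = np.random.default_rng(0)
trials = [simulate_ddm(v=1.5, a=1.2, z=0.5, rng=rng) for _ in range(500)]
upper = sum(choice for choice, _ in trials)
mean_rt = np.mean([t for _, t in trials])
# Positive drift pushes most trials to the upper boundary; a larger a or a
# smaller v would stretch the mean reaction time.
print(f"upper-boundary choices: {upper}/500, mean RT {mean_rt:.2f}s")
```

Shifting z away from 0.5 biases choices toward one boundary before any evidence arrives, which is the bias effect the study reported for consonant perception in noise.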
The results reveal an effector-specific involvement of the bilateral dLMC in perceptual decisions on both lexical tone and plosive-consonant voicing, suggesting that the human LMC is causally engaged in speech perception as it is in speech production. They also provide evidence for the redundancy and functional reorganization of neural networks: the left dLMC plays a dominant role, while its right counterpart is crucial only in challenging tasks. Moreover, which stages of the perceptual decision are modulated by the dLMC hinges on the hemisphere and the task difficulty.
"In speech perception, our articulatory motor cortex acts like a denoiser that predicts upcoming words by simulating the embedded motor gestures. Our study provides important empirical evidence that the bilateral laryngeal motor cortices, the motor subregions essential for voicing and pitch control, are also part of such a denoiser system," said Dr. DU, the corresponding author of the study.
These findings expand our knowledge of the mechanisms and temporal dynamics underlying bilateral motor engagement in speech perceptual decision-making. The study also holds implications for clinical translational research on speech-disorder rehabilitation, as well as for the development of more robust AI algorithms with dynamic adaptivity.
Figure 1. Motor cortex helps restore auditory speech representations in noise. Inspired by archaeology, the "relics" of syllables (i.e., auditory representations) are placed on the "land" of the auditory cortex. The "dust" (i.e., noise interfering with the input sound signal) blurs the auditory representations of the syllables. The cerebral speech motor system (the human head) engages in auditory processing much like an archaeologist: it matches its stored "motor templates" with the buried syllable representations, assists the auditory cortex top-down in brushing off the dust, and restores the original content produced by the speaker. Image by WANG Shiyu.
Institute of Psychology, Chinese Academy of Sciences
Beijing 100101, China.