People who receive cochlear implants may struggle to adjust to the new sounds they are able to hear, in part because their brains at first have difficulty filtering out the sounds of their own movements. With time and practice, however, the brain can learn to filter out these self-generated sounds more effectively, allowing people to focus on the sounds that matter most to them.
Sounds that arise from our own movements, such as vocalizing or walking, are known as reafferent sounds. The ability to anticipate these sounds and distinguish them from sounds arising in the environment is essential for normal hearing, yet the neural circuits that allow us to learn to do this are not well understood.
To study these circuits, researchers developed a system called acoustic virtual reality (aVR), in which a novel tone is coupled to a mouse's own movements so that the animal learns to associate the sound with its actions. This system allowed the researchers to identify the neural mechanisms that learn to suppress reafferent sounds and to examine the behavioral consequences of this predictable sensorimotor experience.
The researchers found that aVR experience gradually and selectively suppressed auditory cortical responses to the reafferent frequency, in part by strengthening motor cortical activation of auditory cortical inhibitory neurons that respond to the reafferent tone. This plasticity, or ability to change in response to experience, was behaviorally adaptive, as mice that had experienced aVR showed an enhanced ability to detect non-reafferent tones during movement.
Overall, these findings suggest that there is a dynamic sensory filter in the brain that involves motor cortical inputs to the auditory cortex, and that this filter can be shaped by experience to selectively suppress the predictable acoustic consequences of movement.
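To make the idea of a learned, frequency-selective filter more concrete, the sketch below simulates it in a toy model: a motor signal inhibits a set of frequency-tuned auditory channels, and a simple coincidence-based rule strengthens inhibition only onto the channel that is active during movement. The channel count, learning rule, and parameter values are illustrative assumptions, not the circuit model or analysis used in the study.

```python
# A minimal conceptual sketch (not the authors' model): frequency-tuned
# "auditory" units receive tone-evoked excitation plus inhibition from a
# motor-derived signal. A coincidence-based rule strengthens the
# motor-to-inhibitory weight only for the channel driven during movement
# (the reafferent frequency), so suppression becomes frequency-selective.
# All parameter values and the learning rule are illustrative assumptions.

import numpy as np

N_FREQS = 8          # hypothetical tonotopic channels in auditory cortex
REAFFERENT = 3       # index of the tone coupled to movement in "aVR"
LEARNING_RATE = 0.05
DAYS = 60            # number of simulated aVR sessions

def cortical_response(tone, moving, motor_to_inhib):
    """Excitatory response of each channel to a tone, minus motor-driven
    inhibition that is gated by movement."""
    excitation = np.zeros(N_FREQS)
    excitation[tone] = 1.0
    inhibition = motor_to_inhib if moving else np.zeros(N_FREQS)
    return np.clip(excitation - inhibition, 0.0, None)

# Motor-to-inhibitory weights start weak and uniform.
weights = np.full(N_FREQS, 0.05)

# During aVR experience, movement always co-occurs with the reafferent tone;
# channels that are active together with the motor signal receive stronger
# inhibitory drive over time.
for _ in range(DAYS):
    response = cortical_response(REAFFERENT, moving=True, motor_to_inhib=weights)
    weights += LEARNING_RATE * response
    weights = np.clip(weights, 0.0, 1.0)

# After learning, probe responses to each tone during movement.
for tone in range(N_FREQS):
    r = cortical_response(tone, moving=True, motor_to_inhib=weights)[tone]
    label = "reafferent" if tone == REAFFERENT else "non-reafferent"
    print(f"tone {tone} ({label}): movement-gated response = {r:.2f}")
```

Running this sketch shows the movement-gated response at the reafferent frequency falling toward zero over simulated sessions while responses at other frequencies are largely spared, mirroring the selective suppression and the preserved detection of non-reafferent tones described above.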
REFERENCES
Schneider DM, Sundararajan J, Mooney R. A cortical filter that learns to suppress the acoustic consequences of movement. Nature. 2018 Sep;561(7723):391-5.