While it was already known that blind people can compensate for their sensory deficit with the help of sounds reflected by their surroundings, a new study confirms that the human brain actually has the ability to echolocate like dolphins and bats.
Cetaceans, like bats, are known for their ability to navigate by the rebounds of the ultrasound pulses they emit: this is echolocation. In-depth studies have attested to this ability in blind people, who appear to use it to compensate for their visual deficit. But few studies have looked at this ability in sighted people, whose almost total dependence on visual perception makes it difficult to know whether they can develop the same skill. According to a recent study published in the Journal of Neuroscience, it seems they can.
In an experiment involving eleven sighted people and one blind person, a team of researchers from Ludwig-Maximilians-Universität in Munich set out to determine whether sighted people could estimate the size of a room using self-generated tongue clicks, as blind echolocators do. At the same time, the researchers monitored activity across the subjects' brain regions using functional magnetic resonance imaging (fMRI), allowing the team to analyze the neural mechanisms involved in echolocation in sighted and blind humans.
For this study, the researchers first had to train the sighted subjects in echolocation. To do so, they characterized the acoustic properties of a real building, a small chapel with highly reflective surfaces and a long reverberation time. From this acoustic photograph of the chapel, they could digitally rescale the sound image, making it possible to compress or expand the size of the virtual space at will.
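To make the idea of rescaling an acoustic photograph concrete, here is a minimal Python sketch, not the study's actual pipeline: the room is captured as an impulse response, time-stretching that response approximates a larger room (echoes arrive later and reverberation lasts longer), and convolving a tongue click with it yields the echo a listener would hear. The file names, stretch factors, and the use of simple resampling are illustrative assumptions.

```python
# Minimal sketch: simulate rooms of different sizes by rescaling an
# impulse response (IR), then convolving a click with it.
# Assumes mono WAV files; names and factors are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve, resample

def scale_room(ir: np.ndarray, factor: float) -> np.ndarray:
    # Crude approximation: FFT resampling stretches the echo delays
    # (factor > 1 simulates a larger room) but also shifts the timbre.
    return resample(ir, int(len(ir) * factor))

sr, click = wavfile.read("tongue_click.wav")   # hypothetical recording of a click
_, ir = wavfile.read("chapel_ir.wav")          # hypothetical chapel impulse response

for factor in (0.5, 1.0, 2.0):                 # half-size, original, double-size room
    virtual_ir = scale_room(ir.astype(float), factor)
    echo = fftconvolve(click.astype(float), virtual_ir)  # click as heard in the virtual room
    peak = np.max(np.abs(echo))
    if peak > 0:
        echo = echo / peak                     # normalize to avoid clipping
    wavfile.write(f"echo_x{factor}.wav", sr, echo.astype(np.float32))
```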
Fitted with earphones and a microphone, the subjects were placed in the MRI scanner and then immersed in the virtual space, where they either produced tongue clicks themselves or listened to recordings of them. The echoes corresponding to each size of virtual space were played back through the earphones. It turned out that all the participants, without exception, learned to perceive each of the differences in size, even the smallest. Moreover, the subjects were better at assessing the size of the virtual space when the clicks were "active", i.e. made by themselves, than when they were "passive" (emitted by a machine).
Regarding the neural mechanisms involved, the researchers point out that echolocation requires a high degree of coupling between the sensory and motor cortices. The sound waves generated by the tongue clicks are reflected by the surroundings and picked up by both ears, activating the sensory (auditory) cortex. In sighted subjects, this is followed by noticeable activity in the motor cortex, which drives the tongue and vocal cords to emit new clicks. In the congenitally blind participant, on the other hand, the study showed that the reception of the reflected sounds activated the visual cortex.
"That the primary visual cortex can perform auditory tasks is a remarkable testimony to the plasticity of the human brain," explains Lutz Wiegrebe, one of the researchers involved. Sighted subjects, in contrast, show only relatively weak activation of the visual cortex during the echolocation task. The researchers now plan to develop a dedicated training program to enable blind people to learn to use tongue clicks, master echolocation, and thus better orient themselves in space.