How the Brain Locates Audio Targets

A recent study from Duke University has shown how our brains process the locations of visual and audio stimuli differently. The write-up of the paper at Futurity.org describes the previous model as a kind of ‘zone defense’, in which each neuron in the brain is responsible for a different region of space: which neurons are firing in response to a stimulus can tell you where in space the stimulus originated. The researchers discovered, however, that while that’s true for visual stimuli, it’s not the case for audio. For audio stimuli, it’s not which neurons are firing that maps to the location in space, but rather how strongly they are firing.

From the paper’s abstract:

Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.

This means that the same neurons handle both visual and audio localization at the same time, employing two very different methods. The image below shows the difference graphically.
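To make the distinction concrete, here’s a minimal sketch of the two decoding schemes: a place (map) code, where the identity of the most active neuron carries the location, and a rate code, where the overall level of activity does. This is illustrative only, not the paper’s actual read-out algorithm (which combines site and level of activity), and all parameter values (population size, tuning width, firing rates) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of 49 neurons; all numbers are made up for illustration.
preferred = np.linspace(-24, 24, 49)   # each neuron's preferred location (degrees azimuth)
target = 10.0                          # true target location

# Visual response: a circumscribed 'hill' of activity centred on the target.
visual = 80 * np.exp(-0.5 * ((preferred - target) / 6.0) ** 2) + rng.normal(0, 2, 49)

# Auditory response: every neuron fires, with intensity scaling with azimuth.
baseline, gain = 40.0, 1.5
auditory = baseline + gain * target + rng.normal(0, 2, 49)

def decode_place(responses, preferred):
    """Map/place code: read out WHICH neuron is most active."""
    return preferred[np.argmax(responses)]

def decode_rate(responses, baseline, gain):
    """Rate code: read out HOW STRONGLY the population is firing,
    inverting the assumed linear rate-vs-azimuth relationship."""
    return (responses.mean() - baseline) / gain

print("visual, place-decoded: ", decode_place(visual, preferred))                    # ~10 degrees
print("auditory, rate-decoded:", round(decode_rate(auditory, baseline, gain), 1))    # ~10 degrees
```

Note how the visual estimate comes from the position of the peak, while the auditory estimate is indifferent to which particular neurons fire; only the overall rate matters.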

Difference in neuron activity between visual targets (a localized hill of activity) and audio targets (all neurons participating at varying intensity) – Lee J, Groh JM (2014) Different Stimuli, Different Spatial Codes: A Visual Map and an Auditory Rate Code for Oculomotor Space in the Primate Superior Colliculus. PLoS ONE 9(1): e85017. doi:10.1371/journal.pone.0085017

The researchers tested rhesus monkeys responding to various audio (loud sounds) and visual (bright flashing lights) signals, tracking the responses in the brain as the monkeys’ eyes moved to the targets. The research applies only to targets in the horizontal plane, so it doesn’t address how vertical targets are tracked. A very different mechanism is likely at work there, since horizontal location is primarily based on ILD/ITD cues (Interaural Level Difference/Interaural Time Difference), whereas vertical location relies on spectral filtering cues.

As the paper notes:

Finally, it should be noted that here we have only varied the horizontal component of sound location. How SC neurons signal the vertical component of sound location is unknown. Vertical information derives exclusively from position-dependent differences in the frequency filtering properties of the external ear, known as spectral cues. Unlike interaural timing or level differences, which vary monotonically with sound azimuth, the relationship between spectral cues and sound elevation is complex. The SC and other auditory-responsive structures may therefore use a quite different method for encoding this dimension of the auditory scene.
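For a feel of that monotonic relationship between azimuth and interaural time difference, here’s a back-of-the-envelope sketch using Woodworth’s spherical-head approximation. This is a standard textbook model, not something from the study, and the head radius is a typical adult value assumed for the example.

```python
import math

HEAD_RADIUS = 0.0875    # metres, typical adult head (assumed for illustration)
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def itd_seconds(azimuth_deg):
    """Interaural time difference via Woodworth's spherical-head
    approximation: ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

for az in (0, 15, 45, 90):
    print(f"{az:3d} deg -> ITD ~= {itd_seconds(az) * 1e6:6.1f} microseconds")
```

Because ITD grows steadily from 0 µs straight ahead to roughly 650–700 µs at 90°, it maps naturally onto a monotonic rate code; the non-monotonic spectral cues that carry elevation would not fit such a code as neatly.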

Citation: Lee J, Groh JM (2014) Different Stimuli, Different Spatial Codes: A Visual Map and an Auditory Rate Code for Oculomotor Space in the Primate Superior Colliculus. PLoS ONE 9(1): e85017. doi:10.1371/journal.pone.0085017
