Projects:


ELECTROPHYSIOLOGY

 

(iii) Visual and Auditory Space Representation

The demands of the environment change frequently, and with them the sensory modality that provides the most reliable spatial information. One moment we might localize an object by sight; the next, in sudden darkness, the sound of the object becomes the more suitable localization cue. To guide behaviour efficiently, our brain must be able to switch between spatial information from different sensory systems, e.g. by creating a more abstract representation of spatial positions that is independent of the sensory modality of the original information source. The parietal cortex is a key structure for visual space representation and additionally contains multisensory areas. One such area is the ventral intraparietal area (VIP). Area VIP is known to receive input from auditory areas, yet auditory responsiveness had never been established physiologically in this cortical area.

In the present study, we are the first to show that VIP neurons respond to auditory stimulation. We mapped auditory and visual receptive field (RF) locations using the setup described below (see Methods). Most of the auditory responsive neurons had a spatially distinct RF within the central 60 by 60 degrees of frontal extrapersonal space. Comparing the auditory RF locations to the respective visual ones, we found that the RFs of most neurons largely overlapped. Note that the original reference frames in which visual and auditory signals are encoded are quite different: visual signals arrive in eye-centered (retinal) coordinates, whereas auditory localization cues are initially head-centered. Yet we show that in area VIP both visual and auditory space are represented in a continuum between eye- and head-centered reference frames. Downstream areas could use this multisensory spatial information in whichever reference frame best suits the current behavioural demands.
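To illustrate how such a continuum can be quantified, here is a minimal Python sketch with made-up numbers (this is not the study's analysis code; all values are hypothetical). The RF centre of a neuron is estimated at each fixation position used in the task; the slope of RF centre versus eye position then indicates the reference frame.

import numpy as np

# Horizontal fixation positions used during mapping (degrees; see Methods)
fix_positions = np.array([-10.0, 0.0, 10.0])

# Hypothetical fitted RF centres in screen coordinates, one per fixation
rf_centres = np.array([-4.0, 1.0, 6.0])

# Regress RF centre on eye position:
#   slope ~ 1 -> RF shifts with the eyes   (eye-centered encoding)
#   slope ~ 0 -> RF stays fixed in space   (head-centered encoding)
#   intermediate slopes -> a continuum between the two, as reported for VIP
slope, intercept = np.polyfit(fix_positions, rf_centres, 1)
print(f"RF shift per degree of eye position: {slope:.2f}")  # here 0.50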


METHODS:


(A) Visual stimulation

(B) Auditory stimulation




During the visual stimulation the monkey was required to fixate a target (a red dot) positioned either in the center of the screen or 10 degrees to the left or right of it. The mapping range was divided into a virtual square grid of 36 patches, each 10 by 10 degrees in size. The visual stimulus consisted of a white bar moving in the preferred visual motion direction of the respective neuron. This stimulus was positioned in randomised order at the 36 patches (see movie above). Receptive field maps were constructed in an offline analysis by averaging the number of spikes evoked by the stimulation of each grid patch.
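A minimal Python sketch of this offline averaging step, with simulated spike counts and an assumed data layout (not the original analysis code):

import numpy as np

N_PATCH = 36                          # 6 x 6 grid, each patch 10 x 10 degrees
rng = np.random.default_rng(0)

# Hypothetical trial list: (patch_index, evoked_spike_count), 10 repetitions
trials = [(p, int(rng.poisson(5))) for _ in range(10) for p in range(N_PATCH)]

# Average the evoked spike count per patch, then reshape to the 6 x 6 grid
counts = np.zeros(N_PATCH)
n_reps = np.zeros(N_PATCH)
for patch, spikes in trials:
    counts[patch] += spikes
    n_reps[patch] += 1
rf_map = (counts / n_reps).reshape(6, 6)  # rows = elevation, columns = azimuth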




For the auditory receptive field mapping we had previously determined the head-related transfer function (HRTF) of each individual monkey. This allowed us to compute a catalogue of auditory stimuli which, when presented over calibrated earphones (as indicated in the movie), simulated free-field stimuli arising from different directions of the monkey's extrapersonal space. We chose to simulate auditory stimulation from the same directions as in the visual mapping paradigm described under (A). This allowed us to compute auditory receptive fields over the same mapping range and in the same manner as in the visual domain.
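A minimal Python sketch of the underlying virtual-acoustics idea, using placeholder impulse responses instead of the measured monkey HRTFs (not the original stimulus-generation code): the source signal is convolved with the left and right head-related impulse responses for the desired direction, yielding a two-channel signal for earphone presentation.

import numpy as np

fs = 44100                                 # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
source = rng.standard_normal(fs // 10)     # 100 ms noise burst

# Placeholder head-related impulse responses for one direction; in the study
# these would come from the individually measured HRTF catalogue
hrir_left = np.zeros(256);  hrir_left[10] = 1.0
hrir_right = np.zeros(256); hrir_right[25] = 0.8

left = np.convolve(source, hrir_left)
right = np.convolve(source, hrir_right)
binaural = np.stack([left, right], axis=1)  # 2-channel earphone signal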



(This study was done together with Prof. Frank Bremmer, Dr. Susanne J. Sterbing-D'Angelo, Dr. Klaus Hartung and Prof. Klaus-Peter Hoffmann at the Department of Zoology and Neurobiology, Ruhr-University Bochum, Germany.)