How does the brain map sounds into the world?
Year of award: 2023
Grantholders
Dr Jennifer Bizley
University College London, United Kingdom
Project summary
In hearing, spatial information must be computed from interaural cues, from which the brain reconstructs a 3D scene. We readily switch between describing a sound's position relative to ourselves ('head-centered') and relative to external landmarks ('world-centered'). Here we test the hypothesis that mapping sounds into a world-centered reference frame is a key function of auditory cortex (AC).

In Aim 1, we will train animals in a world-centered localisation task before making key manipulations to test how world-centered receptive fields are constructed and anchored by visual cues, and how tuning is established in novel environments.

In Aims 2 and 3, we will employ a sensory-guided navigation task in which animals 'hunt' sounds (Aim 2) or audiovisual stimuli (Aim 3) in a large arena. By preserving the natural timing relationships between perception and action, and by measuring head and eye movements, we will define the active sensing strategies animals employ and how sensory and motor components shape AC activity. Using optogenetics, we will establish which elements of sound-guided action are supported by AC.

In Aim 4, we will combine neural recordings with pathway-specific manipulation of activity to determine which brain regions support AC in mapping sounds into space.
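The head-centered/world-centered distinction at the heart of the project can be made concrete with a toy calculation. The sketch below is purely illustrative (the function name and angle conventions are our own, not part of the project): given a sound's azimuth relative to the head and the head's current yaw in the world, the world-centered azimuth is just their sum, wrapped back into a standard range.

```python
def head_to_world_azimuth(head_azimuth_deg, head_yaw_deg):
    """Convert a sound's head-centered azimuth to a world-centered
    azimuth, given the head's yaw in world coordinates.

    Angles are in degrees; the result is wrapped into [-180, 180).
    """
    world = head_azimuth_deg + head_yaw_deg
    return (world + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)

# A sound 30 deg to the animal's right while the head faces 90 deg
# (east) lies at 120 deg in world coordinates.
print(head_to_world_azimuth(30.0, 90.0))  # 120.0
```

A world-centered receptive field, on this toy view, is one whose preferred `world` azimuth stays fixed as `head_yaw_deg` changes; a head-centered one tracks `head_azimuth_deg` instead.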