Mapping audiovisual integration across learning, circuits, and behaviours

Year of award: 2022

Grantholders

  • Dr Philip Coen

    University College London, United Kingdom

Project summary

Combining auditory and visual information to interpret the external environment is vital, whether you are prey, predator or pedestrian. However, despite the ubiquity of this audiovisual integration, the underlying brain regions and circuits remain largely unclear. This is partly because the interpretation of audiovisual signals may be confounded by learning, behavioural context or animal movements. I propose to combine behaviour, electrophysiology and optogenetics to address these problems. First, I will use high-throughput chronic electrophysiology to generate a brain-wide map of audiovisual signals before, during and after mice learn an audiovisual localisation task. This will identify which brain regions combine auditory and visual information, whether this combination changes throughout learning, and the neural computations involved. Second, I will causally test which intracortical and subcortical projections are required for behaviour, using a novel combination of non-invasive laser stimulation and inhibitory opsin expression. Finally, I will determine which of the identified audiovisual computations and circuits are task-specific and which represent general mechanisms that apply to other tasks, such as audiovisual navigation. The results will identify fundamental principles of audiovisual integration that:

  • generalise across behaviours

  • are likely applicable to other sensory combinations

  • provide a foothold to understand why failures in this process are associated with cognitive disorders.