Neural mechanisms linking perception to action in zebrafish prey hunting behaviour

Grantholders

  • Dr Paride Antinucci

    University College London

Project summary

Every day, we rely on our senses to interact effectively with our surrounding environment. To do so, our brain has to create 'meaningful images', or perceptions, of sensory stimuli and use these images to decide whether and how to execute appropriate sequences of actions known as 'motor programs'. For example, the perception of an apple (a round red fruit) can lead to a decision (delicious! eat it) and subsequent actions: arm extension, grabbing, arm retraction and so on. Scientists have achieved a good understanding of how the brain creates the neural representations, or activation patterns of brain cells, that underlie the individual steps of this process. However, we are still far from understanding how these neural representations are linked together and coordinated in the brain.

I aim to start tackling this biological problem by finding out how actions are generated from perception, using the larval zebrafish, a model organism in which advanced technologies can be combined to monitor behaviour while brain activity is simultaneously recorded or manipulated. I aim to reveal key cellular substrates and mechanisms underlying the execution of a well-described visual behaviour, prey hunting, in response to defined visual stimuli.