We aim to reconstruct the 'view from the cockpit' of navigating insects in order to test navigation algorithms.
Understanding information processing under natural conditions remains one of the biggest challenges in neuroscience. As far as image processing is concerned, one possible approach is to reconstruct what freely behaving animals see in real life. In this project we develop methods for modelling the three-dimensional structure of natural scenes, including their detailed textures, using laser-scanner- and camera-based scene-reconstruction techniques. This allows us to reconstruct the natural visual input experienced by freely flying and walking insects performing tasks such as homing in complex natural environments.
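To give a flavour of what "reconstructing the view from the cockpit" can involve, here is a minimal sketch (not the project's actual pipeline): given a coloured point cloud standing in for a laser-scanned scene, it renders a coarse equirectangular panorama as seen from an agent's position and heading. The function name, resolution, and point-cloud representation are illustrative assumptions; a real system would use textured meshes and a calibrated camera model.

```python
import numpy as np

def panoramic_view(points, colors, eye, yaw=0.0, res=(18, 36)):
    """Toy sketch: project 3-D scene points into a coarse equirectangular
    image as seen from `eye` with heading `yaw` (radians, about the z axis).
    `points` is (N, 3), `colors` is (N, 3); returns an (h, w, 3) image."""
    d = points - eye
    # Rotate the scene by -yaw so the agent's heading maps to azimuth 0.
    c, s = np.cos(-yaw), np.sin(-yaw)
    x = c * d[:, 0] - s * d[:, 1]
    y = s * d[:, 0] + c * d[:, 1]
    z = d[:, 2]
    r = np.linalg.norm(d, axis=1)
    az = np.arctan2(y, x)                                       # [-pi, pi)
    el = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))  # [-pi/2, pi/2]
    h, w = res
    col = np.clip(((az + np.pi) / (2 * np.pi) * w).astype(int), 0, w - 1)
    row = np.clip(((np.pi / 2 - el) / np.pi * h).astype(int), 0, h - 1)
    img = np.zeros((h, w, 3))
    depth = np.full((h, w), np.inf)
    # Crude z-buffer: the nearest point seen by each pixel wins.
    for i in np.argsort(r):
        if r[i] < depth[row[i], col[i]]:
            depth[row[i], col[i]] = r[i]
            img[row[i], col[i]] = colors[i]
    return img
```

Rendering such panoramas along a recorded flight or walking trajectory then yields the image sequence a navigation algorithm would be tested against.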
Applicants should have expertise and interest in neuroethology, computer vision, and robotics.