Our overall research goal is to elucidate the computational principles that allow insects, such as flies and bees, to autonomously generate their virtuosic visually guided behaviour and to adapt it to the demands of dynamic, cluttered, and often unpredictable environments.
Autonomous behaviour implies a range of fundamental capabilities that may well generalize across species and are relevant even for autonomous artificial systems: the ability to identify behaviourally relevant objects in the environment, to decide between several behavioural alternatives, and to perform appropriate actions. In many contexts it is, moreover, essential to find a goal or to return to behaviourally relevant locations, such as the animal's nest, often over large distances. Solving such tasks requires the animal to gain, process and, potentially, learn and retrieve specific information about its environment and, especially, about the spatial layout of its surroundings.
To achieve their extraordinary performance in spatial vision tasks, flies and bees have been shown to shape the dynamics of the image flow on their eyes (“optic flow”) through their characteristic behavioural actions. They appear to acquire at least part of their strength as autonomous systems through active interactions with their environment, rather than by simply processing passively gained information about the world. Adaptive processes of widely differing complexity play a key role here. They operate on timescales from milliseconds to hours and comprise adaptive changes in the sensory system as well as genuine learning processes, depending on the behavioural task. In this way, even animals with tiny brains, such as insects, perform extraordinarily well by making optimal use of the closed action–perception loop.
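The way self-motion shapes optic flow can be made concrete with the standard first-order model of image motion on a spherical eye: the flow in a viewing direction d, given the animal's translation T, rotation ω, and the distance D(d) to the environment along d, is −(T − (T·d)d)/D(d) − ω × d. The following is a minimal, illustrative Python sketch of this relation; the viewing geometry and motion parameters are assumptions chosen for demonstration, not values from our experiments.

```python
import numpy as np

def optic_flow(directions, distances, translation, rotation):
    """Optic-flow vectors on a spherical eye for a given self-motion.

    directions:  (N, 3) unit viewing directions.
    distances:   (N,) distance to the environment along each direction.
    translation: (3,) translational velocity of the animal.
    rotation:    (3,) angular velocity of the animal.

    For each direction d the flow is
        -(T - (T . d) d) / D  -  omega x d,
    i.e. a translational component scaled by inverse distance plus a
    distance-independent rotational component.
    """
    d = directions
    t_along = (d @ translation)[:, None] * d            # component of T along d
    translational = -(translation - t_along) / distances[:, None]
    rotational = -np.cross(rotation, d)
    return translational + rotational

# Example: forward translation at 1 m/s combined with a slow yaw turn,
# viewing directions in the horizontal plane, environment 2 m away.
angles = np.deg2rad(np.arange(0, 360, 30))
dirs = np.stack([np.cos(angles), np.sin(angles), np.zeros_like(angles)], axis=1)
dists = np.full(len(angles), 2.0)
flow = optic_flow(dirs, dists, np.array([1.0, 0, 0]), np.array([0, 0, 0.2]))
print(np.linalg.norm(flow, axis=1))   # flow magnitude per viewing direction
```

The decomposition makes explicit why actively shaping the optic flow pays off: only the translational component is scaled by the inverse distance to objects and thus carries information about the spatial layout, whereas the rotational component is independent of the environment.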
To achieve our research goals, we apply quantitative behavioural approaches, such as high-speed cinematography of unrestrained behaviour and walking simulators in virtual-reality environments. In addition, electrophysiological approaches are employed to resolve the underlying neuronal mechanisms. For visual stimulation we use artificial stimuli, but also image sequences that reflect what animals have seen during unrestrained locomotion in different behavioural situations (‘ego-perspective movies’). This facilitates interpreting the computational properties of neurons and neuronal networks in a behavioural context. Our experimental analysis is complemented by computational modelling and by simulation of biological principles of information processing in virtual environments. These simulations, a constituent element of most of our projects, show that the smart biological mechanisms of motion computation and visually guided orientation behaviour might also be helpful when designing technical systems.
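As an illustration of the modelling side, the sketch below implements a correlation-type elementary motion detector (Reichardt detector), the classic model of biological motion computation. It is a minimal, illustrative implementation; the low-pass time constant and the drifting-grating test signal are assumptions for demonstration, not parameters of our models.

```python
import numpy as np

def emd_response(luminance, dt=1e-3, tau=35e-3):
    """Correlation-type elementary motion detector (Reichardt model).

    luminance: (T, N) luminance signal over time at N neighbouring
               photoreceptors, sampled at time step dt.
    Returns the opponent EMD output for each of the N-1 detector pairs.
    """
    # First-order low-pass filter acting as the temporal delay line.
    alpha = dt / (tau + dt)
    delayed = np.zeros_like(luminance)
    for t in range(1, luminance.shape[0]):
        delayed[t] = delayed[t - 1] + alpha * (luminance[t] - delayed[t - 1])

    # Correlate the delayed signal of one photoreceptor with the undelayed
    # signal of its neighbour, in both directions, and subtract the two
    # mirror-symmetric half-detector outputs (opponency).
    rightward = delayed[:, :-1] * luminance[:, 1:]
    leftward = luminance[:, :-1] * delayed[:, 1:]
    return rightward - leftward

# Example: a sine grating drifting rightward across 20 photoreceptors
# yields a direction-selective (signed) mean response.
t = np.arange(0, 1.0, 1e-3)[:, None]                # time axis, 1 ms steps
x = np.arange(20)[None, :]                          # photoreceptor positions
grating = np.sin(2 * np.pi * (0.1 * x - 2.0 * t))   # drifts rightward
print(emd_response(grating).mean())                 # > 0 for this direction
```

The defining feature of this scheme is that direction selectivity emerges purely from delaying and multiplying the signals of neighbouring photoreceptors and subtracting the two half-detectors; no explicit velocity is ever computed, which is part of what makes such mechanisms attractive for technical systems.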
In several of our projects we join forces with colleagues from CITEC as well as from research institutions all over the world.