Coordinating movement in a complex world: How the midbrain and oculomotor cerebellum encode visual motion originating from realistic scenes to guide locomotion.

As we move about the world, we experience optic flow – the movement of surfaces and objects across our visual field that results from self-motion. Studies of human behavior have shown that optic flow is critical for controlling posture, walking, driving, and navigating complex environments. Deficits in optic flow processing are linked to conditions including vertigo, oscillopsia, ataxias, Parkinson’s disease, and Alzheimer’s disease. Determining how and where the brain processes optic flow is therefore crucial for human health and behavior, yet major gaps in knowledge remain. Typically, optic flow processing is studied by exposing subjects to simple motion patterns. These methods allow tight experimental control, but simple patterns lack the features provided by the real world – features we rely on every day. How and where the brain encodes realistic visual motion to control our movement is almost entirely unknown, which severely limits our ability to treat those with optic flow deficits. This proposal aims to determine how and where the brain processes visual motion originating from realistic scenes, using pigeons as a model system.