Session
Session A: 9:30-11:30 AM
Poster Assignment
30
Department
Molecular, Cellular, and Developmental Biology
Presenter(s)
William Ou
Mentor(s)
Michael Beyeler
Title
Gradient Navigation
Abstract
Human and animal navigation often relies on interpreting gradual changes in sensory intensity to locate a source, such as an odor, sound, or light. Prior work in fruit flies, zebrafish, and gastropods shows that even simple nervous systems can follow unidimensional sensory gradients by sampling environmental changes over time and space. Yet little is known about how humans navigate similarly minimal sensory landscapes. This project uses a custom-built virtual reality (VR) system to test how accurately humans can locate the maximum point of a luminance gradient when visual information is restricted to a single degree of freedom: a solid color whose intensity varies with position. To examine the role of embodied movement, participants will complete the navigation task either in full-body VR or in a desktop-based condition. We hypothesize that full-body VR will yield more precise and efficient navigation than desktop navigation. By comparing these conditions, we hope to clarify how humans interpret sparse sensory cues.