How Lovebirds Maneuver Rapidly Using Super-Fast Head Saccades and Image Feature Stabilization

PLoS One. 2015 Jun 24;10(6):e0129287. doi: 10.1371/journal.pone.0129287. eCollection 2015.

Abstract

Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkably stereotyped gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke, when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlaying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high-contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that the retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotyped gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow-based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
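The retinal-size cue described above can be made concrete with a little geometry: an object of physical diameter d at distance D subtends a visual angle of 2·arctan(d / 2D), which grows as the bird approaches, independent of the perch's swinging phase at the moment of crossing a threshold. The sketch below illustrates this relationship; the perch diameter, distances, and trigger threshold are illustrative assumptions, not values from the paper.

```python
import math

def retinal_angular_size(diameter_m: float, distance_m: float) -> float:
    """Visual angle (in degrees) subtended by an object of the given
    diameter viewed frontally at the given distance."""
    return math.degrees(2.0 * math.atan(diameter_m / (2.0 * distance_m)))

def landing_triggered(angle_deg: float, threshold_deg: float = 10.0) -> bool:
    # Hypothetical rule: initiate landing once the perch's image on the
    # retina exceeds a fixed angular threshold (threshold is illustrative).
    return angle_deg >= threshold_deg

# A 2 cm perch seen from 0.5 m vs. 0.1 m: the angle grows on approach.
far = retinal_angular_size(0.02, 0.5)
near = retinal_angular_size(0.02, 0.1)
print(f"far: {far:.2f} deg, near: {near:.2f} deg")
print("trigger at 0.1 m:", landing_triggered(near))
```

Note that retinal size alone confounds object size and distance; the abstract's point is that, among the candidate cues, this single quantity suffices to explain when the birds initiate landing, which is what makes it parsimonious.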

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Agapornis / physiology*
  • Animals
  • Behavior, Animal
  • Biomechanical Phenomena
  • Female
  • Flight, Animal / physiology*
  • Head / physiology
  • Humans
  • Image Processing, Computer-Assisted
  • Male
  • Neck Muscles / physiology
  • Orientation / physiology
  • Saccades*
  • Vision, Ocular
  • Visual Perception

Grant support

Human Frontier Science Program, grant RGP0003/2013, http://www.hfsp.org/ funded DK and DL. The Office of Naval Research Multidisciplinary University Research Initiatives Program, grant N00014-10-1-0951, http://www.onr.navy.mil/Science-Technology/Directorates/office-research-discovery-invention/Sponsored-Research/University-Research-Initiatives/MURI.aspx funded DL. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.