Grant Number: 5R01EY016178-02
Project Title: Sensory Integration for Heading Perception
PI Information: ASSOCIATE PROFESSOR GREGORY C. DEANGELIS,
[email protected]
Abstract: DESCRIPTION (provided by applicant):
To navigate through our surroundings, we must accurately perceive our
direction of self-motion (i.e., heading). Heading perception is an
interesting problem in sensory integration, requiring neural circuits
that combine visual motion signals (optic flow) with vestibular signals,
and perhaps also somatosensory and proprioceptive cues. The
multi-sensory nature of heading perception can be appreciated by the
experience of vection, the powerful illusion of self-motion that
accompanies large-field visual motion (e.g., at an IMAX theater).
Although the processing of optic flow has been well studied in visual
and parietal cortices, little is known about how or where visual and
vestibular signals are integrated for heading perception. Areas MSTd and
VIP appear to be promising candidates, for these areas are known to be
involved in processing optic flow and have also been found to contain
vestibular signals regarding head translation. The proposed experiments,
which employ a custom-designed virtual reality system, address three
specific aims regarding the neural basis of heading perception in
trained primates. Aim #1 examines the relative contributions of visual
and vestibular cues to heading selectivity in MSTd/VIP. Specifically, we
test whether the heading activity of neurons is enhanced by congruent
combinations of visual and vestibular cues. Aim #2 tests whether heading
signals derived from visual and vestibular cues are coded in a common
reference frame (eye-centered, head-centered, or intermediate), as might
be expected if these different sensory signals are combined
synergistically to improve heading selectivity. In Aim #3, we test more
directly whether MSTd and VIP contribute to heading perception by
recording from neurons during performance of a heading discrimination
task. Monkeys will perform this task using optic flow alone, vestibular
signals alone, or congruent combinations of the two cues. This will
allow us to test whether MSTd/VIP neurons can account for the
improvement in heading sensitivity seen under cue combination. These
experiments will provide a comprehensive examination of whether MSTd/VIP
neurons are involved in sensory integration for heading perception. Of
clinical relevance, heading perception can be severely impaired in
Alzheimer's disease, and this may contribute to spatial disorientation
and navigational difficulties. By helping to elucidate the brain areas
involved in heading perception, this work may eventually aid in
targeting new Alzheimer's therapies to the appropriate brain regions.
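The "improvement in heading sensitivity seen under cue combination" mentioned in Aim #3 is commonly benchmarked against the maximum-likelihood (optimal) integration rule, under which the combined-cue threshold is below either single-cue threshold. A minimal sketch of that prediction (the function name and threshold values here are illustrative, not from the grant):

```python
import math

def combined_threshold(sigma_vis, sigma_ves):
    """Predicted heading discrimination threshold under optimal
    (maximum-likelihood) combination of visual and vestibular cues.

    Assumes independent Gaussian noise on each cue, so the combined
    variance is the harmonic sum of the single-cue variances.
    """
    return math.sqrt((sigma_vis ** 2 * sigma_ves ** 2) /
                     (sigma_vis ** 2 + sigma_ves ** 2))

# Hypothetical single-cue thresholds (degrees of heading):
vis, ves = 2.0, 3.0
combined = combined_threshold(vis, ves)
# combined is smaller than either single-cue threshold
```

Comparing neuronal sensitivity in MSTd/VIP against this behavioral prediction is one way to test whether these areas can account for the perceptual improvement.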
Thesaurus Terms:
motion perception, neural information processing, visual perception
cue, psychophysics, sensory discrimination
Macaca fascicularis, Macaca mulatta, behavioral /social science research
tag, computer human interaction, computer simulation, saccade, single
cell analysis
Institution: WASHINGTON UNIVERSITY
1 BROOKINGS DR, CAMPUS BOX 1054
SAINT LOUIS, MO 63130-4899
Fiscal Year: 2006
Department: ANATOMY AND NEUROBIOLOGY
Project Start: 01-AUG-2005
Project End: 31-JUL-2007
ICD: NATIONAL EYE INSTITUTE
IRG: CVP

The Journal of Neuroscience, January 17, 2007, 27(3):700-712
Spatial Reference Frames of Visual, Vestibular, and
Multimodal Heading Signals in the Dorsal Subdivision of the Medial
Superior Temporal Area
Christopher R. Fetsch, Sentao Wang, Yong Gu, Gregory C. DeAngelis,* and
Dora E. Angelaki*
Department of Anatomy and Neurobiology, Washington University School of
Medicine, St. Louis, Missouri 63110
Animal preparation.
Subjects were three male rhesus monkeys (Macaca mulatta) weighing 4-7 kg.
Under sterile conditions, monkeys were chronically implanted with a
circular delrin cap for head stabilization as described previously (Gu
et al., 2006a ), as well as one or two scleral search coils for
measuring eye position (Robinson, 1963 ; Judge et al., 1980 ). After
surgical recovery, monkeys were trained to fixate visual targets for
juice rewards using standard operant conditioning techniques. Before
recording experiments, a plastic grid (2 x 4 x 0.5 cm) containing
staggered rows of holes (0.8 mm spacing) was stereotaxically secured to
the inside of the head cap using dental acrylic. The grid was positioned
in the horizontal plane and extended from the midline to the area
overlying the MSTd bilaterally. Vertical microelectrode penetrations
were made via transdural guide tubes inserted in the grid holes. All
procedures were approved by the Institutional Animal Care and Use
Committee at Washington University and were in accordance with National
Institutes of Health guidelines.
Heading stimuli.
During experiments, monkeys were seated comfortably in a primate chair
with their head restrained. The chair was secured to a
6-degrees-of-freedom motion platform (MOOG 6DOF2000E; Moog, East Aurora,
NY) (see Fig. 1A) that allowed physical translation along any axis in 3D
(Gu et al., 2006a ). Visual stimuli and fixation targets were
back-projected (Christie Digital Mirage 2000; Christie Digital Systems, Cypress, CA) onto
a tangent screen positioned 30 cm in front of the monkey and subtending
90° x 90° of visual angle. Optic flow was generated using the OpenGL
graphics library, allowing the accurate simulation of speed, size, and
motion parallax cues experienced during real self-motion. The stimuli
depicted movement of the observer through a random cloud of dots plotted
in a virtual workspace 100 cm wide, 100 cm tall, and 40 cm deep. Stimuli
were viewed binocularly with no disparities added to the display (i.e.,
no stereo cues were present). The projector, screen, and field coil
frame were mounted on the platform and moved along with the animal, and
the field coil frame was enclosed such that the animal experienced no
visual motion other than the optic flow presented on the screen.
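The paper generated these stimuli with OpenGL, but the underlying geometry — a random dot cloud rendered with perspective projection so that self-translation produces the correct speed, size, and motion-parallax cues — can be sketched independently. A minimal illustration in NumPy (all variable names and the 60 Hz frame rate are assumptions for the sketch; only the workspace dimensions and 30 cm screen distance come from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random dot cloud filling a virtual workspace 100 cm wide, 100 cm
# tall, and 40 cm deep, centered in depth on the screen plane 30 cm
# in front of the observer (so depths span 10-50 cm).
n_dots = 1000
dots = np.column_stack([
    rng.uniform(-50.0, 50.0, n_dots),   # x (cm)
    rng.uniform(-50.0, 50.0, n_dots),   # y (cm)
    rng.uniform(10.0, 50.0, n_dots),    # z: depth from observer (cm)
])

def project(dots, screen_dist=30.0):
    """Perspective projection onto a tangent screen at screen_dist cm.
    This is what yields the size and motion-parallax cues: nearer dots
    project farther from center and move faster on the screen."""
    x, y, z = dots.T
    return np.column_stack([screen_dist * x / z, screen_dist * y / z])

# One frame step of simulated forward self-motion at 30 cm/s, 60 Hz:
speed = 30.0                 # cm/s, straight-ahead heading
dt = 1.0 / 60.0
before = project(dots)
dots[:, 2] -= speed * dt     # observer moves forward, so dots approach
after = project(dots)
flow = after - before        # 2D optic-flow vectors (cm on the screen)
```

Forward translation makes `flow` a radially expanding field centered on the heading direction, which is the cue the monkeys must read out in the heading task.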