MIT scientists developed a key experiment to investigate human visual orientation in weightlessness as part of the Neurolab mission aboard space shuttle Columbia. The mission was launched from the Kennedy Space Center on April 17 and landed May 3.
The investigation focused on how astronauts use vision, the vestibular organs of the inner ear, and pressure cues to determine their own orientation relative to their surroundings. The results are expected to offer important advances in preventing disorientation in astronauts, and in diagnosing and rehabilitating balance disorders, which affect millions of Earth-bound Americans. The mission is scheduled to fly again on the space station in late 1999.
The principal investigator for the "Role of Visual Cues in Spatial Orientation" experiment is Dr. Charles M. Oman, director of the Man Vehicle Laboratory in the Department of Aeronautics and Astronautics. Co-investigators are Professor Ian Howard of York University and Ted Carpenter-Smith, a former MIT colleague who is now with Andersen Consulting. Other MIT participants include Andy Beall, a sponsored research technical staff member in the Center for Space Research; research scientist Alan Natapoff; graduate students Christine Tovee and Bill Hutchison; and Hilda Gutierrez, a sophomore in mechanical engineering.
The experiment uses the Virtual Environment Generator (VEG), a head-mounted, computer-driven virtual reality display developed at NASA's Johnson Space Center. The VEG system includes a 3D graphics workstation, a helmet-mounted display that provides a wide field of view, a head tracker that measures head movement, and a joystick.
The system can be used in either a free-floating mode or with a constant-force spring harness to create downward tactile restraint cues to shoulders and hips.
"On Neurolab, we will be using immersive virtual reality (VR) techniques in space for the first time," Dr. Oman said. "The VR helmet gives us complete control of the visual stimulus."
Three separate investigations will be performed. The first will investigate how the content of the visual scene and its symmetry influence the subject's perception of up or down. The second will determine how a moving visual scene produces the feeling of self-motion.
The third will study how the perception of a given "down" direction alters the subject's ability to recognize complex figures and interpret shading. All three experiments are performed with the astronauts first free-floating and then restrained, the restraint providing an artificial "down" direction as a reference.
Dr. Oman said the sustained weightless environment of Neurolab "provides a truly unique opportunity" to understand the interaction between visual and gravity cues in the spatial orientation of human beings.
One part of the experiment had its origins in pioneering work performed on four earlier Spacelab missions under the direction of Professor Larry R. Young, the Apollo Program Professor of Astronautics and director of the National Space Biomedical Research Institute. Those experiments used a mechanically rotating dome that created the illusion of self-motion in the direction opposite the dome's rotation. The Virtual Environment Generator employed on Neurolab offers far more advanced capabilities to render complex scenes, including linear motion, and provides an important new tool for studying spatial orientation in space flight.
NASA is giving serious consideration to reflying the Neurolab mission in August due to a potential opening in the space shuttle launch manifest caused by delays in the launch of the International Space Station and the Advanced X-Ray Astrophysics Facility.
(John Tylko, SB '79 in aeronautics and astronautics, is a special correspondent for MIT Tech Talk.)
A version of this article appeared in MIT Tech Talk on May 6, 1998.