Watching shaky footage from an unsteadily held camera can make you feel nauseous. But have you ever considered how the brain builds a stable picture of the world from the jittery video stream delivered by our eyes, which dart back and forth constantly atop bodies that are themselves always in motion?
Researchers at the Salk Institute for Biological Studies have found that the brain not only finds a way to compensate for a constantly wavering gaze but, surprisingly, also relies on these same flickering eye movements to recognize partially concealed or moving objects. Details of their research will appear in a forthcoming issue of Nature Neuroscience.
"You might expect that if you move your eyes, your perception of objects might get degraded," explains senior author Richard Krauzlis, Ph.D., an associate professor in the Systems Neurobiology Laboratory at the Salk Institute. "The striking thing is that moving your eyes can actually help resolve ambiguous visual inputs."
Just like high-end video cameras, the brain relies on an internal image stabilization system to prevent our perception of the world from turning into a blurry mess. "Obviously, the brain has found a solution," explains lead author Ziad Hafed, Ph.D. "In addition to the jumpy video stream, the visual system constantly receives feedback about the eye movements that the brain is generating."
Hafed and Krauzlis took the question of how the brain maintains perception under less-than-optimal circumstances one step further. "If you think of the video stream as a bunch of pixels coming in from the eyes, the real challenge for the visual system is to decide which pixels belong to which objects. We wondered whether information about eye movements is used by the brain to solve this difficult problem," says Hafed, who is an NSERC (Canada) and Sloan-Swartz postdoctoral researcher at the Salk Institute.
Krauzlis explains that the human brain recognizes objects in everyday circumstances because it is very good at filling in missing visual information. "When we see a deer partially hidden by tree trunks in a forest, we can still segment the visual scene and properly interpret the individual features and group them together into objects," he says.
However, even though recognizing that deer is effortless for us, it is not a trivial accomplishment for the brain. Teaching computers to recognize objects in real-life situations has proven to be an almost insurmountable problem. Artificial intelligence researchers have spent much time and effort trying to design robots that can recognize objects in unconstrained situations, but so far their success has been limited.
To determine whether eye movements actually help the brain recognize objects, Hafed and Krauzlis asked whether people perceived an object better when they actively moved their eyes or when they stared at a given point in space. Human subjects watched a short video that allowed them to glimpse a partially hidden chevron shape that moved in a circle.
When they kept their eyes still by fixating on a stationary spot, observers perceived only random lines moving up and down. But when they moved their eyes, even though the video stream entering their eyes was unaltered, viewers easily recognized the lines as a circling chevron.
"It turns out that eye movements not only help with image stabilization, but that this additional input also plays a fairly important role for the perception of objects in the face of all the challenges that real life visual scenes pose - that objects are obscured or are moving, and so on," says Hafed.