Psychologist proposes new hand-eye relationship

Published: Friday, July 11, 2008 - 14:21 in Psychology & Sociology

Psychologists at Washington University in St. Louis, led by Richard A. Abrams, Ph.D., professor of psychology in Arts & Sciences, have shown that to see objects better, you should take the matter into your own hands. They have demonstrated that people inspect objects more thoroughly when their hands are near them than when their hands are farther away. They posit that this processing exists because humans need to be able to analyze objects near their hands, either to figure out how to handle them or to protect themselves against them.

Recognizing that the location of your hands influences what you see is a new insight into the wiring of the brain, one that could lead to rethinking current rehabilitative therapy techniques and prosthetic design. For a stroke victim trying to regain use of a paralyzed arm, simply placing the good arm next to the desired object could help the injured arm grasp it. Likewise, prosthetics could be redesigned to account for additional information flowing from the hand to the brain, rather than having the brain merely control the spatial position of the device, as today's prosthetics do.

"This is the first experiment to investigate the effect of hand position on response time for a visual search task," said Abrams. "In all previous visual search experiments, subjects viewed stimuli on a display and responded by manipulating buttons on a table, where their hands were far from the stimuli. In our experiment, the subjects responded using buttons attached to the display so that their hands were next to the stimuli."

These response times were compared with those from a typical experiment where the subjects responded by pushing buttons that were far from the display, he added.

Results were published in the June issue of Cognition.

'Mind's eye'

Using a task called visual search, the researchers had subjects scan four or eight letters on a computer monitor, searching for either an S or an H. When they found the letter, the subjects responded by pressing one of two buttons, located either on the sides of the monitor or on their laps. The subjects' search rate was slower when their hands were on the sides of the monitor than when they were on their laps, meaning that they were slower to shift their attention from one item to the next.
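For readers curious about the measurement itself: in visual search studies, the search rate is typically estimated as the slope of response time against the number of items on the display. The sketch below is a minimal illustration of that computation, not the authors' actual analysis code, and the response times in it are made-up numbers chosen only to show the pattern the study reports.

```python
# Illustrative sketch only (hypothetical data, not from the study):
# a visual search rate is the slope of mean response time (RT) versus
# set size -- the extra time needed per additional item on the display.

def search_slope(rts_by_set_size):
    """Least-squares slope of mean RT (ms) against set size (items)."""
    xs = sorted(rts_by_set_size)
    ys = [sum(rts_by_set_size[x]) / len(rts_by_set_size[x]) for x in xs]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den  # ms per additional item

# Hypothetical mean RTs (ms) for displays of 4 and 8 letters.
hands_near = {4: [620, 655, 640], 8: [780, 810, 795]}  # buttons on monitor
hands_far = {4: [600, 615, 610], 8: [700, 720, 710]}   # buttons in lap

print(f"hands near display: {search_slope(hands_near):.1f} ms/item")
print(f"hands far from display: {search_slope(hands_far):.1f} ms/item")
# A steeper slope in the hands-near condition corresponds to slower
# shifting of attention from item to item, the effect the study reports.
```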

Abrams interprets the results to mean that there is an inherent mechanism in the human brain that forces us to move our attention, or "mind's eye," more slowly from one object to the next when the objects are near our hands. We are required to evaluate these objects more carefully because they are the only candidates for manipulation or possible harm.

Not only do we see items differently when our hands are near them, we also process the meaning of language differently, Abrams said. Other experiments showed that when reading nonsensical sentences, subjects were less likely to realize that the sentences were illogical when their hands were near the display. This means that people are less likely to recognize errors when holding the text they are reading.

"If you are reading for enjoyment, put your hands near the display, " he said. "You are going to be less likely to notice, and be distracted by, typographical errors or anomalous sentences. If you are proofreading, put your hands in your lap because you're more likely to detect those incorrect sentences."

Abrams compares this new mode of information processing to the robotic arm on a space shuttle. The camera on the end of the arm sends an image of its surroundings to the operator, allowing the operator to guide the arm into position.

"The engineers who designed the arm knew that positioning it would be easier if they had the camera right in hand," he said. "What we didn't know until now was that humans have a mechanism for doing this, too."

Source: Washington University in St. Louis
