Elsewhere on the web
History of Virtual Reality
The Joy of Visual Perception: A Web Book
Open Virtual Reality Testbed at NIST
Biorobotic Vision Laboratory at the Australian National University
University of Rochester Center for Visual Science
It's a problem most of us never think about, but scientists have long wanted to know how we manage to walk around without running into things.
As this ScienCentral News video report shows, researchers at Brown University used virtual reality helmets to explore how our eyes and mind work together to keep us on target.
Virtual Reality and Vision
Comparing a walk through Grand Central Station at rush hour to playing a video game is not a difficult leap to make. In both cases, says Bill Warren, Professor of Cognitive Sciences at Brown University, "You have to navigate your way through a very complicated world, with objects flying around and obstacles you have to avoid and pitfalls you can fall into... all using visual information." Yep, definitely Grand Central. He might add, "and you might not survive."
Which is probably why, when virtual reality was introduced in the late 1980s, it quickly became the tool of choice for studying how we humans so naturally use our vision to control our locomotion, an amazing feat that's essential to our survival. Understanding human motion perception and navigation will be key to building better robots that can do it on Mars, as well as aiding humans with certain visual impairments.
Recently, researchers in this field have used VR to study how we catch fly balls and how to make driving safer with intelligent rear-view mirrors.
Warren, at the world's largest VR facility, the Virtual Environment Navigation Laboratory, or VENLab, wants to know how we walk around without bumping into things. What sort of visual information is most important?
That depends on what strategy we use for getting across that busy terminal. There are two main contenders. When you aim at a target and move toward it, you are using what is termed visual direction. When you navigate using the direction of motion of objects you are passing, that is called optic flow.
But the two strategies can't be separated and independently controlled in the real world. So, using virtual reality, "We pitted the two against each other," Warren says. In the 40-foot-square VENLab, people wearing VR headsets walked around while immersed in a simulation. Sensors in the hall's ceiling tracked their paths as they tried to aim toward a target while researchers controlled how many passing objects they encountered.
It turned out that the more optic flow information was added, the more important it became. "When all people could see was the target, they walked in the direction of the target," he says. "But as we added more motion into the display and the world got more complicated... then the optic flow dominated."
"In other words," Warren says, "people are doing both."
And "it's a good thing," he adds. Using two strategies makes us "more robust in different environments." Robots on Mars should be so robust.
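The idea of "doing both" can be pictured as a weighted blend of the two steering cues. The following Python sketch is purely illustrative, not Warren's actual model: the function names, the single flow_weight parameter, and the fixed turn gain are all assumptions made here to show how increasing the weight on optic flow would shift control away from visual direction.

```python
import math

def wrap(angle):
    """Wrap an angle to the range [-pi, pi]."""
    return (angle + math.pi) % (2 * math.pi) - math.pi

def steer(heading, agent_pos, target_pos, flow_heading, flow_weight, gain=0.5):
    """One steering step blending two strategies.

    Visual direction: turn so the body heading points at the target.
    Optic flow: turn so the perceived direction of travel (the focus
    of expansion in the flow field) lines up with the target.
    flow_weight in [0, 1] sets how much the flow cue dominates.
    """
    dx = target_pos[0] - agent_pos[0]
    dy = target_pos[1] - agent_pos[1]
    target_dir = math.atan2(dy, dx)

    vd_error = wrap(target_dir - heading)       # visual-direction error
    of_error = wrap(target_dir - flow_heading)  # optic-flow error

    # Weighted blend of the two cues; more flow information -> flow dominates.
    turn = (1 - flow_weight) * vd_error + flow_weight * of_error
    return heading + gain * turn
```

With flow_weight at 0 the walker simply homes in on the target's visual direction; as flow_weight grows, steering is driven by where the flow field says the walker is actually going, loosely mirroring the VENLab finding that adding motion to the display shifted control toward optic flow.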
The research was reported in the February 2001 issue of the journal Nature Neuroscience.