Tackling the biggest problems in science takes a whole community.
Health scientist Lauren Sergio, virtual reality scientist Robert Allison, and neuroscientist Laurence Harris are all researchers at York University’s Vision: Science to Applications (VISTA) program, and their diverse expertise in vision research is what drives their successful collaborations.
“When you’re doing something as complicated as trying to understand how the brain controls movement, there’s no way you can do that on your own,” says Sergio.
With her expertise in movement science and Allison’s background in engineering and vision science, they are able to tackle visuomotor questions together. They also work in both human vision and computer vision, where discoveries on either front illuminate unknowns on the other.
“Our work is mostly on the behavioural response: how we perceive depth in the world around us, our self-motion in the world around us,” adds Allison. “But in order to ground that in terms of biology we need to work with neurophysiologists, people who do brain imaging, psychologists.”
Sergio and Allison are working on a system to assess mobility, which can be affected by everything from aging to brain injury and disease.
“We’ve got this really cool beta going that’s going to integrate computer vision and computer vision algorithms to objectively assess people’s mobility, using basically a Kinect system that we hacked into, and again it’s the sort of thing where we need to have that sort of collaboration or it just wouldn’t get done,” says Sergio.
Their system uses the Kinect to visualize a user’s movement objectively, collecting data on speed, position, and direction as users execute different movements, and using computer vision to evaluate that motion for dysfunction.
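To make the idea concrete, here is a minimal, hypothetical sketch (not the VISTA team’s actual software) of how speed and direction can be estimated from the kind of tracked position samples a Kinect-style depth sensor produces:

```python
import math

def motion_metrics(positions, dt):
    """Estimate per-step speed and heading from tracked positions.

    positions: list of (x, y) coordinates in metres, sampled every dt seconds.
    Returns two lists: speeds (m/s) and heading angles (radians).
    """
    speeds, headings = [], []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        dx, dy = x1 - x0, y1 - y0
        speeds.append(math.hypot(dx, dy) / dt)   # distance per sample interval
        headings.append(math.atan2(dy, dx))      # direction of travel
    return speeds, headings

# A user moving steadily along the x-axis, sampled every 0.1 s:
path = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0)]
speeds, headings = motion_metrics(path, dt=0.1)
```

In a real assessment tool, metrics like these would be compared against reference ranges for a given movement to flag possible dysfunction; that evaluation step is far more involved than this sketch.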
But there’s more to visuomotor research than everyday movement, like walking or sitting. Vision and perception also face extreme cases, and Allison and Harris collaborate on simulating the perception of self-motion in outer space, where perception of speed and position don’t always match reality.
“My research is very interdisciplinary. I need vast pieces of equipment that are able to move people around, and fancy visual displays. All the space research is based on an HMD [head-mounted display] system, a virtual reality-type system,” says Harris.
Harris brings the biology and psychology side to the table, while Allison brings expertise in computer science, virtual reality, and display technology. But both the computer and human sides are vital to their collaboration.
“If we have an understanding of how people interact, and the human side of this equation, then we can make much more compelling content. We can make better engineering decisions,” says Allison.
Designing displays always involves trade-offs, but understanding human perception helps designers cut cost and complexity while preserving quality where the visual system is most sensitive to it.
“Technology is steamrolling ahead,” adds Allison. “The technical advances are going to be there. It’s really up to us to see what we’re going to be able to do with them.”