Feeling is believing. A system that uses sound waves to project "haptic holograms" into mid-air – letting you touch 3D virtual objects with your bare hands – is poised to bring virtual reality into the physical world.
Adding a sense of touch as well as sight and sound will make it easier to completely immerse yourself in VR. And the ability to feel the shape of virtual objects could let doctors use their hands to examine a lump detected by a CT scan, for example. What's more, museum visitors could handle virtual replicas of priceless exhibits while the real thing remained safely behind glass.
Ben Long and his colleagues at the University of Bristol, UK, improved on a previous version of their UltraHaptics technology, which projected 2D outlines of map contours, for example, onto a screen. Now, high-frequency sound waves emitted by an array of tiny speakers create the sensation of touching an invisible, floating object. When the sound hits the hand, the force of the waves exerts pressure on the skin.
To make the jump from outlines to full shapes, the team added a Leap Motion sensor to track the precise position of a user's hands. Knowing where the hands are in relation to the virtual object means the system can direct ultrasound at the right time and frequency to produce the sensation of touching different parts of the object – the top, say, or the side. This creates the impression that you are exploring the surface of an object as you move your hands around in empty space.
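The article does not give the team's actual algorithm, but the underlying idea of focusing an ultrasound array is standard phased-array physics: each speaker fires with a phase offset chosen so that all the waves arrive in step at a chosen point in space, concentrating acoustic pressure there. The sketch below illustrates that idea only; the grid geometry, spacing, and the 40 kHz frequency are illustrative assumptions (40 kHz is typical for airborne ultrasound arrays), not details from the Bristol system.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
FREQ = 40_000.0         # Hz; assumed, typical for airborne ultrasound arrays

def focus_phases(emitters, focal_point):
    """Return the phase offset (radians) for each emitter so that its wave
    arrives in phase at focal_point, creating a pressure focus there.

    emitters: list of (x, y, z) transducer positions in metres
    focal_point: (x, y, z) target point in metres
    """
    wavelength = SPEED_OF_SOUND / FREQ
    distances = [math.dist(e, focal_point) for e in emitters]
    ref = max(distances)  # reference path, so every offset is non-negative
    # Emitters closer to the focus are delayed more, so their wavefronts
    # coincide with those arriving from farther emitters.
    return [2 * math.pi * ((ref - d) % wavelength) / wavelength
            for d in distances]

# A hypothetical 4x4 grid of transducers spaced 10 mm apart in the z=0 plane.
grid = [(ix * 0.01, iy * 0.01, 0.0) for ix in range(4) for iy in range(4)]
# Focus 15 cm above the centre of the array.
phases = focus_phases(grid, (0.015, 0.015, 0.15))
```

Moving the focal point across the tracked position of the user's hand, frame by frame, is what would let a system like this trace out the surface of a virtual shape.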
"Without haptics it's like you're in a dream and you cannot feel the environment," says Sébastien Kuntz of I'm in VR, a virtual reality development company in Paris, France. "You can only look at it, you don't have any feedback."
So far, the researchers have tested several shapes, including spheres and pyramids, which feel as if they are gently vibrating in space, says Long. The level of detail in the virtual objects is limited for now, but using a larger number of smaller speakers should improve the resolution of what can be projected, he says. The shapes do not need to be perfect to conjure an immersive experience, though. "Even if there are discrepancies, the brain will bend what it sees and feels to fit the overall picture," says Kuntz.
The team says it has already been approached by companies interested in developing the technology for commercial applications. The work will be presented at interactive tech conference SIGGRAPH Asia in Shenzhen, China, on 3 December.
Stuart Cupit, technical director at Inition, a design studio in London, is also impressed by the technology. "Touch is a missing element in virtual interfaces today," he says.
This article will appear in print under the headline "Touching the void"