Researchers at the California Institute of Technology have developed an application for Microsoft's HoloLens that can steer visually impaired individuals through a complex building. Rather than delivering raw images to the brain, as recent prosthetic attempts do, this "non-invasive" method relies on 360-degree sound and real-time room and object mapping to guide wearers through an unfamiliar multi-story building on their first attempt.
Typically, HoloLens overlays interactive virtual objects onto your full view of the real world. Engineers, for example, can construct a 3D model of a building in physical space and examine each side simply by walking around the virtual structure. You can also use HoloLens to shop for furniture online, placing a 3D model of the desired chair or table in your living room to see how it blends with your current décor before making a purchase.
The drawback to HoloLens, for now at least, is that all virtual objects reside only in the wearer's view; these "holograms" can't be seen by anyone else without a device capable of sharing the same experience. In this case, however, the wearer can't see anything at all, so the researchers leaned instead on the headset's real-time room- and object-mapping capabilities.
“Our design principle is to give sounds to all relevant objects in the environment,” the paper states. “Each object in the scene can talk to the user with a voice that comes from the object’s location. The voice’s pitch increases as the object gets closer. The user actively selects which objects speak through several modes of control.”
These modes are scan, spotlight, and target. After the wearer selects scan mode with a clicker, each object calls out its name in sequence from left to right via spatial audio, so the wearer can get a sense of each object's real-world placement from the direction and distance of its voice. Spotlight mode makes the object directly ahead speak, while target mode makes a chosen object repeatedly call out its name. Meanwhile, obstacles and walls hiss if the wearer moves too close.
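To make the mode logic concrete, here is a minimal sketch of how the three modes and the proximity hiss could be wired together. The pitch mapping, field-of-view angle, hiss threshold, and object data are all illustrative assumptions, not values from the Caltech paper.

```python
# Hypothetical sketch of the scan/spotlight modes and the obstacle hiss.
# All numbers (pitch range, FOV, thresholds) are assumptions for illustration.

def pitch_for_distance(distance_m, base_hz=220.0, max_hz=880.0, max_range_m=10.0):
    """Raise the voice's pitch as the object gets closer (assumed mapping)."""
    closeness = max(0.0, 1.0 - min(distance_m, max_range_m) / max_range_m)
    return base_hz + closeness * (max_hz - base_hz)

def scan_mode(objects):
    """Each object calls out its name in sequence, left to right by bearing."""
    ordered = sorted(objects, key=lambda o: o["bearing_deg"])
    return [(o["name"], pitch_for_distance(o["distance_m"])) for o in ordered]

def spotlight_mode(objects, fov_deg=30.0):
    """Only the object closest to straight ahead speaks (None if nothing ahead)."""
    ahead = [o for o in objects if abs(o["bearing_deg"]) <= fov_deg / 2]
    if not ahead:
        return None
    o = min(ahead, key=lambda o: abs(o["bearing_deg"]))
    return (o["name"], pitch_for_distance(o["distance_m"]))

def obstacle_warning(distance_m, hiss_threshold_m=0.5):
    """Walls and obstacles hiss when the wearer gets too close."""
    return distance_m < hiss_threshold_m

# Example: a door to the wearer's left and a chair to the right.
objects = [
    {"name": "chair", "bearing_deg": 40.0, "distance_m": 2.0},
    {"name": "door", "bearing_deg": -25.0, "distance_m": 5.0},
]
print(scan_mode(objects))       # door speaks first (leftmost), then chair
print(spotlight_mode(objects))  # neither object is directly ahead -> None
```

Target mode would simply loop a single object's callout; the interesting part is that the voice's apparent position and pitch, not an explicit readout, convey direction and distance.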
In one test, researchers created a virtual chair and directed HoloLens wearers to approach the object using target mode. Most relied on a two-phase strategy: localize the voice by turning in place, then walk quickly to the correct destination. Afterward, the researchers placed a physical chair in the same location and asked the same individuals to find it using their typical walking aid. Without the help of HoloLens, the process took eight times longer and covered 13 times more distance.
HoloLens can be used for long-range guided navigation, too. The researchers created a virtual guide that followed a pre-computed path and called out "follow me" to the wearer. It continuously monitored the wearer's progress and stayed a few feet ahead; if the wearer strayed off course, the guide stopped and waited for them to catch up. The test route crossed a building's main lobby, climbed two flights of stairs, rounded a few corners, and ended in an office.
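The guide behavior described above (stay a few feet ahead, stop and wait when the wearer strays) can be sketched as a simple update loop. The path, lead distance, and stray threshold below are illustrative assumptions, not figures from the study.

```python
import math

# Hypothetical sketch of the "follow me" virtual guide. The guide walks a
# pre-computed path, stays roughly LEAD_M ahead of the wearer, and waits
# when the wearer falls more than STRAY_M behind. All values are assumed.

LEAD_M = 1.5    # guide keeps about this far ahead (~5 feet)
STRAY_M = 2.5   # beyond this gap, the guide stops and waits

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def update_guide(path, guide_idx, wearer_pos):
    """Advance the guide along the path, or wait if the wearer has strayed."""
    if dist(path[guide_idx], wearer_pos) > STRAY_M:
        return guide_idx, "waiting"  # stop and let the wearer catch up
    # Step forward until the guide is about LEAD_M ahead of the wearer again.
    while guide_idx + 1 < len(path) and dist(path[guide_idx], wearer_pos) < LEAD_M:
        guide_idx += 1
    return guide_idx, "follow me"

# Example: a straight 4-meter corridor sampled every meter.
path = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
print(update_guide(path, 0, wearer_pos=(0, 0)))  # guide steps ahead: (2, 'follow me')
print(update_guide(path, 3, wearer_pos=(0, 0)))  # wearer too far back: (3, 'waiting')
```

In the real system the guide's "follow me" callout is spatialized, so the wearer steers by sound alone; this loop only captures the pacing logic.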