The Massachusetts Institute of Technology (MIT) is doing some amazing work on accessibility tools for visually impaired people. Recently, we wrote about an affordable device for translating Braille in real time, and now MIT researchers are back again with a new wearable device designed to help visually impaired people more easily navigate their environments.
The device comprises a 3D camera, a belt with five vibrational motors, and an electronically reconfigurable Braille interface to give users more information about their immediate environments.
“In a nutshell, our system scans the world and finds the walkable space and obstacles in front of the user with visual impairment,” Robert Katzschmann, a graduate student in mechanical engineering at MIT, told Digital Trends. “The user does not need to explore the space by contacting each part with a white cane. What makes the system especially exciting is that it can detect objects of use, such as chairs and tables. All the information is presented to the user through the use of vibrations around his or her abdomen, and through the use of an electronic Braille character display.”
Using technology similar to that employed to (literally and figuratively) drive self-driving cars, the device relies on a system able to interpret 3D camera data. Smart image-recognition algorithms allow it, for instance, to recognize whether a chair is empty, rather than just writing it off as an obstacle to be avoided. Information is conveyed to users discreetly: a particular motor vibrates if the person comes within two meters of an obstacle. Users also receive information, such as whether it is a table or a chair that has been detected, through the reconfigurable Braille pads.
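To make the feedback loop described above concrete, here is a minimal, purely illustrative sketch of how labeled obstacles from a 3D camera might be mapped onto a five-motor belt and a Braille label. All names here (`Obstacle`, `feedback`, the two-meter threshold encoded as a constant) are assumptions for illustration, not MIT's actual implementation.

```python
from dataclasses import dataclass

ALERT_DISTANCE_M = 2.0   # the article's two-meter vibration threshold
NUM_MOTORS = 5           # five vibration motors spaced across the belt

@dataclass
class Obstacle:
    label: str         # e.g. "chair" or "table", from the image-recognition step
    distance_m: float  # range reported by the 3D camera
    bearing_deg: float # angle from straight ahead: -90 (left) to +90 (right)

def motor_for_bearing(bearing_deg: float) -> int:
    """Map a horizontal bearing onto one of the belt motors (0 = leftmost)."""
    clamped = max(-90.0, min(90.0, bearing_deg))
    # Divide the 180-degree field of view into five equal sectors.
    index = int((clamped + 90.0) / 180.0 * NUM_MOTORS)
    return min(index, NUM_MOTORS - 1)

def feedback(obstacles):
    """Return (motors to vibrate, label of nearest obstacle for the Braille pad)."""
    motors = set()
    nearest = None
    for ob in obstacles:
        if ob.distance_m <= ALERT_DISTANCE_M:
            motors.add(motor_for_bearing(ob.bearing_deg))
            if nearest is None or ob.distance_m < nearest.distance_m:
                nearest = ob
    return motors, (nearest.label if nearest else None)
```

As a usage example, a chair 1.2 meters away on the user's left would trigger a left-side motor and put “chair” on the Braille display, while a table 3.5 meters ahead would be ignored until it comes within range:

```python
scene = [Obstacle("chair", 1.2, -40.0), Obstacle("table", 3.5, 10.0)]
motors, label = feedback(scene)
```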
“Primarily, the real-world applications are day-to-day scenarios [in which a] user with visual impairment is confronted with navigating a cafeteria, finding his or her way around in a hotel lobby, or finding an empty chair in the bus or train,” Dr. Hsueh-Cheng Wang, a former postdoctoral researcher at MIT and now an assistant professor of electrical and computer engineering at National Chiao Tung University in Taiwan, told us.
In tests, the researchers found that the chair-finding system reduced subjects’ collisions with non-chair objects by 80 percent, while the separate navigation system reduced the number of cane collisions with people in a hallway by 86 percent.
“We plan [next] to extend this work from indoor to outdoor environments, and detect more objects a blind user wishes to interact with,” Katzschmann continued. Long term, the hope is to commercialize the technology, so as to bring it to whoever needs it.