While I’ve recently been discussing the emerging optical advancements coming out of Google’s X Lab with its new Project Glass, I’d like to point out some of the impressive work being done in the robotics industry, work that is not only cool but can also be adapted for use by people with visual impairments.
New Scientist reported last week on an optical system being developed by Edwige Pissaloux and colleagues at the Institute of Intelligent Systems and Robotics at Pierre and Marie Curie University in Paris, where the same technologies that help robots navigate “are being adapted to help blind people navigate indoor and outdoor spaces independently.” The system, which produces a 3D map of the wearer’s environment and displays it on a handheld electronic Braille device, was unveiled at a talk at the Massachusetts Institute of Technology last month.
Now, how does the system work? As New Scientist explains:
“Two cameras on either side of the glasses generate a 3D image of the scene. A processor analyses the image, picking out the edges of walls or objects, which it uses to create a 3D map. The system’s collection of accelerometers and gyroscopes – like those used in robots to monitor their position – keeps track of the user’s location and speed. This information is combined with the image to determine the user’s position in relation to other objects.”
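The two halves of that pipeline, picking out edges from the depth image and dead-reckoning the wearer’s pose from the accelerometers and gyroscopes, can be sketched roughly as follows. This is a minimal illustration of the general technique, not Pissaloux’s actual implementation; all names and thresholds are assumptions.

```python
import math

def edge_map(depth, threshold=0.5):
    """Mark cells where depth changes sharply between neighbours,
    a crude stand-in for picking out the edges of walls or objects."""
    rows, cols = len(depth), len(depth[0])
    edges = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols and \
                        abs(depth[r][c] - depth[nr][nc]) > threshold:
                    edges[r][c] = True
    return edges

class DeadReckoner:
    """Track position and heading by integrating gyroscope (turn rate)
    and accelerometer (forward acceleration) readings over time."""
    def __init__(self):
        self.x = self.y = self.heading = 0.0  # metres, radians
        self.speed = 0.0                      # metres/second

    def update(self, gyro_z, accel_forward, dt):
        self.heading += gyro_z * dt
        self.speed += accel_forward * dt
        self.x += self.speed * math.cos(self.heading) * dt
        self.y += self.speed * math.sin(self.heading) * dt
        return self.x, self.y, self.heading
```

Combining the edge map with the tracked pose is what lets the system place the user relative to nearby obstacles, as the quote above describes.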
The blind person receives the information through the Braille device, which is constantly updated – at almost 10 maps per second – to allow the individual to walk at a normal pace. “The Braille pad,” New Scientist details, “consists of an 8-centimetre-square grid of 64 taxels… When heat is applied to the springs, they expand, raising the pins to represent boundaries.” What’s amazing about Pissaloux’s system is that individuals will be receiving map information in real time, even as they walk from room to room and outside on a city street.
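Squeezing a finer map onto just 64 pins implies some form of downsampling. A minimal sketch of that idea, assuming (this is not from the article) that a pin is raised whenever any cell in its patch of the map contains a boundary:

```python
def to_taxels(boundaries, grid=8):
    """Downsample a boolean boundary map onto an 8x8 pin grid (64 taxels).
    Raise a pin if any map cell falling in its patch holds a boundary."""
    rows, cols = len(boundaries), len(boundaries[0])
    pins = [[False] * grid for _ in range(grid)]
    for r in range(rows):
        for c in range(cols):
            if boundaries[r][c]:
                pins[r * grid // rows][c * grid // cols] = True
    return pins
```

At almost 10 maps per second, a routine like this would have to run roughly every 100 milliseconds to keep the pad current as the wearer walks.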
However, Pissaloux’s team isn’t the only robotics project testing the potential for use among the blind – nor is it the cheapest. At the University of Nevada, Reno, Eelke Folmer and Kostas Bekris are working on a low-cost system that uses software that “predicts how far a robot has travelled based on information from its on-board sensors.” What’s revolutionary about Folmer and Bekris’ project is that, if successful, blind individuals would be able to navigate with nothing more than a smartphone. Using “available 2D digital indoor maps and the smartphone’s built-in accelerometer and compass,” an individual is guided by synthetic speech. While simpler to use, the system isn’t as quick as Pissaloux’s.
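The smartphone approach amounts to pedestrian dead reckoning: each step detected by the accelerometer advances the estimated position by one stride along the compass heading. A sketch of that core idea, with the stride length and event format as illustrative assumptions rather than details from Folmer and Bekris’ system:

```python
import math

def walk(start, headings_deg, step_length=0.7):
    """Pedestrian dead reckoning sketch: given a starting (x, y) position
    on a 2D indoor map and one compass reading per detected step, return
    the estimated path. Compass convention: 0 degrees = north (+y)."""
    x, y = start
    path = [(x, y)]
    for heading_deg in headings_deg:
        h = math.radians(heading_deg)
        x += step_length * math.sin(h)  # east component
        y += step_length * math.cos(h)  # north component
        path.append((x, y))
    return path
```

Each estimated position could then be matched against the 2D indoor map and announced via synthetic speech, which is how the article says the user is guided.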
Whichever system individuals find more beneficial, what’s emerging from the robotics industry (as well as many other technologically inclined fields) is a push towards testing how already-established applications can be used for the greater good. Advancements range from using smartphones to test for STDs to glasses that can help the blind regain some perception of large objects. While not as flashy, these advancements in robotics are going to make a huge difference for a large number of individuals.