Wearable haptic/Braille guidance system for the visually impaired

Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a system designed to help the visually impaired navigate a room accurately, with or without the assistance of a cane. It consists of a 3-D camera worn on the torso, a belt fitted with vibrational (haptic) motors, and an electronically controlled Braille interface worn on the side of the belt. The torso was chosen as the optimal, least interfering mounting location for the camera. Algorithms analyze the camera’s images to quickly identify surfaces and their orientations from the planes in each frame, including whether or not a chair is unoccupied. The belt delivers tactile vibrations varying in frequency, intensity, and duration to signal proximity to obstacles or to guide the wearer to a chair. The Braille interface confirms the object and its location through key initials (‘c’ for chair, ‘t’ for table) and directional arrows. According to the MIT study, “In tests, the chair-finding system reduced subjects’ contacts with objects other than the chairs they sought by 80 percent, and the navigation system reduced the number of cane collisions with people loitering around a hallway by 86 percent.” MIT News, Mashable, ‘Wearable Blind Navigation’ paper. Hat tip to Toni Bunting of TASK Ltd.
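As a rough illustration of how such a belt might translate obstacle distance into a haptic cue, here is a minimal sketch. The thresholds, intensities, and frequencies below are assumptions for illustration only; MIT's actual mapping is not described in the coverage.

```python
def haptic_cue(distance_m):
    """Map an obstacle distance (metres) to a vibration cue.

    Returns (intensity, frequency_hz, duration_ms).
    All thresholds and values are illustrative assumptions,
    not the parameters used in the MIT system.
    """
    if distance_m < 0.5:      # very close: strong, fast, long pulse
        return (1.0, 250, 400)
    elif distance_m < 1.0:    # close: moderate cue
        return (0.6, 180, 250)
    elif distance_m < 2.0:    # approaching: gentle cue
        return (0.3, 120, 150)
    else:                     # clear path: no vibration
        return (0.0, 0, 0)
```

The idea is simply that nearer obstacles produce stronger, faster, longer pulses, which matches the article's description of cues varying in frequency, intensity, and duration.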


Novartis extends ViaOpta app for visually impaired to smartwatches

[grow_thumb image=”https://telecareaware.com/wp-content/uploads/2015/07/ViaOpta-Nav-Apple-Watch.jpg” thumb_width=”200″ /]New to this Editor (and, I suspect, to our Readers) in assistive wearable technologies are Novartis’ ViaOpta apps, which aid the visually impaired and support both their independence and their safety. Now available for smartwatches as well as smartphones, their features center primarily on navigation assistance, including (new) points of interest and worldwide maps. Another new feature, recognizing and ‘reading’ both scenes and objects, is smartphone-only because it depends on the phone’s camera. Cues are audio, vibratory, and visual (see left). Novartis claims this is the first wearable app designed specifically for the visually impaired and blind, now estimated at 285 million worldwide. It’s available in 11 languages and, best of all, it’s a free download. A smartwatch demo is on YouTube. Novartis release. Mobihealthnews.

MIT’s ‘FingerReader’ to aid sight-impaired in reading

[grow_thumb image=”https://telecareaware.com/wp-content/uploads/2014/07/finger_reader_mit.jpg” thumb_width=”150″ /]MIT Media Lab is developing a chunky plastic ring that, in concept and early-stage prototype, assists the sight-impaired in reading standard 12 pt. text in a book, a magazine, or on screen. The ring is worn on the finger (it resembles a collar) and the reader points along the line to be read. A camera embedded in the ring scans line by line and ‘speaks’ through the speakers of a PC or tablet connected to the ring. If the finger strays too far from a line, a dial-tone-like feedback sound alerts the reader. It differs from the conceptually similar Reading Pen in working more fully in real time and reading faster: whole lines rather than word by word. While primarily for the blind and low vision, one of the MIT developers, Roy Shilkrot, a doctoral candidate, envisions simultaneous (machine) translation into another language. With 285 million visually impaired people worldwide, 85 percent of them over 50 (WHO), there’s a ready-made market for this and for technologies like the Oxford ‘assisted vision’ project [TTA 11 July]. Mr. Shilkrot is shy on the subject of commercialization, but given the positive media reception, he should perhaps think it over. TechCrunch (includes video demo), Mashable, MIT’s release and FAQ. Hat tip to reader Luca Sergio of Ethis Communications/Ethis Healthtech, New York.
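The stray-finger feedback described above can be sketched roughly as follows. The tolerance value and the signal names are assumptions for illustration; the FingerReader's actual tracking logic is not detailed in the coverage.

```python
def tracking_feedback(finger_y, line_y, tolerance_px=12):
    """Decide what feedback a FingerReader-style tracker might give.

    finger_y: vertical pixel position of the fingertip in the camera frame.
    line_y:   baseline of the text line currently being read.
    The tolerance and return values are illustrative assumptions.
    """
    drift = finger_y - line_y
    if abs(drift) <= tolerance_px:
        return "ok"              # on the line: keep speaking words aloud
    elif drift > 0:
        return "drifted_below"   # play the dial-tone-like cue, steer finger up
    else:
        return "drifted_above"   # play the cue, steer finger down
```

Run once per camera frame, this is enough to keep the reader anchored to the current line while the recognized words are spoken.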