Every now and then, technology delivers something that genuinely improves people's lives. Such is the case with the collaboration between the CDTM institute of the Technical University of Munich (TUM) and FRAMOS: the partnership has yielded a wearable that uses real-time 3D technology to support visually impaired people in daily life.
The solution consists of glasses equipped with Intel's RealSense stereo cameras, which rely on intelligent algorithms to translate visual impressions into haptic and audio information. While the audio information relies on object and character recognition, the haptic feedback is provided by a wristband equipped with vibration motors. This new way of sensing helps visually impaired people better perceive their environment and provides advanced guidance for safe navigation.
The prototype includes an Intel RealSense 3D camera and speakers for audio feedback. The setup is controlled by a processing hub with a GPS sensor and an LTE module for a mobile data connection. Connected via Bluetooth, a micro-processing unit translates visual data into haptic feedback through a 2D array of vibration motors. Based on the exact location and strength of the vibration on the arm, the wearer is informed about the position and distance of objects in the surroundings. A voice-controlled interface makes interaction easy, and rechargeable batteries enable a full day of use.
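The core idea of the depth-to-haptics translation can be sketched in a few lines: divide the camera's depth frame into patches, one per vibration motor, and let the nearest obstacle in each patch set that motor's intensity. The grid size, sensing range, and linear falloff below are illustrative assumptions, not the product's actual parameters.

```python
import numpy as np

def depth_to_haptics(depth_m, grid=(4, 4), max_range_m=4.0):
    """Map a depth frame (in metres) to motor intensities in [0, 1].

    Each cell of the motor grid covers one patch of the frame; the
    nearest valid depth in that patch drives the motor, with closer
    obstacles producing stronger vibration (linear falloff).
    Grid shape and range are hypothetical, for illustration only.
    """
    h, w = depth_m.shape
    rows, cols = grid
    intensities = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            patch = depth_m[r * h // rows:(r + 1) * h // rows,
                            c * w // cols:(c + 1) * w // cols]
            # Ignore zero (invalid) readings and anything beyond range.
            valid = patch[(patch > 0) & (patch <= max_range_m)]
            if valid.size:
                intensities[r, c] = 1.0 - valid.min() / max_range_m
    return intensities

# Example: a mostly distant scene with one close obstacle top-left.
frame = np.full((8, 8), 4.0)
frame[0, 0] = 0.4
print(depth_to_haptics(frame))
```

In this sketch the top-left motor vibrates strongly while the rest stay quiet, which is the spatial mapping the article describes: position on the arm encodes direction, intensity encodes distance.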