The Institute of Artificial Intelligence at the University of Georgia (USA) has developed an AI-powered backpack based on Intel Movidius VPU technology that helps visually impaired people navigate in real time through voice guidance.
The kit, housed in a backpack, includes a programmable computing module and a Luxonis OAK-D spatial AI camera that attaches to the user's belt or clothing. The module relies on Intel Movidius VPU technology and the Intel Distribution of OpenVINO toolkit to run computer vision workloads in real time.
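The article does not publish the project's code, but as an illustration only, the sketch below shows how a real-time detection pipeline of this kind can be set up on an OAK-D camera using the publicly documented DepthAI Python API; the model blob path is a hypothetical placeholder, and the detector choice is an assumption.

import depthai as dai

# Build a DepthAI pipeline: camera frames are fed to an on-device
# MobileNet-SSD detector, and detections are streamed back to the host.
pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)          # input resolution expected by the model
cam.setInterleaved(False)

nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath("mobilenet-ssd.blob")  # hypothetical OpenVINO-compiled model file
nn.setConfidenceThreshold(0.5)
cam.preview.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue("detections", maxSize=4, blocking=False)
    while True:
        for det in queue.get().detections:
            print(f"label={det.label} confidence={det.confidence:.2f}")

Because the neural network runs on the camera's own VPU, the host module only has to consume lightweight detection results rather than raw video, which is what makes real-time operation on battery power feasible.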
The system works by sending voice prompts to the user over Bluetooth, warning about obstacles such as road signs, overhanging objects, pedestrian crossings, moving objects, and changes in ground elevation. The user can also query the system with voice requests over the same Bluetooth link. A built-in battery provides about eight hours of operation.
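The article does not describe the speech interface in detail. As a hedged sketch, obstacle announcements over a Bluetooth headset could be produced with an off-the-shelf text-to-speech library such as pyttsx3; the obstacle labels and the distance threshold below are assumptions for illustration, not details of the actual device.

import pyttsx3

# Minimal announcement loop: speak each nearby obstacle and its rough
# distance through whatever audio output is active (e.g. a paired
# Bluetooth headset). Detections here are placeholder tuples.
engine = pyttsx3.init()

def announce(obstacles):
    """obstacles: iterable of (label, distance_m) pairs."""
    for label, distance in obstacles:
        if distance < 3.0:  # assumed proximity threshold, in metres
            engine.say(f"{label} ahead, {distance:.1f} meters")
    engine.runAndWait()

# Example: values a depth-aware detector might produce.
announce([("pedestrian crossing", 2.4), ("low-hanging branch", 1.1)])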
According to the WHO, about 285 million people worldwide are visually impaired. A device that runs in real time, operates for hours on a single charge, and helps users orient themselves in space therefore has strong commercial prospects.