In 2020, 43.2 million people worldwide were affected by blindness and a further 295.3 million suffered from mild to severe visual impairments. Impairments in visual perception confront those affected with a wide range of challenges, often limiting their participation in everyday life and reducing their quality of life.
The research project LOMOBI aims to develop an interactive assistance system for the navigation of visually impaired people in everyday life. To this end, technologies from the fields of computer vision and mobile robotics are being adapted and further developed. A backpack developed in previous research projects to assist joggers (see picture) serves as the basis. Using an environment detection system, navigation signals for the person can be derived and transmitted via a vibrotactile and auditory interface (vibration motors / bone-conduction headphones).
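As a minimal illustration of the vibrotactile interface, the sketch below maps a heading error (how far the person deviates from the desired direction) to left/right vibration intensities. The dead zone, scaling, and intensity range are hypothetical assumptions, not parameters from the project:

```python
def vibration_command(heading_error_deg: float, max_intensity: int = 255) -> tuple[int, int]:
    """Map a heading error (degrees; positive = target lies to the right)
    to (left, right) vibration motor intensities in 0..max_intensity.

    The thresholds below are illustrative, not taken from LOMOBI.
    """
    DEAD_ZONE = 5.0    # degrees within which no correction is signalled
    FULL_SCALE = 45.0  # error at which the motor reaches full intensity

    if abs(heading_error_deg) <= DEAD_ZONE:
        return (0, 0)  # roughly on course: no vibration

    # Scale the error linearly and saturate at full intensity.
    level = min(abs(heading_error_deg) / FULL_SCALE, 1.0)
    intensity = int(round(level * max_intensity))

    # Vibrate on the side the person should turn towards.
    return (0, intensity) if heading_error_deg > 0 else (intensity, 0)
```

In a real system such a function would sit between the environment detection (which estimates the heading error) and the motor driver; the mapping itself (linear vs. pulsed, one motor vs. several) is a design question the project would have to evaluate with users.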
Maps available online, such as those from OpenStreetMap, provide a wealth of information that can be used to navigate and assist those affected. The long-term goal is to enable visually impaired people to navigate unfamiliar environments using external map material. This information is useful not only for the navigation of visually impaired people but also for intelligent wheelchairs.
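To give an idea of what such map data looks like, the sketch below extracts pedestrian crossings with tactile paving from a fragment of OpenStreetMap XML. The tag keys (`highway=crossing`, `tactile_paving=yes`) are real OSM tagging conventions; the embedded XML snippet itself is a made-up example:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment in OpenStreetMap's XML format.
OSM_XML = """<osm version="0.6">
  <node id="1" lat="50.94" lon="6.96">
    <tag k="highway" v="crossing"/>
    <tag k="tactile_paving" v="yes"/>
  </node>
  <node id="2" lat="50.95" lon="6.97">
    <tag k="highway" v="traffic_signals"/>
  </node>
</osm>"""

def accessible_crossings(xml_text: str) -> list[dict]:
    """Return nodes tagged as crossings with tactile paving."""
    root = ET.fromstring(xml_text)
    results = []
    for node in root.iter("node"):
        tags = {t.get("k"): t.get("v") for t in node.iter("tag")}
        if tags.get("highway") == "crossing" and tags.get("tactile_paving") == "yes":
            results.append({
                "id": node.get("id"),
                "lat": float(node.get("lat")),
                "lon": float(node.get("lon")),
            })
    return results
```

In the project itself, such extracted features would be published through a ROS interface so that the navigation stack can plan routes that favour accessible crossings.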
As a first step, the aim of this work is to evaluate relevant and available data for navigation and for the support of impaired persons. Based on this, a software module for the Robot Operating System (ROS) will be implemented that provides the relevant information. Depending on prior knowledge, the work may also extend the navigation on the basis of these data and improve the localisation.
Notes / Requirements
Basic programming skills are required. Since the software module is to be implemented in ROS, prior knowledge of C++ or Python and of ROS is advantageous but not strictly necessary. The scope and objectives can be adapted to prior knowledge.
Further information is available on request by email.