A Wearable Navigation Device for Visually Impaired People Based on the Real-Time Semantic Visual SLAM System
Field | Value |
---|---|
Content Provider | MDPI |
Author | Chen, Zhuo; Liu, Xiao Ming; Kojima, Masaru; Huang, Qiang; Arai, Tatsuo |
Copyright Year | 2021 |
Description | Wearable auxiliary devices for visually impaired people are a highly attractive research topic. Although many proposed wearable navigation devices can assist visually impaired people with obstacle avoidance and navigation, these devices cannot feed back detailed information about obstacles or help the visually impaired understand the environment. In this paper, we propose a wearable navigation device for the visually impaired that integrates semantic visual SLAM (Simultaneous Localization And Mapping) with a newly launched, powerful mobile computing platform. The system uses an image-depth (RGB-D) camera based on structured light as the sensor and the mobile computing platform as the control center. We also focus on combining SLAM with the extraction of semantic information from the environment, which allows the computing platform to understand the surroundings in real time and feed them back to the visually impaired user in the form of a voice broadcast. Finally, we tested the performance of the proposed semantic visual SLAM system on this device. The results indicate that the system can run in real time on a wearable navigation device with sufficient accuracy. |
Starting Page | 1536 |
e-ISSN | 1424-8220 |
DOI | 10.3390/s21041536 |
Journal | Sensors |
Issue Number | 4 |
Volume Number | 21 |
Language | English |
Publisher | MDPI |
Publisher Date | 2021-02-23 |
Access Restriction | Open |
Subject Keyword | Sensors; Industrial Engineering; Wearable Device; Semantic Segmentation; SLAM; Assistance for Visually Impaired People; Localization; Semantic Map |
Content Type | Text |
Resource Type | Article |
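
The description above outlines the device's pipeline: an RGB-D camera supplies color and depth frames, a semantic segmentation network labels the scene, and the result is fed back to the user as a voice broadcast. Below is a minimal illustrative sketch of that per-frame loop, not the authors' implementation: the label set, the `segment_frame` and `announce` placeholders, and the 1.5 m warning threshold are all assumptions made for this example, and the paper's actual system additionally runs full semantic visual SLAM on the mobile computing platform to build a labeled map.

```python
import numpy as np

# Hypothetical label set used only for this sketch; the paper's
# segmentation network defines its own set of classes.
LABELS = {0: "background", 1: "person", 2: "door", 3: "chair"}


def segment_frame(rgb: np.ndarray) -> np.ndarray:
    """Placeholder for the semantic segmentation network.

    The real system runs a CNN on the mobile computing platform; here a
    dummy label map (a central "person" region) keeps the sketch
    self-contained and runnable.
    """
    h, w, _ = rgb.shape
    labels = np.zeros((h, w), dtype=np.int32)
    labels[h // 3: 2 * h // 3, w // 3: 2 * w // 3] = 1
    return labels


def announce(message: str) -> None:
    """Stand-in for the voice-broadcast (text-to-speech) module."""
    print(f"[VOICE] {message}")


def process_rgbd(rgb: np.ndarray, depth_m: np.ndarray,
                 warn_distance_m: float = 1.5) -> None:
    """Announce labeled objects closer than `warn_distance_m` meters."""
    labels = segment_frame(rgb)
    for label_id, name in LABELS.items():
        if label_id == 0:
            continue
        mask = labels == label_id
        if not mask.any():
            continue
        # Median depth over the object's pixels is a simple
        # single-frame distance estimate.
        distance = float(np.median(depth_m[mask]))
        if distance < warn_distance_m:
            announce(f"{name} ahead, about {distance:.1f} meters")


if __name__ == "__main__":
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)        # dummy RGB frame
    depth = np.full((480, 640), 1.2, dtype=np.float32)   # dummy depth in meters
    process_rgbd(rgb, depth)   # -> "[VOICE] person ahead, about 1.2 meters"
```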