
Real-Time Sign Detection for Accessible Indoor Navigation

Indoor navigation is a major challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks, and structural features that sighted people rely on for wayfinding. We describe a new approach to recognizing and analyzing informational signs, such as exit and restroom signs, in a building. This approach will be incorporated into iNavigate, a smartphone app we are developing that provides accessible indoor navigation assistance. The app combines a digital map of the environment with computer vision and inertial sensing to estimate the user's location on the map in real time. Our new approach can recognize and analyze any sign from a small number of training images, and multiple types of signs can be processed simultaneously in each video frame. In addition to estimating the distance to each detected sign, we can also estimate its approximate orientation (indicating whether the sign is viewed head-on or obliquely), which improves localization performance in challenging conditions. We evaluate the performance of our approach on four sign types distributed among multiple floors of an office building.
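The abstract does not specify how distance and orientation are computed, but a standard way to recover both quantities for a planar sign of known physical size is the pinhole-camera model: distance follows from the detected bounding-box height, and the apparent width shrinks roughly by the cosine of the yaw angle. The sketch below is an illustrative assumption, not the authors' actual method; the function name, parameters, and the sign dimensions are hypothetical.

```python
import math

def estimate_distance_and_yaw(bbox_w_px, bbox_h_px,
                              sign_w_m, sign_h_m,
                              focal_px):
    """Illustrative pinhole-camera sketch (not the paper's method):
    estimate range and approximate yaw of a planar sign of known
    physical size from its detected bounding box.

    The box height is nearly invariant under a yaw (left-right)
    rotation, so it drives the distance estimate; the apparent
    width shrinks roughly by cos(yaw)."""
    # Distance from similar triangles: Z = f * H / h_pixels.
    distance_m = focal_px * sign_h_m / bbox_h_px
    # Compare apparent aspect ratio with the frontal aspect ratio.
    frontal_aspect = sign_w_m / sign_h_m
    apparent_aspect = bbox_w_px / bbox_h_px
    # Clamp to 1.0 to guard against detection noise making the
    # apparent box slightly wider than a head-on view would be.
    ratio = min(apparent_aspect / frontal_aspect, 1.0)
    yaw_deg = math.degrees(math.acos(ratio))  # 0 = head-on view
    return distance_m, yaw_deg

# Example: a hypothetical 0.30 m x 0.15 m sign, focal length 1000 px.
# Head-on at 2 m it projects to a 150 x 75 px box; at 60 degrees of
# yaw the apparent width halves while the height stays the same.
d0, y0 = estimate_distance_and_yaw(150, 75, 0.30, 0.15, 1000)
d1, y1 = estimate_distance_and_yaw(75, 75, 0.30, 0.15, 1000)
```

A full implementation would also need the camera intrinsics (obtainable from calibration or, on a smartphone, from the platform camera API) and would likely use the sign's detected corner points rather than an axis-aligned box, since a box conflates yaw with pitch and in-plane roll.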
