
Showing papers on "Turn-by-turn navigation" published in 2018


Proceedings ArticleDOI
19 Apr 2018
TL;DR: This work proposes two measures of path-following behavior, deviation from optimal route and trajectory variability, and identifies relationships between these measures and elements of the environment, route characteristics, localization error, and the instructional cues that users receive.
Abstract: Indoor localization technologies can enhance quality of life for blind people by enabling them to independently explore and navigate indoor environments. Researchers typically evaluate their systems in terms of localization accuracy and user behavior along planned routes. We propose two measures of path-following behavior: deviation from optimal route and trajectory variability. Through regression analysis of real-world trajectories from blind users, we identify relationships between a) these measures and b) elements of the environment, route characteristics, localization error, and instructional cues that users receive. Our results provide insights into path-following behavior for turn-by-turn indoor navigation and have implications for the design of future interactions. Moreover, our findings highlight the importance of reporting these environmental factors and route properties in similar studies. We present automated and scalable methods for their calculation and encourage their reporting, enabling better interpretation and comparison of results across future studies.
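
To make the two proposed measures concrete, here is a minimal Python sketch of one plausible formulation: deviation as the mean lateral offset of the walked trajectory from the planned route polyline, and variability as the spread of those offsets. The function names and exact definitions are illustrative assumptions, not the paper's formulas.

    import numpy as np

    def point_to_segment_distance(p, a, b):
        # Clamped perpendicular distance from point p to segment a-b.
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / (np.dot(ab, ab) + 1e-12), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))

    def lateral_offsets(trajectory, route):
        # Distance from each trajectory point to the nearest route segment.
        segments = list(zip(route[:-1], route[1:]))
        return np.array([min(point_to_segment_distance(p, a, b) for a, b in segments)
                         for p in trajectory])

    def deviation_from_optimal_route(trajectory, route):
        # One plausible definition: mean lateral offset from the planned route.
        return lateral_offsets(trajectory, route).mean()

    def trajectory_variability(trajectory, route):
        # One plausible definition: spread of the lateral offsets.
        return lateral_offsets(trajectory, route).std()

    # Example: a user weaving along a straight 10 m corridor segment.
    route = np.array([[0.0, 0.0], [10.0, 0.0]])
    trajectory = np.array([[x, 0.3 * np.sin(x)] for x in np.linspace(0.0, 10.0, 50)])
    print(deviation_from_optimal_route(trajectory, route))
    print(trajectory_variability(trajectory, route))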

43 citations


Proceedings ArticleDOI
08 Oct 2018
TL;DR: This contribution analyzes a dataset of indoor trajectories of 11 blind participants guided along three routes through a multi-story shopping mall using NavCog, a turn-by-turn smartphone navigation assistant, and finds that participants extend rotations by 17° on average.
Abstract: Navigation assistive technologies aim to improve the mobility of blind or visually impaired people. In particular, turn-by-turn navigation assistants provide sequential instructions to enable autonomous guidance towards a destination. A problem frequently addressed in the literature is to obtain accurate position and orientation of the user during such guidance. An orthogonal challenge, often overlooked in the literature, is how precisely navigation instructions are followed by users. In particular, imprecisions in following rotation instructions lead to rotation errors that can significantly affect navigation. Indeed, a relatively small error during a turn is amplified by the following frontal movement and can lead the user towards incorrect or dangerous paths. In this contribution, we study rotation errors and their effect on turn-by-turn guidance for individuals with visual impairments. We analyze a dataset of indoor trajectories of 11 blind participants guided along three routes through a multi-story shopping mall using NavCog, a turn-by-turn smartphone navigation assistant. We find that participants extend rotations by 17° on average. The error is not proportional to the expected rotation; instead, it is accentuated for "slight turns" (22.5°-60°), while "ample turns" (60°-120°) are consistently approximated to 90°. We generalize our findings as design considerations for engineering navigation assistance in real-world scenarios.
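
As a concrete illustration of the rotation-error analysis, the following Python sketch computes a signed over-rotation relative to an instructed turn and bins turns into the "slight" and "ample" ranges quoted above. The helper names and sign convention are assumptions for illustration, not the paper's code.

    import numpy as np

    def wrap_angle(deg):
        # Wrap an angle difference into [-180, 180) degrees.
        return (deg + 180.0) % 360.0 - 180.0

    def turn_extension(instructed_deg, performed_deg):
        # Signed over-rotation in the direction of the instructed turn:
        # positive means the user extended (over-rotated) the turn.
        return np.sign(instructed_deg) * wrap_angle(performed_deg - instructed_deg)

    def turn_category(instructed_deg):
        # Bin a turn by magnitude into the ranges used in the abstract.
        magnitude = abs(instructed_deg)
        if 22.5 <= magnitude < 60.0:
            return "slight"
        if 60.0 <= magnitude <= 120.0:
            return "ample"
        return "other"

    # Example: a 45° right turn performed as 62° matches the reported
    # average over-rotation of roughly 17°.
    print(turn_extension(45.0, 62.0), turn_category(45.0))  # 17.0 slight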

26 citations


Journal ArticleDOI
18 Sep 2018
TL;DR: A data-driven analysis of reaction variability as defined by motion and timing measures finds significant variability between users in their reaction characteristics during real-world navigation with a deployed system.
Abstract: 'Turn slightly to the left,' the navigational system announces, with the aim of directing a blind user to merge into a corridor. Yet, due to a long reaction time, the user turns too late and proceeds into the wrong hallway. Observations of such user behavior in real-world navigation settings motivate us to study the manner in which blind users react to the instructional feedback of a turn-by-turn guidance system. We find little previous work analyzing the extent of the variability among blind users in their reactions to different instructional guidance during assisted navigation. To gain insight into how navigational interfaces can be better designed to accommodate the information needs of different users, we conduct a data-driven analysis of reaction variability as defined by motion and timing measures. Based on continuously tracked user motion during real-world navigation with a deployed system, we find significant variability between users in their reaction characteristics. Specifically, the statistical analysis reveals significant variability during the crucial elements of the navigation (e.g., turning and encountering obstacles). With the end-user experience in mind, we identify the need to adjust interface timing and content not only to each user's personal walking pace, but also to their individual navigation skill and style. The design implications of our study inform the development of assistive systems which consider such user-specific behavior to ensure successful navigation.
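
One plausible way to operationalize such a timing measure is sketched below in Python: reaction time estimated as the delay between an instruction's announcement and the first sustained change in the user's tracked heading. The threshold, window size, and function names are assumptions, not the study's actual parameters.

    import numpy as np

    def reaction_time(timestamps, headings, t_instruction,
                      threshold_deg=10.0, window=3):
        # Seconds from t_instruction until the heading departs from its value
        # at the instruction by more than threshold_deg for `window` samples.
        start = int(np.searchsorted(timestamps, t_instruction))
        baseline = headings[start]
        run = 0
        for i in range(start, len(headings)):
            change = abs((headings[i] - baseline + 180.0) % 360.0 - 180.0)
            if change > threshold_deg:
                run += 1
                if run >= window:
                    return timestamps[i - window + 1] - t_instruction
            else:
                run = 0
        return None  # no visible reaction within the trace

    # Example: heading sampled at 10 Hz; instruction announced at t = 1.0 s,
    # user begins a 45° turn at t = 2.5 s, giving a ~1.5 s reaction delay.
    t = np.arange(0.0, 5.0, 0.1)
    h = np.where(t < 2.5, 0.0, 45.0)
    print(reaction_time(t, h, 1.0))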

15 citations


Proceedings ArticleDOI
01 Dec 2018
TL;DR: This system provides an efficient approach to translating natural language directions into a machine-understandable format and will benefit the development of voice-based, navigation-oriented human-machine interfaces.
Abstract: In a highly evolving technical era, voice-based navigation systems play a major role in bridging the gap between human and machine. To overcome the difficulties of capturing and understanding a user's voice commands, interpreting natural language, computing a route from the user's turn-by-turn directions while recognizing key entities such as street names, landmarks, points of interest, and junctions, and mapping the route in an interactive interface, we propose a user-centric roadmap navigation mobile application called "Direct Me". To generate the user's preferred route, the system first converts the audio stream to text with an Automatic Speech Recognizer (ASR) built on the PocketSphinx library, then applies Natural Language Processing (NLP) with the Stanford CoreNLP framework to retrieve navigation-related information, and finally renders the route on the map with the Google Maps API upon the user's request. This system provides an efficient approach to translating natural language directions into a machine-understandable format and will benefit the development of voice-based, navigation-oriented human-machine interfaces.
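
For readers curious how the described pipeline fits together, here is a rough Python sketch wiring the three named components: PocketSphinx for speech recognition, Stanford CoreNLP for extracting location entities, and the Google Maps Directions API for routing. The paper describes a mobile application, so this desktop sketch only illustrates the data flow; the API key, origin address, and entity types are placeholders, and the microphone loop assumes the older pocketsphinx Python bindings.

    import googlemaps                        # Google Maps web-service client
    from pocketsphinx import LiveSpeech      # older pocketsphinx Python bindings
    from stanza.server import CoreNLPClient  # requires a local CoreNLP install

    def extract_places(text):
        # Pull location-like named entities out of a spoken command.
        with CoreNLPClient(annotators=["tokenize", "ssplit", "ner"],
                           be_quiet=True) as nlp:
            annotation = nlp.annotate(text)
            return [mention.entityMentionText
                    for sentence in annotation.sentence
                    for mention in sentence.mentions
                    if mention.entityType in ("LOCATION", "CITY", "ORGANIZATION")]

    gmaps = googlemaps.Client(key="YOUR_API_KEY")  # placeholder API key

    # LiveSpeech yields decoded utterances from the default microphone.
    for utterance in LiveSpeech():
        places = extract_places(str(utterance))  # e.g. "take me to Central Park"
        if places:
            # Placeholder origin; a real app would use the device's location.
            routes = gmaps.directions("Times Square, New York, NY",
                                      places[0], mode="walking")
            for step in routes[0]["legs"][0]["steps"]:
                print(step["html_instructions"])  # turn-by-turn directions
            break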

7 citations


