Author

Yong Hee Jung

Bio: Yong Hee Jung is an academic researcher from Brunel University London. The author has contributed to research in the topics of gesture recognition and mobile devices. The author has an h-index of 1 and has co-authored 1 publication receiving 2 citations.

Papers
Proceedings Article
21 Nov 2011
TL;DR: A focus group of people with dyslexia and other specific learning difficulties is used to design user-defined gesture sets, and it is suggested that the results could help people with communication impairments interact with others in an intuitive and socially acceptable manner.
Abstract: Modern smartphones contain sophisticated sensors that monitor three-dimensional movements of the device and users' behaviours. These sensors allow mobile devices to recognize motion gestures. However, only a few gesture sets have been created, and little is known about best practices in motion-gesture design. Moreover, the existing gesture sets were elicited from people who rarely need motion gestures in their daily lives, not from people with communication difficulties. To address this issue, we worked with a focus group of people with dyslexia and other specific learning difficulties to design user-defined gesture sets. This paper presents the results of our study, which elicited the focus group's gestures for invoking commands on a smartphone device. It demonstrates how the gesture sets were designed and finalised through complementary research activities such as observation and interviews. Finally, we suggest that our results could help people with communication impairments interact with others conveniently, in an intuitive and socially acceptable manner.
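As a concrete illustration of the sensing the abstract describes, below is a minimal sketch (not from the paper) of recognizing one simple motion gesture, a shake, from a stream of 3-axis accelerometer samples. The sample format and the 2.5 g threshold are illustrative assumptions, not the study's design.

```python
import math

# Minimal sketch (not from the paper): detect a "shake" motion gesture
# from a stream of 3-axis accelerometer samples. The sample format
# (x, y, z) in units of g and the 2.5 g threshold are assumptions.

SHAKE_THRESHOLD_G = 2.5  # assumed magnitude above which we call it a shake


def detect_shake(samples):
    """Return True if any sample's acceleration magnitude exceeds the threshold."""
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > SHAKE_THRESHOLD_G:
            return True
    return False


# Example: a device at rest (~1 g) followed by a sharp flick.
stream = [(0.0, 0.0, 1.0), (0.1, -0.2, 1.1), (2.1, 1.8, 0.4)]
print(detect_shake(stream))  # True
```

A real gesture recognizer would smooth the signal and match movement patterns over time rather than thresholding single samples, but the principle, mapping raw sensor readings to discrete commands, is the same.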

2 citations


Cited by
Journal Article
TL;DR: This work implemented motion-gesture-based interfaces with speech and vibration feedback for browsing phone books and making calls, and found that motion gesture interfaces are more efficient than traditional button interfaces.
Abstract: Despite the existence of advanced functions in smartphones, most blind people are still using old-fashioned phones with familiar layouts and dependence on tactile buttons. Smartphones support accessibility features including vibration, speech and sound feedback, and screen readers. However, these features are only intended to provide feedback to user commands or input. It is still a challenge for blind people to discover functions on the screen and to input commands. Although voice commands are supported in smartphones, these commands are difficult for a system to recognize in noisy environments. At the same time, smartphones are integrated with sophisticated motion sensors, and motion gestures with device tilt have been gaining attention for eyes-free input. We believe that these motion gesture interactions offer more efficient access to smartphone functions for blind people. However, most blind people are not smartphone users, and they are aware of neither the affordances available in smartphones nor the potential for interaction through motion gestures. To investigate the most usable gestures for blind people, we conducted a user-defined study with 13 blind participants. Using the gesture set and design heuristics from the user study, we implemented motion gesture based interfaces with speech and vibration feedback for browsing phone books and making a call. We then conducted a second study to investigate the usability of the motion gesture interface and user experiences using the system. The findings indicated that motion gesture interfaces are more efficient than traditional button interfaces. Through the study results, we provided implications for designing smartphone interfaces.
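To make the tilt-based, eyes-free interaction concrete, here is a minimal sketch, not the authors' implementation, that maps device tilt angles to phone-book navigation commands. The command names and the 30-degree threshold are assumptions for illustration.

```python
# Minimal sketch (not the paper's implementation): map device tilt to
# eyes-free phone-book navigation commands. The 30-degree threshold and
# the command names are illustrative assumptions.

TILT_THRESHOLD_DEG = 30.0  # assumed tilt angle that triggers a command


def tilt_to_command(pitch_deg, roll_deg):
    """Map device tilt angles (degrees) to a navigation command, or None."""
    if pitch_deg > TILT_THRESHOLD_DEG:
        return "NEXT_CONTACT"      # tilt away from the user
    if pitch_deg < -TILT_THRESHOLD_DEG:
        return "PREVIOUS_CONTACT"  # tilt toward the user
    if roll_deg > TILT_THRESHOLD_DEG:
        return "CALL_CONTACT"      # tilt right to confirm
    return None                    # within the dead zone: no command


# In an eyes-free interface, each recognized command would be confirmed
# with speech and vibration feedback, as the abstract describes.
print(tilt_to_command(45.0, 0.0))   # NEXT_CONTACT
print(tilt_to_command(0.0, 40.0))   # CALL_CONTACT
```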

31 citations

Journal Article
TL;DR: A systematic mapping of the literature is presented to identify research initiatives regarding the use of mobile devices and AAC solutions, pointing to opportunities and challenges in this research domain, with emphasis on the need to promote the use and effective adoption of assistive technology.
Abstract: Verbal communication is essential for socialization, meaning construction and knowledge sharing in a society. When verbal communication does not occur naturally because of constraints in people's capabilities or in their environments, it is necessary to design alternative means. Augmentative and Alternative Communication (AAC) aims to complement or replace speech to compensate for difficulties of verbal expression. AAC systems can provide technological support for people with speech disorders, assisting in inclusion, learning and the sharing of experiences. This paper presents a systematic mapping of the literature to identify research initiatives regarding the use of mobile devices and AAC solutions. The search identified 1366 potentially eligible scientific articles published between 2006 and 2016, indexed by the ACM, IEEE, Science Direct, and Springer databases and by the SBC Journal on Interactive Systems. From the retrieved papers, 99 were selected and categorized into themes of research interest: games, autism, usability, assistive technology, AAC, computer interfaces, interaction on mobile devices, education, among others. Most of the papers (57 out of 99) presented some form of interaction via mobile devices, and 46 papers were related to assistive technology, of which 14 were related to AAC. The results offer an overview of applied research on mobile devices for AAC, pointing to opportunities and challenges in this research domain, with emphasis on the need to promote the use and effective adoption of assistive technology.

15 citations