Novel continuous authentication using biometrics
01 Nov 2017 - Vol. 263, Iss. 4, pp. 042041
TL;DR: In this paper, the authors propose a new way of authenticating a user that captures many images of the user at random times and analyses the user's touch biometric behavior, although the system is not suitable for relying on face recognition alone.
Abstract: We explore whether a classifier can consistently verify clients interacting with the computer using the camera and the behavior of users. In this paper we propose a new way of authenticating a user that captures many images of the user at random times and analyses the user's touch biometric behavior. In this system's experiment, the touch behavior of a client/user is stored in the database during an enrollment stage, and its mean behavior is checked over equal partitions of time. This touch behavior is able to accept or reject the user, which makes the biometric more accurate in use. The planned workflow is as follows: the user is asked a single time, before login, for permission to take a picture. The system then takes images of the user automatically, without further prompting, and stores them in the database. These images are compared against the user's existing image, and acceptance or rejection depends on that comparison. The user's touch behavior is also stored continuously as the number of touches made in equal intervals of time. Together, this touch behavior and the images finally authenticate the user automatically.
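The acceptance rule described above — compare freshly captured images against the enrolled image, and compare the current touch rate against the stored mean touch rate per time partition — can be sketched as a simple decision function. The thresholds and the `face_similarity` input are hypothetical; a real system would obtain that score from a face-recognition model:

```python
def authenticate(face_similarity, touches_per_interval, enrolled_mean,
                 face_threshold=0.8, touch_tolerance=0.5):
    """Accept the session only if the captured face matches the enrolled
    image AND the current touch rate stays near the enrolled mean rate."""
    face_ok = face_similarity >= face_threshold
    deviation = abs(touches_per_interval - enrolled_mean) / enrolled_mean
    touch_ok = deviation <= touch_tolerance
    return face_ok and touch_ok

print(authenticate(0.92, touches_per_interval=18, enrolled_mean=20))  # both checks pass
print(authenticate(0.92, touches_per_interval=45, enrolled_mean=20))  # touch rate deviates too far
```

Requiring both checks to pass mirrors the abstract's design: the face comparison gates the login, while the touch-rate check keeps verifying the user afterward.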
TL;DR: A classification framework that learns the touch behavior of a user during an enrollment phase and is able to accept or reject the current user by monitoring interaction with the touch screen is proposed.
Abstract: We investigate whether a classifier can continuously authenticate users based on the way they interact with the touchscreen of a smart phone. We propose a set of 30 behavioral touch features that can be extracted from raw touchscreen logs and demonstrate that different users populate distinct subspaces of this feature space. In a systematic experiment designed to test how this behavioral pattern exhibits consistency over time, we collected touch data from users interacting with a smart phone using basic navigation maneuvers, i.e., up-down and left-right scrolling. We propose a classification framework that learns the touch behavior of a user during an enrollment phase and is able to accept or reject the current user by monitoring interaction with the touch screen. The classifier achieves a median equal error rate of 0% for intrasession authentication, 2%-3% for intersession authentication, and below 4% when the authentication test was carried out one week after the enrollment phase. While our experimental findings disqualify this method as a standalone authentication mechanism for long-term authentication, it could be implemented as a means to extend screen-lock time or as a part of a multimodal biometric authentication system.
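The accuracy figures above are equal error rates (EER): the operating point at which the false-reject rate (genuine users rejected) equals the false-accept rate (impostors accepted). A minimal stdlib-only sketch of how an EER is read off from classifier scores; the score lists are invented for illustration:

```python
def equal_error_rate(genuine, impostor):
    """Scan candidate thresholds and return the error rate at the point
    where false-reject rate (genuine scores below the threshold) and
    false-accept rate (impostor scores at/above it) are closest."""
    thresholds = sorted(set(genuine) | set(impostor))
    best_gap, eer = 1.0, None
    for t in thresholds:
        frr = sum(s < t for s in genuine) / len(genuine)
        far = sum(s >= t for s in impostor) / len(impostor)
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2
    return eer

# Hypothetical similarity scores: higher means "more like the enrolled user".
genuine = [0.91, 0.85, 0.88, 0.79, 0.95, 0.83]
impostor = [0.30, 0.45, 0.52, 0.81, 0.28, 0.40]
print(round(equal_error_rate(genuine, impostor), 3))
```

A median EER of 0% intrasession, as reported above, means the genuine and impostor score distributions did not overlap at all within a session.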
TL;DR: The results suggest that this is due to the ability of HMOG features to capture distinctive body movements caused by walking, in addition to the hand-movement dynamics from taps.
Abstract: We introduce hand movement, orientation, and grasp (HMOG), a set of behavioral features to continuously authenticate smartphone users. HMOG features unobtrusively capture subtle micro-movement and orientation dynamics resulting from how a user grasps, holds, and taps on the smartphone. We evaluated authentication and biometric key generation (BKG) performance of HMOG features on data collected from 100 subjects typing on a virtual keyboard. Data were collected under two conditions: 1) sitting and 2) walking. We achieved authentication equal error rates (EERs) as low as 7.16% (walking) and 10.05% (sitting) when we combined HMOG, tap, and keystroke features. We performed experiments to investigate why HMOG features perform well during walking. Our results suggest that this is due to the ability of HMOG features to capture distinctive body movements caused by walking, in addition to the hand-movement dynamics from taps. With BKG, we achieved the EERs of 15.1% using HMOG combined with taps. In comparison, BKG using tap, key hold, and swipe features had EERs between 25.7% and 34.2%. We also analyzed the energy consumption of HMOG feature extraction and computation. Our analysis shows that HMOG features extracted at a 16-Hz sensor sampling rate incurred a minor overhead of 7.9% without sacrificing authentication accuracy. Two points distinguish our work from current literature: 1) we present the results of a comprehensive evaluation of three types of features (HMOG, keystroke, and tap) and their combinations under the same experimental conditions and 2) we analyze the features from three perspectives (authentication, BKG, and energy consumption on smartphones).
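HMOG features are statistics of accelerometer and gyroscope signals in a window around each tap. A toy illustration of the idea, stdlib only; the window contents are invented and the paper's actual feature set is far richer than this:

```python
import math

def tap_window_features(samples):
    """Given (x, y, z) accelerometer readings in a window around a tap,
    return the mean and standard deviation of the acceleration magnitude,
    a crude proxy for the micro-movement intensity HMOG captures."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, math.sqrt(var)

# Hypothetical readings in m/s^2 (gravity dominates the z axis).
window = [(0.1, 0.0, 9.8), (0.3, -0.1, 9.9), (0.2, 0.1, 9.7)]
mean, std = tap_window_features(window)
print(round(mean, 2), round(std, 3))
```

The finding above that a 16-Hz sampling rate suffices matters here: fewer samples per window means cheaper feature extraction, which is where the reported 7.9% energy overhead comes from.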
TL;DR: A continuous behaviometric authentication system is tested on 99 users over 10 weeks, focusing on keystroke dynamics, mouse movements, application usage, and the system footprint, with no false rejects.
Abstract: A continuous behaviometric authentication system is tested on 99 users over 10 weeks, focusing on keystroke dynamics, mouse movements, application usage, and the system footprint. In the process, a new trust model was created to enable continuous evaluation of the behaviometric data. Tests targeted keystroke dynamics, mouse movements, application usage, and the system footprint. Keystroke dynamics was found most appropriate for continuous behaviometric authentication, with no false rejects. This article is part of a special issue on security.
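The trust model mentioned above continuously raises or lowers a trust score as behaviometric samples arrive, locking the session only when trust falls below a threshold. One plausible stdlib-only sketch; the reward, penalty, and lockout values are invented for illustration:

```python
def update_trust(trust, sample_matches, reward=2.0, penalty=10.0,
                 floor=0.0, ceiling=100.0):
    """Raise trust on a matching keystroke-dynamics sample, lower it on a
    mismatch; clamp the score to [floor, ceiling]."""
    trust += reward if sample_matches else -penalty
    return max(floor, min(ceiling, trust))

LOCK_BELOW = 60.0
trust = 100.0
# A run of mismatching samples gradually erodes trust until lockout.
for matches in [True, False, False, True, False, False, False]:
    trust = update_trust(trust, matches)
    if trust < LOCK_BELOW:
        print(f"session locked at trust={trust}")
        break
```

Accumulating evidence this way, rather than rejecting on a single bad sample, is what lets such a system report no false rejects: one atypical keystroke burst only dents the score.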