Technical Report WM-CS-2012-06
College of William & Mary
Department of Computer Science

You Are How You Touch:
User Verification on Smartphones via Tapping Behaviors
Nan Zheng, Kun Bai, Hai Huang, and Haining Wang
December 06, 2012

You Are How You Touch:
User Verification on Smartphones via Tapping Behaviors

Nan Zheng*, Kun Bai†, Hai Huang†, and Haining Wang*
*College of William and Mary, Williamsburg, VA, USA
†IBM T.J. Watson Research Center, Hawthorne, NY, USA
{nzheng,hnw}@cs.wm.edu    {kunbai,haih}@us.ibm.com
ABSTRACT
Smartphone users have their own unique behavioral patterns when tapping on the touch screens. These personal patterns are reflected in the different rhythm, strength, and angle preferences of the applied force. Since smartphones are equipped with various sensors like accelerometer, gyroscope, and touch screen sensors, capturing a user's tapping behaviors can be done seamlessly. Exploiting the combination of four features (acceleration, pressure, size, and time) extracted from smartphone sensors, we propose a non-intrusive user verification mechanism to substantiate whether an authenticating user is the true owner of the smartphone or an impostor who happens to know the passcode. Based on the tapping data collected from over 80 users, we conduct a series of experiments to validate the efficacy of our proposed system. Our experimental results show that our verification system achieves high accuracy with averaged equal error rates of down to 3.65%. As our verification system can be seamlessly integrated with the existing user authentication mechanisms on smartphones, its deployment and usage are transparent to users and do not require any extra hardware support.
Keywords
Smartphone, tapping behavior, user verification
1. INTRODUCTION
Smartphones have become ubiquitous computing platforms
allowing users anytime access to the Internet and many on-
line services. On one hand, as a personal device, a smart-
phone contains important private information, such as text
messages, always-logged-in emails, and contact list. On the
other hand, as a portable device, a smartphone is much eas-
ier to get lost or stolen than conventional computing plat-
forms. Therefore, to prevent the private information stored
in smartphones from falling into the hands of adversaries,
user authentication mechanisms have been integrated into
mobile OSes like Android and iOS.
Due to having a much smaller screen and keyboard on a
smartphone than the traditional user input/output devices,
PIN-based and pattern-based passcode systems have been
widely used in smartphones for user authentication. How-
ever, many people tend to choose weak passcodes for ease of
memorization. A 2011 survey on iPhone 4-digit passcode re-
veals that the ten most popular passcodes represent 15% of
all 204,508 passcodes and the top three are 1234, 0000, and
2580 [1]. Moreover, recent studies show that an attacker can
detect the location of screen taps on smartphones based on accelerometer and gyroscope readings and then derive the letters or numbers on the screen [6, 23, 27, 32]. An attacker could even exploit the oily residues left on the screen of a smartphone to derive the passcode [2]. Therefore, it is highly desirable to enhance the smartphone's user authentication with a non-intrusive user verification mechanism, which is user-transparent and is able to further verify if the successfully logged-in user is the true owner of a smartphone.
In this paper, we explore the feasibility of utilizing user
tapping behaviors for user verification in a passcode-enabled
smartphone. The rationale behind our work is that individ-
ual human users have their own unique behavioral patterns
while tapping on the touch screen of a smartphone. In other
words, you are how you touch on the screen, just like you
are how you walk on the street. The rich variety of sensors equipped on a smartphone, including accelerometer, gyroscope, and touch screen sensors, makes it possible to accurately characterize an individual user's tapping behaviors in a fine-grained fashion. With over 80 smartphone users participating in our study, we quantify the user tapping behaviors in four different aspects: acceleration, pressure, size,
and time. Based on the behavioral metrics extracted from
these four features, we apply the one-class learning tech-
nique for building an accurate classifier, which is the core of
our user verification system.
We evaluate the effectiveness of our system through a se-
ries of experiments using the empirical data of both 4-digit
and 8-digit PINs. In terms of accuracy, our approach is
able to classify the legitimate user and impostors with aver-
aged equal error rates of down to 3.65%. Overall, our ver-
ification system can significantly enhance the security of a
smartphone by accurately identifying impostors. Especially
for practical use, our tapping-behavior-based approach is
user-transparent and the usability of traditional passcodes
on a smartphone remains intact. As our approach is non-
intrusive and does not need additional hardware support
from smartphones, it can be seamlessly integrated with the
existing passcode-based authentication systems.
The remainder of the paper is structured as follows. Sec-
tions 2 and 3 review the background and related work in the
area of smartphone user authentication, respectively. Sec-
tion 4 describes our data collection and measurement, in-
cluding our choice of metrics. Section 5 details the proposed
classifier for user verification. Section 6 presents our exper-
imental design and results. Section 7 discusses issues which
arise from the details of our approach, and finally Section 8
concludes.

2. BACKGROUND
The tapping behaviors of individual users on touchscreens vary from person to person due to differences in hand geometry and finger agility. Each user has a unique personal tapping pattern, reflected in the different rhythm, strength, and angle preferences of the applied force. As our tapping-behavior-based approach verifies the owner of a smartphone based on "who you are" (your physical and behavioral traits) instead of "what you know", it belongs to biometrics-based user authentication. In general, a biometrics authentication system authenticates users either by their physiological traits like faces and voices [5, 21] or behavioral patterns like finger typing and hand movements [24, 34].
While physiological traits can achieve high accuracy in the process of user authentication, they have not been widely used in mobile devices. Recent studies have also shown that the physiology-based mechanisms deployed in mobile devices are sensitive to certain environmental factors, which could significantly diminish their accuracy and reliability. For example, face recognition may fail due to a different viewing angle and poor illumination [28], and voice recognition degrades due to background noise [5]. In contrast, given the same mobile device, behavioral biometrics tend to be less sensitive to surrounding environmental factors like darkness or noise.
Exploiting the behavioral information captured by multiple sensors on a smartphone, we can exclusively create a detailed user profile for verifying the owner of the smartphone. Since our approach works seamlessly with the existing passcode-based user authentication mechanisms in mobile devices, it plays the role of implicit authentication. In other words, our approach can act as a second-factor authentication method and supplement the passcode systems for stronger authentication in a cost-effective and user-transparent manner. More recently, a seminal work has explored the feasibility of user verification employing the behaviors of entering pattern-based passwords [12]. However, the false reject rate (FRR) of that work is rather high, which means there is a high chance that the owner of a mobile device would be mistakenly regarded as an impostor and be blocked from accessing the device.
3. RELATED WORK
A) Keystroke Dynamics and Graphical Passwords. Keystroke dynamics has been extensively studied for distinguishing users by the way they type their personal identification number (PIN) based passwords [19]. Research on the analysis of keystroke dynamics for identifying users as they type on a mobile phone can be found in [3, 9, 10, 18, 25, 33]. Clarke et al. [10] considered the dynamics of typing 4-digit PIN codes, in which the researchers achieved an average Equal Error Rate (EER) of 8.5% on the physical keyboard of a Nokia 5110 handset. Zahid et al. [33] examined this approach on touchscreen keyboards and achieved, in one best scenario, a low Equal Error Rate of approximately 2%, with the training set requiring a minimum of 250 keystrokes.
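The equal error rates quoted in this section refer to the operating point at which the false accept rate (FAR) equals the false reject rate (FRR). As an illustration (the scores, threshold sweep, and function name below are ours, not from any cited system), an EER can be estimated from genuine and impostor dissimilarity scores:

```python
def eer(genuine, impostor):
    """Approximate the EER by sweeping the decision threshold over all scores.

    genuine/impostor are dissimilarity scores: the owner should score low,
    impostors high. At a threshold t, a score <= t is accepted.
    """
    best = 1.0
    for t in sorted(genuine + impostor):
        frr = sum(g > t for g in genuine) / len(genuine)     # owner rejected
        far = sum(i <= t for i in impostor) / len(impostor)  # impostor accepted
        best = min(best, max(far, frr))  # closest crossing of the two rates
    return best

# Illustrative scores, not data from any cited study:
print(eer([0.1, 0.2, 0.25, 0.3], [0.28, 0.6, 0.7, 0.9]))  # 0.25
```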
Researchers have also suggested the use of graphical passwords as an easier alternative to text-based passwords [7, 17], based on the idea that people have a better ability to recall images than text. A good overview of popular graphical password schemes has been reported in [4]. Chang et al. [7] proposed a graphics-based password KDA system for touchscreen handheld mobile devices. Their experimental results show an EER of 12.2% for the graphics-based password KDA system, reduced to 6.9% when the pressure feature is used. Various usability studies have outlined the advantages of graphical passwords, such as their reasonable login and creation times, acceptable error rates, good general perception, and reduced interference compared to text passwords, but also their vulnerabilities [31].
B) Inferring Tapped Information from On-board Motion Sensors. Several independent research efforts have found that data acquired by smartphone motion sensors alone is sufficient to infer which part of the screen users tap on [6, 23, 27, 32]. The first effort was by Cai et al. in 2011 [6]. They utilized features from device orientation data on an HTC Evo 4G smartphone, and correctly inferred more than 70% of the keys typed on a number-only soft keyboard. Soon after, Xu et al. exploited further sensor capabilities on smartphones, including accelerometer, gyroscope, and orientation sensors [32]; their evaluation shows accuracies greater than 90% for inferring an 8-digit password within 3 trials. Miluzzo et al. demonstrated another key inference method on the soft keyboards of both smartphones and tablets [23], with 90% or higher accuracy in identifying English letters on smartphones, and 80% on tablets. Owusu et al. [27] infer taps of keys and areas arranged in a 60-region grid, based solely on accelerometer readings on smartphones; their results show that they are able to extract 6-character passwords in as few as 4.5 trials.
C) User Authentication by Their Behavior on Touch Screens. Research has explored different biometric approaches for providing an extra level of security when authenticating users on their mobile devices. Guerra-Casanova et al. [15] proposed a biometric technique based on the idea of authenticating a person on a mobile device by gesture recognition, and achieved an Equal Error Rate (EER) between 2.01% and 4.82% on a base of 100 users. Unobtrusive methods for authentication on mobile smartphones have emerged as an alternative to typed passwords, such as gait biometrics (achieving an EER of 20.1%) [13, 26], or the unique movement users perform when answering or placing a phone call (EER between 4.5% and 9.5%) [11].
Very recently, De Luca et al. [12] introduced an implicit authentication approach that enhances password patterns on Android phones with an additional security layer that is transparent to the user. Their application recorded all data available from the touchscreen: pressure (how hard the finger presses), size (area of the finger touching the screen), x and y coordinates, and time. The evaluation is based on 26 participants, with an average accuracy of 77%.
A recent work by Sae-Bae et al. [29] makes use of the multi-touch screen sensor on the iPad (not a phone) to capture palm movement, achieving a classification accuracy of over 90%. However, palm movements are not suitable for smartphone screens, which are typically too small. Citty et al. [8] presented an alternative approach to inputting PINs on small touchscreen devices: it uses a sequence of 4 partitions of a selection of 16 images, instead of a 4-digit PIN, to increase the possible combinations of authentication sequences. However, inputting the sequence requires extra effort in memorizing the image sequences. Kim et al. [20] introduced and evaluated a number of novel tabletop authentication schemes that exploit the features of multi-touch interaction.

Table 1: Collected Data
PIN              Users  Actions  Avg. Actions/User  Filtered-Out
3-2-4-4          53     1,751    33                 0.80%
1-1-1-1          41     2,577    63                 2.64%
5-5-5-5          42     2,756    66                 3.70%
1-2-5-9-7-3-8-4  27     1,939    72                 7.37%
1-2-5-9-8-4-1-6  25     2,039    82                 4.76%
4. MEASUREMENT AND CHARACTERIZATION
Over 80 participants are involved in our data collection. Five different PINs are tested, three of which are 4-digit and two 8-digit. We choose PINs 3-2-4-4, 1-2-5-9-7-3-8-4, and 1-2-5-9-8-4-1-6 to represent normal cases, and PINs 1-1-1-1 and 5-5-5-5 to represent two extreme cases, one at the corner and the other at the center of the keypad, respectively. Each participant is asked to enter an error-free PIN at least 25 times, and we collect a total of 11,062 error-free actions. The user's timing and motion data are recorded during the process. In this paper, we refer to an action (or user input action) as the process of tapping one full PIN, rather than an individual digit. The detailed information of the collected data is listed in Table 1.
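As a quick arithmetic check on Table 1 (the dictionary below simply restates its rows), the per-PIN action counts sum to the 11,062 total reported above, and dividing actions by users reproduces the per-user averages:

```python
# Per-PIN (users, error-free actions) pairs, copied from Table 1.
collected = {
    "3-2-4-4":         (53, 1751),
    "1-1-1-1":         (41, 2577),
    "5-5-5-5":         (42, 2756),
    "1-2-5-9-7-3-8-4": (27, 1939),
    "1-2-5-9-8-4-1-6": (25, 2039),
}

total_actions = sum(a for _, a in collected.values())
avg_per_user = {pin: round(a / u) for pin, (u, a) in collected.items()}

print(total_actions)            # 11062, matching the text
print(avg_per_user["3-2-4-4"])  # 33, matching Table 1
```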
Figure 1: Screen layout of our data collection application (a), and the two-hand typing action (b).
The timing information has a resolution of milliseconds. Occasionally, some participants fail to make a smooth tapping action, intentionally or unintentionally. Therefore, we apply a simple outlier removal process to all the collected raw data. An outlier tapping action is often signaled by a markedly longer-than-usual time interval, especially for a user who is very familiar with his or her own PIN. In our data set, a smooth PIN tapping action takes at most 600 milliseconds between subsequent keys for all participants. As a result, an inter-key time of greater than one second reliably signals such an outlier behavior. By this standard, a small amount of raw data is filtered out, as listed in the right-most column of Table 1.
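The inter-key rule above can be sketched as follows; the function name and millisecond timestamps are illustrative, not our actual implementation:

```python
def is_outlier(press_times_ms, threshold_ms=1000):
    """Flag a PIN-entry action whose largest gap between consecutive
    key-presses exceeds the one-second threshold described above."""
    gaps = [b - a for a, b in zip(press_times_ms, press_times_ms[1:])]
    return any(g > threshold_ms for g in gaps)

smooth   = [0, 410, 820, 1300]   # largest gap: 480 ms
hesitant = [0, 400, 1900, 2300]  # 1500 ms pause mid-entry

print(is_outlier(smooth), is_outlier(hesitant))  # False True
```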
Table 2: Features of Touchscreen Tapping Behaviors
                                       # of Dimensions
Feature Set    Description             4-digit   8-digit
Acceleration   At TouchDown            8         16
(linear &      At TouchUp              8         16
angular)       Min in key-hold         8         16
               Max in key-hold         8         16
               Mean in key-hold        8         16
Pressure       At TouchDown            4         8
               At TouchUp              4         8
Touched Size   At TouchDown            4         8
               At TouchUp              4         8
Time           Key hold time           4         8
               Inter-key time          3         7
Total          All features            63        127
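The dimension counts in Table 2 follow directly from the PIN length n; this small check (ours, for illustration only) reproduces them:

```python
def feature_count(n):
    """Total feature dimensions for an n-digit PIN action (cf. Table 2)."""
    accel = 5 * 2 * n   # 5 statistics x {linear, angular} per digit
    pressure = 2 * n    # at TouchDown and TouchUp
    size = 2 * n        # at TouchDown and TouchUp
    time = n + (n - 1)  # n key-hold times, n-1 inter-key intervals
    return accel + pressure + size + time

print(feature_count(4), feature_count(8))  # 63 127
```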
All the data are collected on a Samsung Galaxy Nexus. Its fastest sampling rate for motion sensor readings is about 100 Hz. Figure 1(a) shows the layout of our Android application for the data collection. In the experiments, all the participants are asked to hold the phone with their left hands and tap with their right-hand index fingers, as shown in Figure 1(b).
We make use of the Android APIs to detect touch events, including both key-press and key-release. Between each key-press and key-release, we record raw data of timestamps, acceleration, angular acceleration, touched size, and pressure. Acceleration and angular acceleration come from the SensorEvent API, while touched size and pressure come from the MotionEvent API.
4.1 Feature Extraction
Based on the raw data, we compute four sets of features
for each PIN typing action: acceleration, pressure, size, and
time. We describe each of them in the following:
Acceleration: For each digit d in a PIN action, we calculate five acceleration values:
A_{d,1}: the magnitude of acceleration when the digit d is pressed down;
A_{d,2}: the magnitude of acceleration when the digit d is released;
A_{d,3}: the maximum magnitude of acceleration between digit d's key-press and key-release;

Figure 2: An illustration of the two-feature space of a target user and many others. X_1 and X_2 are the two features. The dashed lines define the boundary of the target user's behavior. Because the target user's behavior is limited to a concentrated area, the boundary blocks the majority of potential impostors.
A_{d,4}: the minimum magnitude of acceleration between digit d's key-press and key-release;
A_{d,5}: the average magnitude of acceleration between digit d's key-press and key-release.
All of the above values are magnitudes of acceleration, ||a|| = sqrt(a_x^2 + a_y^2 + a_z^2). We choose not to use individual components, because the phone coordinate system is sensitive to location change. A similar procedure is applied to calculate the features from angular accelerations. Combining both acceleration- and angular-acceleration-related features, there are a total of 40 in a 4-digit PIN action and 80 in an 8-digit PIN action.
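The five per-digit acceleration features can be sketched as below; the field names and the sensor samples are illustrative assumptions, not the paper's code:

```python
import math

def accel_features(samples):
    """samples: (ax, ay, az) readings recorded from key-press to key-release
    of one digit; returns the five per-digit magnitude features."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    return {
        "A1_down": mags[0],              # magnitude at TouchDown
        "A2_up":   mags[-1],             # magnitude at TouchUp
        "A3_max":  max(mags),
        "A4_min":  min(mags),
        "A5_mean": sum(mags) / len(mags),
    }

# A hypothetical key-hold window: gravity plus a small tap impulse.
f = accel_features([(0.0, 0.0, 9.8), (0.6, 0.0, 9.8), (0.0, 0.8, 9.8)])
print(f["A4_min"] <= f["A5_mean"] <= f["A3_max"])  # True
```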
Pressure: We obtain the pressure readings through the Android API MotionEvent.getPressure(). The returned pressure measurements are in an abstract unit, ranging from 0 (no pressure at all) to 1 (normal pressure); however, values higher than 1 can occur depending on the calibration of the input device (according to the Android API documentation). In the feature set, we include pressure readings at both key-press and key-release. There are 8 pressure-related features for a 4-digit PIN, and 16 for an 8-digit PIN.
Size: Similar to the pressure readings, another Android API call, MotionEvent.getSize(), measures the touched size associated with each touch event. According to the Android documentation, it returns a scaled value of the approximate size for the given pointer index, representing the approximation of the screen area being pressed. The actual value in pixels corresponding to the touch is normalized with the device's specific range and is scaled to a value between 0 and 1. For each key-press and key-release, we record the size readings and include them in the feature set. A 4-digit PIN contains 8 size-related features, and an 8-digit PIN contains 16.
Figure 3: Timing of tapping on the smartphone from three different users, shown in three vertical panels. Each user typed the number string "3244" 20 times. The solid dots represent key-press times, and the open dots key-release times. Different colors represent the timestamps of different digits.
Time: key-hold times and inter-key time intervals between two adjacent keys. They are measured from the touch event timestamps of both TouchDowns and TouchUps. Overall, a 4-digit PIN action contains 7 time-related features, while an 8-digit PIN contains 15.
For a 4-digit PIN, each action results in a total of 63 features; for an 8-digit PIN, the number of features for one action is 127. Table 2 summarizes the four feature sets described above.
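A minimal sketch of the timing features (the event layout, and defining the inter-key interval press-to-press, are our assumptions):

```python
def time_features(events):
    """events: one (press_ms, release_ms) pair per digit. Returns the n
    key-hold times and, as one common choice, the n-1 press-to-press
    inter-key intervals."""
    key_hold = [release - press for press, release in events]
    inter_key = [events[i + 1][0] - events[i][0] for i in range(len(events) - 1)]
    return key_hold, inter_key

hold, inter = time_features([(0, 90), (400, 480), (820, 930), (1250, 1340)])
print(len(hold) + len(inter))  # 7 time features for a 4-digit PIN
```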
4.2 Touchscreen Tapping Characterization
Our underlying assumption is that a user's feature distribution should be clustered within a reliably small range compared with many others. As a result, those metrics can be exploited to block the majority of impostors, as illustrated in Figure 2.
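Figure 2's dashed boundary can be illustrated with a simple per-feature interval model; the actual system uses a one-class classifier, so this box model and its made-up feature values are only a sketch:

```python
def learn_box(samples):
    """Per-feature [min, max] interval over the target user's samples."""
    lo = [min(col) for col in zip(*samples)]
    hi = [max(col) for col in zip(*samples)]
    return lo, hi

def inside(box, x):
    """Accept x only if every feature falls within the learned interval."""
    lo, hi = box
    return all(l <= v <= h for l, v, h in zip(lo, x, hi))

owner = [(0.30, 0.42), (0.34, 0.40), (0.31, 0.45)]  # tightly clustered
box = learn_box(owner)
print(inside(box, (0.32, 0.41)), inside(box, (0.70, 0.10)))  # True False
```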
4.2.1 Uniqueness of User Pattern
As described above, we define four sets of features to characterize a user's tapping behaviors on smartphones: acceleration (both linear and angular), pressure, size, and time. All of these features can be easily obtained from a smartphone's on-board sensors, and together they can accurately characterize a user's unique tapping behaviors. Based on the feature data, we observe that each user demonstrates consistent and unique tapping behaviors, which can be utilized to differentiate that user from others.
Figure 3 shows the timestamps of entering the same PIN 3-2-4-4 from three different users, including the moments of each key-press and key-release. Each individual's timing patterns clearly differ, but are very consistent within themselves. This is similar to observations on a regular computer keyboard [22].
In addition to timing information, motion data such as pressure, touched size, and acceleration also reveal user-specific patterns. Generally speaking, acceleration is proportional to the tapping force applied to the touchscreen, while angular acceleration represents the moment of the force.
