2. BACKGROUND
The tapping behaviors of individual users on touchscreens
vary from person to person due to differences in hand geometry
and finger agility. Each user has a unique personal
tapping pattern, reflected in the different rhythm, strength,
and angle preferences of the applied force. As our tapping-behavior-based
approach verifies the owner of a smartphone
based on "who you are" (your physical and behavioral traits)
instead of "what you know", it belongs to biometrics-based
user authentication. In general, a biometric authentication
system authenticates users either by their physiological
traits, such as faces and voices [5, 21], or by their behavioral
patterns, such as finger typing and hand movements [24, 34].
While physiological traits can achieve high accuracy in the
process of user authentication, they have not been widely
used in mobile devices. Recent studies have also shown that
the physiology-based mechanisms deployed in mobile devices
are sensitive to certain environmental factors, which can
significantly diminish their accuracy and reliability. For example,
face recognition may fail due to a different viewing
angle or poor illumination [28], and voice recognition degrades
with background noise [5]. In contrast, given the same
mobile device, behavioral biometrics tend to be less sensitive
to surrounding environmental factors such as darkness
or noise.
Exploiting the behavioral information captured by multiple
sensors on a smartphone, we can create a detailed user
profile dedicated to verifying the owner of that smartphone.
Since our approach works seamlessly with the existing
passcode-based user authentication mechanisms in mobile
devices, it serves as a form of implicit authentication. In
other words, our approach can act as a second-factor authentication
method and supplement passcode systems for
stronger authentication in a cost-effective and user-transparent
manner. More recently, seminal work has explored
the feasibility of user verification based on the behavior
exhibited while entering pattern-based passwords [12]. However, the
false reject rate (FRR) of that work is rather high, which
means there is a high chance that the owner of a mobile
device would be mistakenly regarded as an impostor and be
blocked from accessing the device.
3. RELATED WORKS
A) Keystroke Dynamics and Graphical Passwords. Keystroke
dynamics has been extensively studied for distinguishing users
by the way they type their personal identification number
(PIN) based passwords [19]. Research on the analysis
of keystroke dynamics for identifying users as they type on
a mobile phone can be found in [3, 9, 10, 18, 25, 33]. Clarke
et al. [10] considered the dynamics of typing 4-digit PIN
codes and achieved an average Equal Error Rate (EER)
of 8.5% on the physical keyboard of a Nokia
5110 handset. Zahid et al. [33] examined this approach on
touchscreen keyboards and achieved, in their best scenario, a
low EER of approximately 2%, with a training set
requiring a minimum of 250 keystrokes.
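The EER reported throughout this section is the operating point at which the false accept rate (FAR) and false reject rate (FRR) coincide. A minimal sketch of how it can be computed from genuine and impostor match scores (the score values and the threshold sweep below are illustrative assumptions, not taken from any cited study):

```python
# Hedged sketch: computing FAR, FRR, and the Equal Error Rate (EER)
# from genuine (owner) and impostor match scores.

def far_frr(genuine, impostor, threshold):
    """FRR: fraction of genuine scores rejected (score < threshold).
       FAR: fraction of impostor scores accepted (score >= threshold)."""
    frr = sum(s < threshold for s in genuine) / len(genuine)
    far = sum(s >= threshold for s in impostor) / len(impostor)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Sweep thresholds over all observed scores and return the point
    where FAR and FRR are closest; their mean approximates the EER."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

# Illustrative scores: higher means "more likely the owner".
genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
impostor = [0.2, 0.4, 0.35, 0.5, 0.1, 0.65]
print(round(equal_error_rate(genuine, impostor), 3))  # 0.167
```

A lower EER means the threshold can be set so that both kinds of error are simultaneously small, which is why it is the standard single-number summary in the studies cited here.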
Researchers have also suggested the use of graphical passwords
as an easier alternative to text-based passwords [7,
17], based on the idea that people recall
images better than text. A good overview of popular graphical
password schemes is given in [4]. Chang et
al. [7] proposed a graphics-based password keystroke
dynamics-based authentication (KDA) system
for touchscreen handheld mobile devices. Their experimental
results show an EER of 12.2% for the proposed graphics-based
password KDA system, which is reduced to 6.9%
when the pressure feature is added.
Different usability studies have outlined the advantages of
graphical passwords, such as their reasonable login and creation
times, acceptable error rates, good general perception,
and reduced interference compared to text passwords, but
also their vulnerabilities [31].
B) Inferring Tapped Information from On-board Motion Sensors.
Several independent research efforts have found that
data acquired by smartphone motion sensors alone
is sufficient to infer which part of the screen users tap
on [6, 23, 27, 32]. The first effort was made by Cai et al.
in 2011 [6]. They utilized features from device orientation
data on an HTC Evo 4G smartphone and correctly inferred
more than 70% of the keys typed on a number-only soft keyboard.
Soon after, Xu et al. exploited further sensor
capabilities on smartphones, including the accelerometer, gyroscope,
and orientation sensors [32]. Their evaluation shows
accuracies greater than 90% for inferring an 8-digit password
within 3 trials. Miluzzo et al. demonstrated another
key inference method on the soft keyboards of both smartphones
and tablets [23], achieving 90% or higher accuracy in
identifying English letters on smartphones and 80% on tablets.
Owusu et al. [27] infer taps on keys and areas arranged in
a 60-region grid, based solely on accelerometer readings on
smartphones. Their results show that 6-character passwords
can be extracted in as few as 4.5 trials.
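The attacks above can all be abstracted as the same classification problem: given motion-sensor features recorded around a tap, predict which screen region was hit. A minimal sketch, assuming made-up accelerometer features and a simple 1-nearest-neighbor classifier (the cited works use their own, more elaborate feature sets and learners):

```python
# Hedged sketch of the attack style in [6, 23, 27, 32]: classify which
# screen region a tap hit using only motion-sensor features. The feature
# values and the 1-NN classifier are illustrative assumptions, not the
# published methods.
import math

# Each training sample: (feature vector extracted from accelerometer
# readings around a tap, screen-region label). Features here: peak
# acceleration magnitude on the x and y axes.
train = [
    ((0.12, 0.80), "top-left"),
    ((0.15, 0.75), "top-left"),
    ((0.85, 0.78), "top-right"),
    ((0.80, 0.82), "top-right"),
    ((0.14, 0.20), "bottom-left"),
    ((0.82, 0.18), "bottom-right"),
]

def infer_region(features):
    """Return the label of the closest training sample (1-NN)."""
    return min(train, key=lambda s: math.dist(features, s[0]))[1]

print(infer_region((0.83, 0.80)))  # top-right
```

The point these attacks make, and the one this sketch illustrates, is that no touchscreen permission is needed: the motion sensors leak enough information to recover taps.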
C) User Authentication by Their Behavior on Touch Screens.
Research has explored different biometric approaches
for providing an extra level of security when authenticating
users on their mobile devices. Guerra-Casanova et
al. [15] proposed a biometric technique based on the idea
of authenticating a person on a mobile device by gesture
recognition, and achieved an Equal Error Rate (EER) between
2.01% and 4.82% on a base of 100 users. Unobtrusive methods
for authentication on mobile smartphones have emerged as
an alternative to typed passwords, such as gait biometrics
(achieving an EER of 20.1%) [13, 26], or the unique movement
users perform when answering or placing a phone call
(with an EER between 4.5% and 9.5%) [11].
Very recently, De Luca et al. [12] introduced an implicit
authentication approach that enhances password patterns
on Android phones with an additional security layer that
is transparent to the user. Their application recorded all data
available from the touchscreen: pressure (how hard the finger
presses), size (area of the finger touching the screen),
x and y coordinates, and time. Their evaluation, based on 26
participants, yields an average accuracy of 77%.
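The raw touch data recorded in [12] is typically summarized per stroke into a fixed-length feature vector before classification. A hedged sketch, where the chosen summary statistics (mean and peak pressure, mean size, duration, path length) are illustrative assumptions rather than the features actually used in [12]:

```python
# Hedged sketch: turning raw per-event touch data of the kind recorded
# in [12] -- (pressure, size, x, y, timestamp) -- into one fixed-length
# feature vector per stroke. The statistics chosen are illustrative.

def stroke_features(events):
    """events: list of (pressure, size, x, y, t) tuples for one stroke."""
    pressures = [e[0] for e in events]
    sizes = [e[1] for e in events]
    duration = events[-1][4] - events[0][4]
    # Path length: sum of distances between successive (x, y) points.
    path = sum(((events[i][2] - events[i - 1][2]) ** 2 +
                (events[i][3] - events[i - 1][3]) ** 2) ** 0.5
               for i in range(1, len(events)))
    return [sum(pressures) / len(pressures),  # mean pressure
            max(pressures),                   # peak pressure
            sum(sizes) / len(sizes),          # mean touch size
            duration,                         # stroke duration
            path]                             # path length

# Three events of one illustrative stroke: (pressure, size, x, y, t_ms)
events = [(0.40, 0.10, 0, 0, 0), (0.60, 0.12, 3, 4, 40), (0.50, 0.11, 6, 8, 90)]
print([round(v, 4) for v in stroke_features(events)])  # [0.5, 0.6, 0.11, 90, 10.0]
```

Vectors like this, collected over many entries of the same pattern, are what a classifier compares against the enrolled owner profile.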
The latest work, by Sae-Bae et al. [29], makes use
of the multi-touch screen of an iPad (not a phone) to capture
palm movement. They achieved a classification accuracy
of over 90%. However, palm movements are not suitable for
smartphone screens, which are typically too small
for them. Citty et al. [8] presented an alternative
approach to inputting PINs on small touchscreen devices. It
uses a sequence of 4 partitions of a selection of 16 images,
instead of a 4-digit PIN, to increase the possible combinations
of authentication sequences. However, inputting the
sequence needs extra effort in memorizing the images se-