Predicting sex as a soft-biometrics from device interaction swipe gestures

Oscar Miguel-Hurtado, Sarah Stevenage, Chris Bevan, Richard Guest

Research output: Contribution to journal › Article › peer-review

Touch and multi-touch gestures are becoming the most common way to interact with technology such as smartphones, tablets and other mobile devices. The latest touch-screen input capabilities have greatly increased the quantity and quality of available gesture data, which has led to the exploration of its use in multiple disciplines, from psychology to biometrics. Following research studies undertaken in similar modalities such as keystroke and mouse-usage biometrics, the present work proposes the use of swipe gesture data for the prediction of soft biometrics, specifically the user's sex. This paper details the software and protocol used for the data collection, the feature set extracted, and the subsequent machine learning analysis. Within this analysis, the BestFirst feature selection technique and four classification algorithms (naïve Bayes, logistic regression, support vector machine and decision tree) were tested. The results of this exploratory analysis confirm the possibility of sex prediction from swipe gesture data, achieving an encouraging 78% accuracy rate using swipe gesture data from two different directions. These results will hopefully encourage further research in this area, where the prediction of soft-biometric traits from swipe gesture data can play an important role in enhancing authentication processes on touch-screen devices.
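The analysis pipeline described in the abstract (feature selection followed by a comparison of naïve Bayes, logistic regression, SVM and decision tree classifiers) can be sketched as follows. This is a hypothetical illustration, not the paper's code: the feature values are synthetic stand-ins for swipe-derived measurements, and scikit-learn's greedy SequentialFeatureSelector is used as an analogue for WEKA's BestFirst search used in the paper.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 200, 10

# Synthetic stand-ins for swipe-derived features (e.g. velocity, duration,
# pressure, trajectory length); NOT the paper's real feature set or data.
X = rng.normal(size=(n_samples, n_features))
# Binary label (sex) correlated with the first two features, for illustration.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_samples) > 0).astype(int)

classifiers = {
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(),
    "SVM": SVC(),
    "decision tree": DecisionTreeClassifier(random_state=0),
}

results = {}
for name, clf in classifiers.items():
    pipe = Pipeline([
        # Greedy forward selection as a stand-in for WEKA's BestFirst search.
        ("select", SequentialFeatureSelector(clf, n_features_to_select=4)),
        ("clf", clf),
    ])
    results[name] = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: {results[name]:.2f}")
```

Each classifier is cross-validated on the features its own wrapper selection retained, mirroring the exploratory comparison the abstract describes; on the paper's real swipe features this kind of pipeline is what yielded the reported 78% accuracy.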
Original language: English
Pages (from-to): 44-51
Journal: Pattern Recognition Letters
Early online date: 17 May 2016
Publication status: Published - 1 Aug 2016


