Continuous Authentication Using Eye Movement Response of Implicit Visual Stimuli
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Detail(s)
Original language | English |
---|---|
Article number | 177 |
Pages (from-to) | 1-22 |
Number of pages | 22 |
Journal / Publication | Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies |
Volume | 1 |
Issue number | 4 |
Publication status | Published - Dec 2017 |
Externally published | Yes |
Link(s)
DOI | DOI |
---|---|
Permanent Link | https://scholars.cityu.edu.hk/en/publications/publication(6ed0e902-dd6e-4c72-b7b4-b12fc5c3a386).html |
Abstract
Smart head-worn or head-mounted devices, including smart glasses and Virtual Reality (VR) headsets, are gaining popularity. Online shopping and in-app purchases from such headsets present new e-commerce opportunities to app developers. For convenience, users of these headsets may store account login, bank account and credit card details in order to perform quick in-app purchases. If the device is left unattended, an attacker, who may be an insider, can use the stored account and banking details to make in-app purchases at the expense of the legitimate owner. To better protect legitimate users of VR headsets (or head-mounted displays in general) from such threats, in this paper we propose to use eye movement to continuously authenticate the current wearer of the VR headset. We built a prototype device that allows us to apply visual stimuli to the wearer and to record video of the wearer's eye movements at the same time. We use implicit visual stimuli (the contents of existing apps) that evoke eye movements from the headset wearer without distracting them from their normal activities, so that we can continuously authenticate the wearer without their being aware of the authentication running in the background. We evaluated our proposed system experimentally with 30 subjects. Our results showed that the achievable authentication accuracy for implicit visual stimuli is comparable to that of explicit visual stimuli. We also tested the time stability of our proposed method by collecting eye movement data on two different days two weeks apart. Our authentication method achieved an Equal Error Rate of 6.9% (resp. 9.7%) when data collected from the same day (resp. two weeks apart) were used for testing. In addition, we considered active impersonation attacks, where attackers try to imitate legitimate users' eye movements. We found that for a simple (resp. complex) eye-tracking scene, a successful attack could be realised after on average 5.67 (resp. 13.50) attempts, and our proposed authentication algorithm gave a false acceptance rate of 14.17% (resp. 3.61%). These results show that active impersonation attacks can be prevented by using complex scenes and an appropriate limit on the number of authentication attempts. Lastly, we carried out a survey to study user acceptance of our proposed implicit stimuli. We found that on a 5-point Likert scale, at least 60% of the respondents either agreed or strongly agreed that our proposed implicit stimuli were non-intrusive.
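The abstract reports an Equal Error Rate (EER), the operating point at which the False Acceptance Rate (impostors accepted) equals the False Rejection Rate (genuine users rejected). The paper does not give its scoring code; the following is only a minimal illustrative sketch of how an EER is typically computed from two score distributions, with synthetic scores standing in for real eye-movement match scores:

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep a decision threshold over the pooled similarity scores and
    return the EER: the error rate at the threshold where the False
    Acceptance Rate (FAR) is closest to the False Rejection Rate (FRR).
    Higher scores are assumed to mean a better match."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = float("inf"), 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)  # impostors wrongly accepted
        frr = np.mean(genuine_scores < t)    # genuine users wrongly rejected
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, eer = gap, (far + frr) / 2
    return eer

# Synthetic example (these distributions are illustrative, not the paper's data):
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 500)   # hypothetical genuine-wearer match scores
impostor = rng.normal(0.5, 0.1, 500)  # hypothetical impostor match scores
print(f"EER = {equal_error_rate(genuine, impostor):.3f}")
```

A lower EER indicates better separation between genuine and impostor score distributions; the paper's reported 6.9% same-day EER corresponds to a value of 0.069 on this scale.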
Research Area(s)
- Eye movement, biometrics, continuous authentication, account takeover, insider threat
Citation Format(s)
Continuous Authentication Using Eye Movement Response of Implicit Visual Stimuli. / ZHANG, Yongtuo; HU, Wen; XU, Weitao et al.
In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 1, No. 4, 177, 12.2017, p. 1-22.