Robust User Identification Based on Facial Action Units Unaffected by Users' Emotions
Proceedings of the 51st Hawaii International Conference on System Sciences | 2018

Robust user identification based on facial action units unaffected by users' emotions

Ricardo Buettner, Aalen University, Germany, [email protected]

Abstract—We report on promising results concerning the identification of a user based solely on their facial action units. The Random Forests classifier, which analyzed facial action unit activity captured by an ordinary webcam, achieved very good values for accuracy (97.24 percent) and specificity (99.92 percent). In combination with a PIN request, specificity rose to over 99.999 percent. The proposed biometric method is unaffected by a user's emotions, easy to use, cost efficient, non-invasive, and contact-free, and can be used in human-machine interaction as well as in secure access control systems.

1. Introduction

Robust user identification is a precondition of modern human-computer interaction systems, in particular of autonomous robots, not only for reasons of security but also for convenience [1,2].

While biometric user identification by fingerprints or pulse response is not contactless, and photography-based approaches (iris, retinal, and face scans) can be tricked by previously captured images [2]–[4], we investigate the possibility of identifying a person by their specific facial expressions, measured as facial action unit activity. Since facial action units are not only user-specific [5] but also emotion-specific [6], and since emotions substantially affect a broad range of user behavior, e.g. mouse trajectories [7], the particular difficulty lies in developing a robust user identification method that is unaffected by a user's emotions. That is why in this paper we evaluate the possibility of reliably identifying a person based on their facial action units, unaffected by their emotions.

The research question is: "Can we robustly identify a user based on their facial action units, unaffected by their emotions?"

To identify the correct user we made use of a Random Forests decision tree classifier. The classifier uses neurophysiological data from a controlled laboratory experiment in which we continuously recorded facial action units while evoking specific emotions by displaying 14 normative emotional stimuli from the International Affective Picture System (IAPS) [8].

As a result we achieved a balanced prediction accuracy of 97.2 percent based on the facial expressions alone. When we combined the biometric capture with a PIN request, which is the rule in access validation systems [2,3], we reached a very good false positive rate of only 7.633284e-06 (over 99.999 percent specificity).

On the basis of our results we can offer some interesting theoretical insights, e.g. which facial action units are the most predictive for user identification, such as the lid tightener (orbicularis oculi, pars palpebralis) and the upper lip raiser (levator labii superioris). In addition, our work has practical implications, as the proposed contactless user identification mechanism can be applied as
- a comfortable way to continuously recognize people who are present in human-computer interaction settings, and
- an additional authentication mechanism in PIN entry systems.

The most important findings from these analyses are:
1) It is possible to identify a person based on their facial action units alone with an accuracy of 97.2 percent.
2) This user identification mechanism is unaffected by the user's emotions.
3) Important for secure access control, the specificity (true negative rate) is 99.92 percent.
4) In combination with a PIN request, specificity rises to over 99.999 percent.
5) The most important facial muscles for robust user identification are the orbicularis oculi (pars palpebralis), levator labii superioris, and orbicularis oris.

The paper is organized as follows: Next we present an overview of the research background on facial expressions and the facial action coding system, before describing the research methodology, including the experimental procedure, stimuli, sample characteristics, measurements, data preparation, and the Random Forests decision tree method. After that we present the machine learning results concerning the performance evaluation and the analysis of the most important facial action units and related facial muscles. We then discuss the results, including theoretical and practical implications, before concluding with limitations and suggestions for future research.

URI: http://hdl.handle.net/10125/49923, ISBN: 978-0-9981331-1-9 (CC BY-NC-ND 4.0)

2. Research background on facial expressions and the facial action coding system

The identification of a human by another human through the analysis of the other's face and its facial expressions is the product of a long evolution over several phylogenetic [9] and ontogenetic hominid evolution stages [10]. Facial expression analysis is a key concept of communication and social competence [9,11].

Specific combinations of facial action units represent specific user emotions [15]. For instance, while happiness is related to the combination of action units 6 and 12, contempt is related to action unit 14 [16].

Figure 3: Activation of action unit 2: Outer portion of the brows is raised. From [14].
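As a minimal sketch of this action-unit-to-emotion mapping, the following Python fragment encodes only the two combinations cited above; the dictionary and the helper function `emotion_from_aus` are illustrative names of ours, not part of FACS or of any FACS software:

```python
# Illustrative only: the two action-unit-to-emotion examples cited in the text.
# Action unit 6 = cheek raiser, 12 = lip corner puller, 14 = dimpler.
AU_COMBINATIONS = {
    frozenset({6, 12}): "happiness",
    frozenset({14}): "contempt",
}

def emotion_from_aus(active_aus):
    """Return the emotion label for a set of active action units, if known."""
    return AU_COMBINATIONS.get(frozenset(active_aus), "unknown")

print(emotion_from_aus({6, 12}))  # happiness
print(emotion_from_aus({14}))     # contempt
```

In practice, FACS-based tools score many more action units and combinations; this lookup merely makes the cited mapping concrete.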
Figure 4: Cheeks are raised (action unit 6). From [14].

Figure 1: Muscles of facial expression (anterior view). From [12, p. 580].

Any human being's facial expression can be broken down into smaller facial actions, so-called facial action units, e.g. the raising of the inner brow (action unit 1) or the outer brow (action unit 2). The facial action coding system (FACS) by Ekman et al. [6,13] describes all observable facial movements anatomically, based on the contraction of specific facial muscles (see figure 1), e.g. the zygomaticus major muscle for the lip corner puller (action unit 12).

3. Methodology

In order to clearly contribute to NeuroIS research and show strong methodological rigor, we followed the NeuroIS guidelines provided by vom Brocke and Liang [17]. To base the experimental design adequately on solid research in related fields of neuroscience, we reviewed the fundamental anatomical mechanism of the relationship between specific facial expressions, their related facial action units, and facial muscles [6,13]. The methodology uses camera-based facial action unit analysis, a well-established approach in physiology and psychology [5,14,15]. With this method, bio-data (i.e. facial action units related to specific facial muscles) can be used to better identify the correct user (cf. guideline 4 of [17]). In comparison to other neuroscience tools, camera-based facial action unit analysis is a contact-free and efficient method of choice. We further applied the guidelines and standards from the Noldus FaceReader 6 manual.

3.1. Experimental procedure

We chose a one-group design (within-subject, proven emotional stimuli from the International Affective Picture System (IAPS) [8] as treatment (see table 1), completely randomized, double-blind, cf. [18]).

[Residue of the experimental-procedure flow figure; recoverable steps: Welcome; Demographics (age, ...); 14 randomized IAPS pictures (including perceptive rating, e.g. SNAKE); the remainder is cut off.]

Figure 2: Activation of action unit 1: Inner portion of the brows is raised. From [14].
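To make the classification step concrete, the following sketch trains a Random Forests classifier on per-observation action-unit activation features to predict user identity. This is not the authors' implementation: the data are synthetic, the feature layout (one activation value per action unit) is our assumption, and scikit-learn serves merely as a convenient stand-in for whatever tooling was actually used.

```python
# Sketch of the identification approach: Random Forests over action-unit
# activations. Synthetic data; NOT the paper's code or dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_users, n_aus, n_obs_per_user = 5, 20, 200

# Each synthetic user gets a characteristic mean activation pattern
# across the action units, observed with per-frame noise.
user_profiles = rng.uniform(0.0, 1.0, size=(n_users, n_aus))
X = np.vstack([
    profile + rng.normal(0.0, 0.1, size=(n_obs_per_user, n_aus))
    for profile in user_profiles
])
y = np.repeat(np.arange(n_users), n_obs_per_user)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
score = balanced_accuracy_score(y_test, clf.predict(X_test))
print(f"balanced accuracy: {score:.3f}")
```

The fitted model's `feature_importances_` attribute is one way such an analysis can rank action units by predictive value, analogous to the paper's finding that the lid tightener and upper lip raiser are the most predictive units.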