Facial Emotion Recognition
Issue 1 | 2021

Facial Emotion Recognition (FER) is the technology that analyses facial expressions from both static images and videos in order to reveal information on one's emotional state. The complexity of facial expressions, the potential use of the technology in any context, and the involvement of new technologies such as artificial intelligence raise significant privacy risks.

I. What is Facial Emotion Recognition?

Facial Emotion Recognition is a technology used for analysing sentiments from different sources, such as pictures and videos. It belongs to the family of technologies often referred to as 'affective computing', a multidisciplinary field of research on computers' capabilities to recognise and interpret human emotions and affective states, and it often builds on Artificial Intelligence technologies.

Facial expressions are forms of non-verbal communication, providing hints about human emotions. For decades, decoding such emotion expressions has been a research interest in the field of psychology (Ekman and Friesen 2003; Lang et al. 1993), but also in the field of Human Computer Interaction (Cowie et al. 2001; Abdat et al. 2011). Recently, the wide diffusion of cameras and the technological advances in biometrics analysis, machine learning and pattern recognition have played a prominent role in the development of FER technology.

Many companies, ranging from tech giants such as NEC or Google to smaller ones such as Affectiva or Eyeris, invest in the technology, which shows its growing importance. There are also several EU research and innovation programme Horizon 2020 initiatives exploring the use of the technology.¹

FER analysis comprises three steps: a) face detection, b) facial expression detection, c) expression classification to an emotional state (Figure 1). Emotion detection is based on the analysis of facial landmark positions (e.g. end of nose, eyebrows). Furthermore, in videos, changes in those positions are also analysed, in order to identify contractions in a group of facial muscles (Ko 2018). Depending on the algorithm, facial expressions can be classified into basic emotions (e.g. anger, disgust, fear, joy, sadness and surprise) or compound emotions (e.g. happily sad, happily surprised, happily disgusted, sadly fearful, sadly angry, sadly surprised) (Du et al. 2014). In other cases, facial expressions could be linked to a physiological or mental state of mind (e.g. tiredness or boredom).

Figure 1: Steps of Facial Emotion Recognition

The sources of the images or videos serving as input to FER algorithms vary from surveillance cameras and cameras placed close to advertising screens in stores to social media, streaming services and one's own personal devices.

FER can also be combined with biometric identification. Its accuracy can be improved with technology analysing different types of sources, such as voice, text, health data from sensors or blood flow patterns inferred from the image.

¹ The EU-funded Horizon 2020 project SEWA uses FER to improve automated understanding of human interactive behaviour in naturalistic contexts; iBorderCtrl has devised a system for automated border security, which includes FER technology; the PReDicT project utilises FER in the medical domain, to improve the outcome of antidepressant treatments.
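To make the three-step analysis described above more concrete, the following minimal sketch chains face detection, face-region extraction and emotion classification. It is illustrative only: it uses the Haar cascade bundled with the open-source OpenCV library for step (a), while steps (b) and (c) are represented by a placeholder stub, since real FER systems rely on trained, often proprietary models; the image file name is likewise a hypothetical example.

```python
import cv2
import numpy as np

BASIC_EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

# Step (a): face detection, using the Haar cascade shipped with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_patch: np.ndarray) -> dict:
    """Placeholder for steps (b)-(c): a trained model would map the face
    region (or facial-landmark positions) to scores over emotional states."""
    scores = np.ones(len(BASIC_EMOTIONS)) / len(BASIC_EMOTIONS)  # dummy uniform scores
    return dict(zip(BASIC_EMOTIONS, scores.tolist()))

def analyse_frame(frame: np.ndarray) -> list:
    """Run the full pipeline on a single image or video frame."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5):
        face = cv2.resize(grey[y:y + h, x:x + w], (48, 48))  # normalise the detected face region
        results.append({"box": (int(x), int(y), int(w), int(h)),
                        "emotions": classify_emotion(face)})
    return results

if __name__ == "__main__":
    image = cv2.imread("example.jpg")  # hypothetical test image containing a face
    if image is None:
        raise SystemExit("example.jpg not found")
    for detection in analyse_frame(image):
        print(detection["box"], detection["emotions"])
```

In a video setting, analyse_frame would be applied to successive frames, so that changes in the detected positions over time can also be taken into account, as described above for the analysis of facial muscle contractions.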
Potential uses of FER cover a wide range of applications, examples of which are listed below, grouped by application field.

Provision of personalised services
• analyse emotions to display personalised messages in smart environments
• provide personalised recommendations, e.g. on music selection or cultural material
• analyse facial expressions to predict individual reactions to movies

Customer behaviour analysis and advertising
• analyse customers' emotions while shopping, focused on either the goods or their arrangement within the shop
• advertising signage at a railway station using a system of recognition and facial tracking for marketing purposes

Healthcare
• detect autism or neurodegenerative diseases
• predict psychotic disorders or depression to identify users in need of assistance
• suicide prevention
• detect depression in elderly people
• observe patients' conditions during treatment

Employment
• help decision-making of recruiters
• identify uninterested candidates in a job interview
• monitor moods and attention of employees

Education
• monitor students' attention
• detect emotional reactions of users to an educative program and adapt the learning path
• design affective tutoring systems
• detect engagement in online learning

Public safety
• lie detectors and smart border control
• predictive screening of public spaces to identify emotions triggering potential terrorism threats
• analysing footage from crime scenes to indicate potential motives in a crime

Crime detection
• detect and reduce fraudulent insurance claims
• deploy fraud prevention strategies
• spot shoplifters

Other
• driver fatigue detection
• detection of political attitudes

II. What are the data protection issues?

Due to its use of biometric data and Artificial Intelligence technologies, FER shares some of the risks of facial recognition and artificial intelligence. Nevertheless, this technology also carries its own specific risks. Being a biometric technology in which identification does not appear as a primary goal, the most prominent risks relate to the accuracy of emotion interpretation and to how its results are applied.

II.1. Necessity and proportionality

Turning human expressions into a data source to infer emotions clearly touches a part of people's most private data. Being a disruptive technology, FER raises important issues regarding necessity and proportionality. It has to be carefully assessed whether deploying FER is indeed necessary for achieving the pursued objectives or whether there is a less intrusive alternative. There is a risk of applying FER without performing a necessity and proportionality evaluation for each single case, misled by the decision to use the technology in a different context. Moreover, proportionality depends on many factors, such as the type of collected data, the type of inferences, the data retention period, or potential further processing.

II.2. Data accuracy

Analysis of emotions based on facial expressions may not be accurate, as facial expressions can vary slightly among individuals, may mix different emotional states experienced at the same time (e.g. fear and anger, happy and sad) or may not express an emotion at all. On the other hand, there are emotions that may not be expressed on someone's face, so inference based solely on facial expressions may lead to wrong impressions. Additional factors can add to the ambiguity of facial expressions, such as contextual clues (e.g. sarcasm) and the socio-cultural context. In addition, technical aspects (different camera angles, lighting conditions and masking of parts of the face) can affect the quality of a captured facial expression.

Furthermore, even in the case of accurate recognition of emotions, the use of the results may lead to wrong inferences about a person, as FER does not explain the trigger of the emotions, which may be a thought of a recent or past event. However, the results of FER, regardless of these accuracy limitations, are usually treated as facts and fed into processes affecting a data subject's life, instead of triggering an evaluation to discover more about their situation in the specific context.
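The ambiguity described above can be made visible in a classifier's own output. The short sketch below is illustrative only: the emotion scores are invented and the margin value is an arbitrary assumption, not a recommendation; it merely shows how a system could report uncertainty instead of forcing a single emotion label when two states are almost equally likely.

```python
# Illustrative only: the scores are invented and the margin is an
# arbitrary assumption, not a recommended value.
SCORES = {"fear": 0.34, "surprise": 0.31, "anger": 0.20, "joy": 0.15}  # hypothetical FER output

def interpret(scores, margin=0.10):
    """Return a single label only when it clearly dominates; otherwise flag ambiguity."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (top_label, top_score), (runner_label, runner_score) = ranked[0], ranked[1]
    if top_score - runner_score < margin:
        return f"uncertain (top candidates: {top_label}, {runner_label})"
    return top_label

print(interpret(SCORES))  # -> uncertain (top candidates: fear, surprise)
```

Even with such a safeguard, a confident score says nothing about what triggered the emotion, which is the limitation stressed in the paragraph above.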
II.3. Fairness

The accuracy of the facial emotion algorithm's results can play an important role in discrimination. Choosing a dataset that is representative is crucial for avoiding discrimination: if the training data is not diverse enough, the technology might be biased against underrepresented populations. Discrimination triggered by a faulty database or by errors in detecting the correct emotional state may have serious effects, e.g. the inability to use certain services.

In another aspect of the same problem, in the case of medical conditions or physical impairments in which temporary or permanent paralysis of facial muscles occurs, data subjects' emotions may be misunderstood by the algorithms. This may result in a wide range of misclassification situations, with impacts ranging from receiving unwanted services up to a misdiagnosis of having a psychological disorder.

II.4. Transparency and control

Facial images and video can be captured anywhere, thanks to the ubiquity and small size of cameras. Surveillance cameras in public spaces or stores are not the only cameras remotely capturing facial images, as one's own mobile devices can capture expressions during their use. In these situations, transparency issues arise concerning both the collection and the further processing of personal data.

Where data subjects' facial expressions are captured in a remote manner, it may not be clear to them which system or application will process their data, for which purposes, and who the controllers are. As a result, they would not be in the position to freely give consent or to exercise control over the processing of their personal data, including its sharing with third parties. Where data subjects are not provided with accurate information, access and control over the use of FER, they are deprived of their freedom to select which aspects of their life can be used to affect other contexts (e.g. emotions in social interactions could