
Issue 1 | 2021

Facial Emotion Recognition

Facial Emotion Recognition (FER) is the technology that analyses facial expressions from both static images and videos in order to reveal information on one's emotional state. The complexity of facial expressions, the potential use of the technology in any context, and the involvement of new technologies such as artificial intelligence raise significant privacy risks.

I. What is Facial Emotion Recognition?

Facial Emotion Recognition is a technology used for analysing sentiments from different sources, such as pictures and videos. It belongs to the family of technologies often referred to as 'affective computing', a multidisciplinary field of research on computers' capabilities to recognise and interpret human emotions and affective states, and it often builds on Artificial Intelligence technologies.

Facial expressions are forms of non-verbal communication, providing hints about human emotions. For decades, decoding such emotion expressions has been a research interest in the field of psychology (Ekman and Friesen 2003; Lang et al. 1993) but also in the field of Human Computer Interaction (Cowie et al. 2001; Abdat et al. 2011). Recently, the wide diffusion of cameras and the technological advances in image analysis, machine learning and pattern recognition have played a prominent role in the development of FER technology.

FER analysis comprises three steps: a) face detection, b) facial expression detection, c) classification of the expression to an emotional state (Figure 1). Emotion detection is based on the analysis of facial landmark positions (e.g. end of nose, eyebrows). Furthermore, in videos, changes in those positions are also analysed, in order to identify contractions in a group of facial muscles (Ko 2018). Depending on the algorithm, facial expressions can be classified to basic emotions (e.g. anger, disgust, fear, happiness, sadness and surprise) or compound emotions (e.g. happily sad, happily surprised, happily disgusted, sadly fearful, sadly angry, sadly surprised) (Du et al. 2014). In other cases, facial expressions could be linked to a physiological or mental state of mind (e.g. tiredness or boredom).

The sources of the images or videos serving as input to FER algorithms vary from surveillance cameras to cameras placed close to advertising screens in stores, as well as social media, streaming services or one's own personal devices.
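The three steps described above can be sketched in code. The following toy example is purely illustrative and not taken from any system mentioned in this report: the face "detector" returns fixed, made-up landmark coordinates, the geometric features are deliberately simplistic, and the emotion prototypes are invented; real FER systems use trained detectors and classifiers.

```python
# Illustrative sketch of the three FER steps: (a) face detection,
# (b) landmark/expression detection, (c) classification to an emotion.
# All coordinates and prototype values are hypothetical.
from dataclasses import dataclass
from math import dist

@dataclass
class Face:
    box: tuple        # (x, y, w, h) bounding box from step (a)
    landmarks: dict   # named landmark coordinates from step (b)

def detect_face(image) -> Face:
    """Stand-in for a real face/landmark detector (returns fixed values)."""
    return Face(box=(40, 30, 120, 120),
                landmarks={"mouth_left": (70, 120), "mouth_right": (130, 122),
                           "mouth_top": (100, 112), "mouth_bottom": (100, 126),
                           "brow_inner_l": (85, 60), "brow_inner_r": (115, 60),
                           "eye_l": (85, 75), "eye_r": (115, 75)})

def features(face: Face) -> tuple:
    """Simple geometric features derived from landmark positions."""
    lm = face.landmarks
    mouth_width = dist(lm["mouth_left"], lm["mouth_right"])
    mouth_open = dist(lm["mouth_top"], lm["mouth_bottom"])
    brow_to_eye = (dist(lm["brow_inner_l"], lm["eye_l"]) +
                   dist(lm["brow_inner_r"], lm["eye_r"])) / 2
    return (mouth_width, mouth_open, brow_to_eye)

# Step (c): classify to the nearest emotion prototype (made-up values).
PROTOTYPES = {"happiness": (70, 10, 15), "surprise": (50, 25, 20),
              "anger": (45, 5, 10), "neutral": (50, 8, 15)}

def classify(face: Face) -> str:
    f = features(face)
    return min(PROTOTYPES, key=lambda e: dist(f, PROTOTYPES[e]))

emotion = classify(detect_face(image=None))
print(emotion)  # prints "happiness" for the fixed landmarks above
```

A production system would replace the stand-in detector and nearest-prototype classifier with trained models (e.g. convolutional networks), but the division of labour between the three steps is the same.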
FER can also be combined with biometric identification. Its accuracy can be improved with technology analysing different types of sources, such as voice, text, health data from sensors or blood flow patterns inferred from the image.

Many companies, ranging from tech giants such as NEC or Google to smaller ones such as Affectiva or Eyeris, invest in the technology, which shows its growing importance. There are also several initiatives under the EU research and innovation programme Horizon 2020 exploring the use of the technology.¹

¹ The EU-funded Horizon 2020 project SEWA uses FER to improve automated understanding of human interactive behaviour in naturalistic contexts; iBorderCtrl has devised a system for automated border security, which includes FER technology; the PReDicT project utilises FER in the medical domain, to improve the outcome of antidepressant treatments.
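Combining facial analysis with other sources, as described above, is often implemented as "late fusion": each modality produces its own emotion scores, which are then merged into a single estimate. The sketch below is a minimal illustration under assumed inputs — the modality names, scores and reliability weights are invented for the example, not taken from this report.

```python
# Hedged sketch of late fusion of per-modality emotion scores.
# Scores and weights below are hypothetical.
EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def fuse(scores_per_modality: dict, weights: dict) -> dict:
    """Weighted average of per-modality emotion probability scores."""
    total_w = sum(weights[m] for m in scores_per_modality)
    return {e: sum(weights[m] * scores_per_modality[m][e]
                   for m in scores_per_modality) / total_w
            for e in EMOTIONS}

scores = {
    "face":  {"anger": 0.10, "happiness": 0.60, "sadness": 0.10, "surprise": 0.20},
    "voice": {"anger": 0.20, "happiness": 0.50, "sadness": 0.20, "surprise": 0.10},
    "text":  {"anger": 0.05, "happiness": 0.70, "sadness": 0.15, "surprise": 0.10},
}
weights = {"face": 0.5, "voice": 0.3, "text": 0.2}  # assumed reliabilities

fused = fuse(scores, weights)
print(max(fused, key=fused.get))  # prints "happiness" for these inputs
```

The design choice here — merging final scores rather than raw signals — keeps each modality's processing independent, which is one common way such combinations are built; other systems fuse features earlier in the pipeline.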

EDPS TechDispatch on Facial Emotion Recognition 1

Figure 1: Steps of Facial Emotion Recognition

Potential uses of FER cover a wide range of applications, examples of which are listed below, grouped by their application field.

Provision of personalised services

• analyse emotions to display personalised messages in smart environments
• provide personalised recommendations, e.g. on music selection or cultural material
• analyse facial expressions to predict individual reactions to movies

Customer behaviour analysis and advertising

• analyse customers' emotions while shopping, focused on either goods or their arrangement within the shop
• advertising signage at a railway station using a system of facial recognition and facial tracking for advertising purposes

Healthcare

• detect autism or neurodegenerative diseases
• predict psychotic disorders or identify users in need of assistance
• suicide prevention
• detect depression in elderly people
• observe patients' conditions during treatment

Employment

• help decision-making of recruiters
• identify uninterested candidates in a job interview
• monitor moods and attention of employees

Education

• monitor students' attention
• detect emotional reactions of users to an educative programme and adapt the learning path
• design affective tutoring systems
• detect engagement in online learning

Public safety

• lie detectors and smart border control
• predictive screening of public spaces to identify emotions triggering a potential terrorism threat
• analysing footage from crime scenes to indicate potential motives in a crime

Crime detection

• detect and reduce fraudulent insurance claims
• deploy fraud prevention strategies
• spot shoplifters

Other

• driver fatigue detection
• detection of political attitudes

II. What are the data protection issues?

Due to its use of biometric data and Artificial Intelligence technologies, FER shares some of the risks of facial recognition and artificial intelligence. Nevertheless, this technology also carries its own specific risks. Being a biometrics technology where identification does not appear as a primary goal, risks related to emotion interpretation, or to accuracy and its application, are prominent.

II.1. Necessity and proportionality

Turning human expressions into a data source to infer emotions clearly touches a part of people's most private data. Being a disruptive technology, FER raises important issues regarding necessity and proportionality. It has to be carefully assessed whether deploying FER is indeed necessary for achieving the pursued objectives or whether there is a less intrusive alternative. There is a risk of applying FER

without performing a necessity and proportionality evaluation for each single case, misled by the decision to use the technology in a different context. However, proportionality depends on many factors, such as the type of collected data, the type of inferences, the data retention period, or potential further processing.

II.2. Data accuracy

Analysis of emotions based on facial expressions may not be accurate, as facial expressions can vary slightly among individuals, may mix different emotional states experienced at the same time (e.g. fear and anger, happy and sad) or may not express an emotion at all. On the other hand, there are emotions that may not be expressed on someone's face, thus inference based solely on facial expression may lead to wrong impressions. Additional factors can add to the ambiguity of facial expressions, such as contextual cues (sarcasm) and the socio-cultural context. In addition, technical aspects (different angles of the camera, lighting conditions and masking of several parts of the face) can affect the quality of a captured facial expression.

Furthermore, even in the case of accurate recognition of emotions, the use of the results may lead to wrong inferences about a person, as FER does not explain the trigger of the emotions, which may be a thought of a recent or past event. However, the results of FER, regardless of accuracy limitations, are usually treated as facts and are input to processes affecting a data subject's life, instead of triggering an evaluation to discover more about their situation in the specific context.

II.3. Fairness

The accuracy of the facial emotion algorithm results can play an important role in discrimination on grounds of skin colour or ethnic origin. Societal norms and cultural differences have been found to influence the level of expression of some emotions, while some algorithms have been found to be biased against several groups, based on skin colour. For instance, a study testing facial emotion recognition algorithms revealed that they assigned more negative emotions (anger) to faces of persons of African descent than to other faces. Furthermore, whenever there was ambiguity, the former were scored as angrier (Rhue, 2018).

Choosing a training dataset that is representative is crucial for avoiding discrimination. If the training data is not diverse enough, the technology might be biased against underrepresented populations. Discrimination triggered by a faulty database or by errors in detecting the correct emotional state may have serious effects, e.g. the inability to use certain services.

In another aspect of the same problem, in case of medical conditions or physical impairments in which temporary or permanent paralysis of facial muscles occurs, data subjects' emotions may be misunderstood by algorithms. This may result in a wide range of situations of misclassification, with impact ranging from receiving unwished services up to misdiagnosis of having a psychological disorder.

II.4. Transparency and control

Facial images and video can be captured anywhere, thanks to the ubiquity and small size of cameras. Surveillance cameras in public spaces or stores are not the only cameras remotely capturing facial images, as one's own mobile devices can capture expressions during their use. In these situations, transparency issues arise concerning both the collection and the further processing of personal data.

Where the data subjects' facial expressions are captured in a remote manner, it may not be clear to them which system or application will process their data, for which purposes, and who the controllers are. As a result, they would not be in the position to freely give consent or exercise control over the processing of their personal data, including its sharing with third parties. Where data subjects are not provided with accurate information, access and control over the use of FER, they are deprived of their freedom to select which aspects of their life can be used to affect other contexts (e.g. emotions in social interactions could be used in the context of recruitment). Moreover, data subjects need to control for which periods of time their captured data will be processed and aggregated into history records of their emotional situation, as emotion inferences may not be valid for them after a period of time.

Another consequence of the remote capture of facial expressions and the obscurity of their processing is that data subjects might not be provided with information on which other sources of data these will be aggregated to. Also, advanced AI algorithms add to the complexity of transparency

EDPS TechDispatch on Facial Emotion Recognition 3 needs, as they may detect slight movements of facial vulnerable emotional state, can be used to mentally muscle that are unconscious even for the individu- force people to perform actions they would not do als. This would contribute to the unpleasant otherwise – e.g. to buy goods they do not need. of vulnerability due to unwanted exposure. FER technology could be used for purposes of safeguarding public security, for instance at con- II.5. Processing of special categories of certs, sport events or airports, to quickly identify personal data signs of aggression and stress and identify potential terrorists. However, if such an identification was FER technology can detect the existence, changes based solely on FER and was not combined with or total lack of facial expressions, and link thisto other actions or triggers that this person is danger- an emotional state. As a result, in some contexts, ous, this could introduce further risks for the data algorithms may infer special categories of per- subjects. For instance, a person could be subject sonal data, such as political opinions or health data. to unjustified delays to perform further secur- For instance, applying FER technology at political ity checks or investigations, causing them to miss events, political attitudes can be inferred by looking participation in an event, boarding on a flight or at facial expressions and reactions of the audience. even lead to unjustified arrest. Also, by the lack of facial expressions, algorithms Last but not least, FER can influence behavi- are able to detect signs of alexithymia, a state in oural changes in case a person is aware of the ex- which one cannot understand the they ex- posure to this technology (known as Reactivity in perience or lack the words to describe these feelings. ). 
Individuals may alter their habits or This finding can be linked to severe psychiatric and avoid specific areas where the technology is applied neurological disorders, such as psychosis. Further- in an attempt to self-sensor and protect themselves. more, analysis of historical data on one’s emotional One can imagine the chilling effect this could have state may reveal other health conditions such as de- to a society and the feeling of insecurity among cit- pression. Such data, if used in the context of health- izens, if such a technology were to be used by non- care, could assist in prediction and timely treatment democratic governments, to infer political attitude of a patient. However, where data subjects are not of citizens. able to control the flow of derived information and its use in other contexts, they may face a situation of III. Recommended Reading inference and use of such sensitive personal data by non-authorised entities, such as employers or Abdat, F. et al. (2011). Human-Computer Interaction insurance companies. Using Emotion Recognition from Facial Expression. In: 2011 UKSim 5th European Symposium on Com- II.6. Profiling and automated puter Modeling and Simulation. IEEE. doi: 10 . decision-making 1109/ems.2011.20. Andalibi, Nazanin and Justin Buss (2020). The Hu- FER technology can be further used to create profiles man in Emotion Recognition on Social Media: At- of people in a number of situations. It could be used titudes, Outcomes, Risks. In: Proceedings of the to derive one’s of a product, an advertise- 2020 CHI Conference on Human Factors in Comput- ment or a proposed idea. It can also be used for clas- ing Systems. CHI ’20. Honolulu, HI, USA: Associ- sifying productivity and fatigue-resistance in work- ation for Computing Machinery, pp. 1–16. isbn: places. The risk lies in the fact that the data sub- 9781450367080. doi: 10.1145/3313831.3376680. ject may not be aware of this type of targeting and Barrett, Lisa Feldman et al. (2019). 
Emotional Expres- might feel uncomfortable if they found out about sions Reconsidered: Challenges to Inferring Emo- it. Further implications can occur by erroneous tion From Human Facial Movements. In: Psy- profiling or inferences solely based on the associ- chological Science in the Public Interest 20.1. ation with a certain group of people experiencing PMID: 31313636, pp. 1–68. doi: 10 . 1177 / the same emotions. 1529100619832930. In addition, the knowledge of the individuals’ emotions can make it easier to manipulate them. For instance, the knowledge of emotions revealing a

Cowie, R. et al. (2001). Emotion recognition in human-computer interaction. In: IEEE Signal Processing Magazine 18.1, pp. 32–80. doi: 10.1109/79.911197.

Crawford, K. et al. (2019). AI Now 2019 Report. Tech. rep. New York: AI Now Institute.

Daily, Shaundra B. et al. (2017). Affective Computing: Historical Foundations, Current Applications, and Future Trends. In: Emotions and Affect in Human Factors and Human-Computer Interaction. Elsevier, pp. 213–231. doi: 10.1016/b978-0-12-801851-4.00009-4.

Du, Shichuan et al. (2014). Compound facial expressions of emotion. In: Proceedings of the National Academy of Sciences 111.15, E1454–E1462. issn: 0027-8424. doi: 10.1073/pnas.1322355111. eprint: https://www.pnas.org/content/111/15/E1454.full.pdf.

Ekman, Paul and Wallace V Friesen (2003). Unmasking the face: A guide to recognizing emotions from facial clues. Ishk.

Jacintha, V et al. (2019). A Review on Facial Emotion Recognition Techniques. In: 2019 International Conference on Communication and Signal Processing (ICCSP). IEEE, pp. 0517–0521. doi: 10.1109/ICCSP.2019.8698067.

Ko, Byoung Chul (2018). A brief review of facial emotion recognition based on visual information. In: Sensors 18.2, p. 401. doi: 10.3390/s18020401.

Lang, Peter J. et al. (1993). Looking at pictures: Affective, facial, visceral, and behavioral reactions. In: Psychophysiology 30.3, pp. 261–273. doi: 10.1111/j.1469-8986.1993.tb03352.x.

Rhue, Lauren (2018). Racial Influence on Automated Perceptions of Emotions. In: SSRN Electronic Journal. doi: 10.2139/ssrn.3281765.

Russell, James A. (1995). Facial expressions of emotion: What lies beyond minimal universality? In: Psychological Bulletin 118.3, pp. 379–391. doi: 10.1037/0033-2909.118.3.379.

Sedenberg, Elaine and John Chuang (2017). Smile for the Camera: Privacy and Policy Implications of Emotion AI. arXiv: 1709.00396 [cs.CY].

This publication is a brief report produced by the Technology and Privacy Unit of the European Data Protection Supervisor (EDPS). It aims to provide a factual description of an emerging technology and discuss its possible impacts on privacy and the protection of personal data. The contents of this publication do not imply a policy position of the EDPS.

Issue Authors: Konstantina VEMOU, Anna HORVATH
Editor: Thomas ZERDICK
Contact: [email protected]

To subscribe or unsubscribe to the EDPS TechDispatch publications, please send a mail to [email protected]. The data protection notice is online on the EDPS website.

© European Union, 2021. Except otherwise noted, the reuse of this document is authorised under a Creative Commons Attribution 4.0 International License (CC BY 4.0). This means that reuse is allowed provided appropriate credit is given and any changes made are indicated. For any use or reproduction of photos or other material that is not owned by the European Union, permission must be sought directly from the copyright holders.

ISSN 2599-932X
HTML: ISBN 978-92-9242-473-2, QT-AD-21-001-EN-Q, doi: 10.2804/519064
PDF: ISBN 978-92-9242-472-5, QT-AD-21-001-EN-N, doi: 10.2804/014217
