The University of Sheffield

THE UNIVERSITY OF SHEFFIELD

CONTEXTUAL RECOGNITION OF ROBOT EMOTIONS

JIAMING ZHANG

March 2013

Submitted for the Degree of Doctor of Philosophy
Department of Computer Science
The University of Sheffield

Co-Supervisors: Dr. Amanda Sharkey and Prof. Noel Sharkey

Declaration

I declare that this report is composed by myself and that the work contained herein is my own except where explicitly stated otherwise in the text. This work has not been submitted for any other degree or professional qualification except as specified.

Jiaming Zhang

Dedication

Dedicated to my deceased family members, Grandmother Mrs. Ruifang Luo and Grandfather Mr. Shanzhong Peng, and my deceased high school classmate Mr. Zhoulong Zhang.

Abstract

In the field of human-robot interaction, socially interactive robots are often equipped with the ability to detect the affective states of users, the ability to express emotions through synthetic facial expressions, speech and textual content, and the ability to imitate and to learn socially. Past work on creating robots that can make convincing emotional expressions has concentrated on the quality of those expressions and on assessing people's ability to recognize them. Previous recognition studies presented the facial expressions of the robots in neutral contexts, without any strong emotional valence (e.g., emotionally valenced music or video). It is therefore worth empirically exploring whether observers' judgments of the facial cues of a robot are affected by a surrounding emotional context. This thesis takes its inspiration from the contextual effects found on the interpretation of expressions on human faces and computer avatars, and examines the extent to which they also apply to the interpretation of the facial expressions of a mechanical robot head.
The kinds of contexts that affect the recognition of robot emotional expressions, the circumstances under which such contextual effects occur, and the relationship between emotions and the surrounding situation were observed and analyzed in a series of 11 experiments. In these experiments, FACS (the Facial Action Coding System) (Ekman and Friesen, 2002) was applied to set the parameters of the servos so that the robot head produced sequences of facial expressions. Four different kinds of surrounding or preceding emotional context were used: recorded BBC News pieces, selected affective pictures, classical music pieces, and film clips. This thesis provides evidence that observers' judgments about the facial expressions of a robot can be affected by a surrounding emotional context. From a psychological perspective, the contextual effects found on the FACS-based robotic facial expressions indirectly support the claim that human emotions are both biologically based and socially constructed. From a robotics perspective, it is argued that the results obtained from the analyses will be useful for guiding researchers in enhancing the expressive skills of emotional robots in a surrounding emotional context. This thesis also analyzes the possible factors contributing to the contextual effects found in the original 11 experiments. Some future work is also proposed, including four new experiments: a preliminary experiment designed to identify appropriate contextual materials, and three further experiments in which factors likely to affect a context effect are controlled one by one.

Acknowledgements

For the PhD thesis presented here, first and foremost I would like to express my sincere gratitude to my co-supervisors, Dr. Amanda Sharkey and Prof. Noel Sharkey, and especially Dr. Amanda Sharkey, without whom I would never have received so much enthusiastic advice, support and guidance at every stage of the PhD.
I am also grateful to all the participants in my experiments, without whom sufficient data could not have been collected and this thesis would not be complete. I am grateful to the Department of Computer Science at the University of Sheffield, where I was trained as a researcher and was offered a job as a teaching assistant; it has provided many opportunities for my professional development. I would like to thank a few people to whom I am very much indebted. Dr. Gordon Manson, who helped me to replace the Tinibox with the Java Sun Spot to enable the mechanical robot head CIM to be emotional, provided me with crucial technical support. Dr. Sifat Momen and Dr. Sanaz Jabbari, former PhD students in the same lab, gave me a great deal of helpful advice about my study and life in Sheffield. Furthermore, I would like to thank my friends. In particular, Mr. Weiwei Chen, Mr. Junyu Wang and Dr. Jionglong Su have inspired me with creative ideas through our many conversations. I am also indebted to all my other friends, who have offered me support without which my life in the UK would have been arduous. Dr. Yi Li and his adorable family, who kept inviting me to their home regularly and treating me hospitably, have substantially enriched my life here. Last but not least, I thank my lovely family in China, from whom I have received generous financial and spiritual support at all times.

Table of Contents

CHAPTER 1: INTRODUCTION
1.1 Robotic Facial Expressions: from Psychology to Technology
1.2 Emotion and Context
1.3 Thesis Objectives
1.4 Contributions and Relevant Publications
1.5 Organization of the Thesis

CHAPTER 2: PSYCHOLOGY OF EMOTION
2.1 Classical Theories
2.1.1 Evolutionary Theories
2.1.2 Cognitive-appraisal Theories
A. Appraisal Theories of Emotion
B. Comparing Evolutionary and Appraisal Accounts
2.1.3 Social Constructionist Theories
2.2 Novel Approaches
2.2.1 Theory of Embodying Emotion
2.2.2 Embodied Appraisal Theory
2.2.3 Social Functionalist Approach
2.2.4 Perceptual Control Theory on Emotions
2.2.5 Rolls's Theory of Emotion
2.3 Summary and Conclusion

CHAPTER 3: SYNTHETIC ROBOT FACIAL EXPRESSIONS
3.1 Social Robot Embodiment
3.1.1 Humanoids
3.1.2 Androids
3.1.3 Creature-like Social Robots
3.1.4 Non-humanoid and Non-zoomorphic Social Robots
3.2 Creating Recognizable Facial Expressions for Social Robots
3.2.1 Examples of Non-Android Robots
A. Kismet
B. Probo
3.2.2 Examples of Android Robots
A. Geminoid F
B. Jules
3.2.3 Summary of the Methods
3.3 "Social" Aspects of the Design of a Human-robot Interaction
3.3.1 Robot Attributes
3.3.2 User Attributes
3.3.3 Task Structure
3.4 Summary and Conclusion

CHAPTER 4: CONTEXTUAL EFFECTS ON JUDGMENTS ABOUT SYNTHETIC ROBOT FACIAL EXPRESSIONS
4.1 Generic Experimental Method
4.1.1 The Motivation of This Thesis
4.1.2 The Hypotheses
4.1.3 The Dependent Variable
4.1.4 The Independent Variable
4.1.5 The Experimental Conditions
4.1.6 The Number of the Experiments
4.1.7 The Generic Experimental Procedures
4.1.8 The Statistical Analysis Techniques
4.2 Experiment Speech1
4.2.1 Experimental Results
4.2.2 Discussion and Conclusion
4.3 Experiment Speech2
4.3.1 Experimental Results
4.3.2 Discussion and Conclusion
4.4 Experiment Image1
4.4.1 Experimental Results
4.4.2 Discussion and Conclusion
4.5 Experiment Image2
4.5.1 Experimental Results
4.5.2 Discussion and Conclusion
4.6 Experiment Image3
4.6.1 Experimental Results
4.6.2 Discussion and Conclusion
4.7 Experiment Music1
4.7.1 Experimental Results
4.7.2 Discussion and Conclusion
4.8 Experiment Music2
4.8.1 Experimental Results
4.8.2 Discussion and Conclusion
4.9 Experiment Music3
4.9.1 Experimental Results
4.9.2 Discussion and Conclusion
4.10 Experiment Video1
4.10.1 Experimental Results
4.10.2 Discussion and Conclusion
4.11 Experiment Video2
4.11.1 Experimental Results
4.11.2 Discussion and Conclusion
4.12 Experiment Video3
4.12.1 Experimental Results
4.12.2 Discussion and Conclusion
4.13 Summary and Conclusion

CHAPTER 5: AN ANALYSIS OF THE METHODOLOGY
5.1 A Further Experiment
5.1.1 The Motivation of the Experiment
5.1.2 The Experimental Conditions
5.1.3 Experimental Procedures
5.1.4 Experimental Results
5.1.5 Conclusion and Discussion
5.2 Possible Factors Contributing to the Contextual Effects Found in the Previous 11 Experiments
5.2.1 The Choices of the Contexts
A. The Content of the Contexts
B. The Source Clarity of the Contexts
C. The Difference Between One Modality and Multiple Modalities
5.2.2 The Rating Scheme
5.2.3 The Mood Effect
5.2.4 The Manner of Presenting the Robot Face and the Context
A. The Style of the Presentation of a Robot Face and a Context
B. The Contrast Effect
5.3 Summary and Conclusion

CHAPTER 6: PROPOSED NEW EXPERIMENTS ADDRESSING THE IMPACT OF FILM CLIPS
6.1 The Preliminary Experiment
6.1.1 The Motivation of the Preliminary Experiment
6.1.2 The Experimental Conditions
6.1.3 The Generic Experimental Procedures
6.1.4 The Statistical Analysis Techniques
6.1.5 Predictions of the Experimental Results
6.2 The Proposed Three New Experiments
6.2.1 The Motivation of the New Experiments
6.2.2 The Hypotheses
6.2.3 The Dependent Variable and the Independent Variable
6.2.4 The Experimental Conditions
6.2.5 The Generic Experimental Procedures
6.2.6 The Statistical Analysis Techniques
6.2.7 Predictions of the Experimental Results
6.3 Summary

CHAPTER 7: SUMMARY AND FUTURE WORK
7.1 Summary of the Thesis
7.2 Future Work

REFERENCES

APPENDIX Ⅰ: A LIST OF THE ACTUAL
