Analysis of Contextual Emotions Using Multimodal Data
Analysis of Contextual Emotions Using Multimodal Data

by

Saurabh Hinduja

A thesis submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy
Department of Computer Science and Engineering
College of Engineering
University of South Florida

Major Professor: Shaun Canavan, Ph.D.
Rangachar Kasturi, Ph.D.
Jeffrey F. Cohn, Ph.D.
Marvin J. Andujar, Ph.D.
Pei-Sung Lin, Ph.D.
Elizabeth Schotter, Ph.D.

Date of Approval: March 31, 2021

Keywords: Expression, Action Units, Physiological Signals

Copyright © 2021, Saurabh Hinduja

Dedication

I would like to dedicate this dissertation to my family and my niece, Ria. They have been with me every step of the way, through good times and bad. Their unconditional love, guidance, and support have helped me succeed in completing my doctoral degree. They are, and will always remain, my source of inspiration.

Acknowledgments

I would like to express my deepest gratitude to Dr. Shaun Canavan and Dr. Jeffrey F. Cohn for giving me the opportunity to explore my potential. They have been excellent mentors and constant sources of encouragement and support. Thank you for generously spending your valuable time in the insightful conversations that shaped the design and development of my research. I would also like to thank Dr. Rangachar Kasturi for motivating me to pursue my doctoral degree. I would like to thank Dr. Rangachar Kasturi, Dr. Jeffrey F. Cohn, Dr. Pei-Sung Lin, Dr. Marvin J. Andujar, and Dr. Elizabeth Schotter for agreeing to serve on my defense committee; I appreciate their valuable feedback and review comments on this work. I would also like to thank Dr. Lijun Yin for his valuable time and feedback on my publications. I would like to thank my fellow researchers, Saandeep Aathreya and Shivam Srivastava, for being a constant source of inspiration. Finally, I would like to thank my close friends Gurmeet Kaur and Aleksandrina D. Davidson for their constant support during my doctoral degree.
Table of Contents

List of Tables . iv
List of Figures . vii
Abstract . ix

Chapter 1: Introduction . 1
    1.1 Motivation . 1
    1.2 Problems and Open Questions in Affective Computing . 2
    1.3 Related Works . 3
    1.4 Contribution . 6
        1.4.1 Self-Reported Emotions . 7
        1.4.2 Impact of Action Unit Patterns . 7
        1.4.3 Multimodal Pain Detection . 8
        1.4.4 Context Recognition from Facial Movements . 8
    1.5 Organization . 8

Chapter 2: Recognizing Self-Reported Emotions from Facial Expressions . 10
    2.1 Introduction . 10
    2.2 Analysis of Self-Report Emotions . 12
        2.2.1 Dataset . 12
        2.2.2 Subject Self-Reporting . 17
    2.3 Recognizing Self-Report Emotions . 19
        2.3.1 Experimental Design . 19
            2.3.1.1 Data Preprocessing . 19
            2.3.1.2 Proposed 3D CNN . 19
            2.3.1.3 Data Validation . 21
        2.3.2 Results . 22
    2.4 Conclusion . 22

Chapter 3: Impact of Action Unit Occurrence Patterns on Detection . 24
    3.1 Introduction . 24
    3.2 Action Unit Occurrence Patterns . 30
        3.2.1 Datasets . 30
        3.2.2 Class Imbalance and AU Detection . 31
        3.2.3 Action Unit Occurrence Patterns . 37
    3.3 Impact of AU Patterns on Detection . 40
        3.3.1 Multi-AU Detection (Experiment 1) . 42
        3.3.2 Single-AU Detection (Experiment 2) . 46
    3.4 Training on AU Occurrence Patterns . 47
        3.4.1 Neural Network Architectures . 47
        3.4.2 Top 66 AU Occurrence Patterns (Experiment 3) . 48
        3.4.3 All AU Occurrence Patterns (Experiment 4) . 51
    3.5 Conclusion . 51

Chapter 4: Multimodal Fusion of Physiological Signals and Facial Action Units for Pain Recognition . 53
    4.1 Introduction . 53
    4.2 Fusion and Experimental Design . 56
        4.2.1 Multimodal Dataset (BP4D+) . 56
        4.2.2 Syncing Physiological Signals with AUs . 56
        4.2.3 Fusion of Physiological Signals and AUs . 57
        4.2.4 Pain Recognition Experimental Design . 58
    4.3 Results . 58
        4.3.1 Pain Recognition . 58
        4.3.2 Comparison to State of the Art . 60
    4.4 Discussion . 61
    4.5 Conclusion . 64

Chapter 5: Recognizing Context Using Facial Expression Dynamics from Action Unit Patterns . 65
    5.1 Introduction . 65
    5.2 Related Works . 67
        5.2.1 Action Unit Detection . 68
        5.2.2 Expression Recognition and Action Units . 69
    5.3 Modeling Temporal Dynamics with Action Unit Patterns . 70
        5.3.1 Motivation and Background . 70
        5.3.2 Modeling Temporal Dynamics . 72
            5.3.2.1 AU Occurrence Pattern Modeling . 73
            5.3.2.2 AU Intensity Pattern Modeling . 73
        5.3.3 Justification for AU Pattern Images . 74
    5.4 Datasets . 76
        5.4.1 DISFA . 76
        5.4.2 BP4D . 78
        5.4.3 BP4D+ . 78
    5.5 Experimental Design . 79
        5.5.1 Neural Network Architectures . 80
            5.5.1.1 Convolutional Neural Network . 80
            5.5.1.2 Feed-forward Fully-connected Network . 81
        5.5.2 Experiments and Metrics . 81
            5.5.2.1 DISFA . 81
            5.5.2.2 BP4D and BP4D+ . 82
        5.5.3 Metrics . 83
    5.6 Results . 84
        5.6.1 DISFA . 85
            5.6.1.1 Context . 85
            5.6.1.2 Target Emotion from Context . 87
            5.6.1.3 Target Emotion . 88
        5.6.2 BP4D and BP4D+ . 90
            5.6.2.1 BP4D Context Recognition . 90
            5.6.2.2 BP4D+ Context Recognition . 91
            5.6.2.3 BP4D vs. BP4D+ Context Recognition . 93
        5.6.3 Network Attention Maps . 96
        5.6.4 Automatic AU Detection . 99
    5.7 Discussion . 101

Chapter 6: Conclusion . 103
    6.1 Findings . 103
        6.1.1 Self-Reported Emotions . 103
        6.1.2 Impact of Action Unit Occurrence Distribution on Detection . 103
        6.1.3 Fusion of Physiological Signals and Facial Action Units for Pain Recognition . 103
        6.1.4 Recognizing Context Using Facial Expression Dynamics from Action Unit Patterns . 104
    6.2 Applications . 104
    6.3 Limitations . 105
    6.4 Future Work . 107

References . 108

Appendix A: Dictionary . 128
Appendix B: Attention Maps . 129
Appendix C: Copyright Permissions . 139

About the Author . End Page

List of Tables

Table 2.1 Task Descriptions of BP4D+ . 13
Table 2.2 Percent of subjects that felt emotion in each task. Rows are subject self-reporting, columns are tasks . 13
Table 2.3 Percentage of male subjects that felt emotion in each task . 14
Table 2.4 Percentage of female subjects that felt emotion in each task . 14
Table 2.5 Significance of differences between male and female intensity of self-reported emotion . 15
Table 2.6 Significance of differences between male and female occurrence of self-reported emotion . 16
Table 2.7 Average intensity of self-report emotions across all tasks (BP4D+) for male and female . 20
Table 2.8 Perceived emotion evaluation metrics . 21
Table 2.9 Variance and standard deviation of accuracy for recognizing perceived emotion . 22
Table 3.1 Number of AUs and Occurrence Criteria of datasets . 31
Table 3.2 Correlation between F1-binary scores and AU class imbalance . 32
Table 3.3 Standard deviation of F1-binary scores, for each individual AU, between all methods . 34
Table 3.4 BP4D AU occurrence patterns and total number of frames for each pattern . 34
Table 3.5 DISFA AU occurrence patterns and total number of frames for each pattern . 35
Table 3.6 BP4D+ AU occurrence patterns and total number of frames for each pattern . ..