DEVELOPMENT OF EMOTION EXPRESSIVE HUMANOID HEAD

BY

MOHD FARID BIN MD ALIAS

A dissertation submitted in fulfilment of the requirement for the degree of Master of Science (Mechatronics Engineering)

Kulliyyah of Engineering
International Islamic University Malaysia

MARCH 2012

ABSTRACT

Advances in robotics technology have paved the way for the innovation of sophisticated robots, such as humanoids, with improved capacity and capability to perform tasks. As humanoids are gradually adapted into human environments, interactions with humans are inevitable, and there is thus a need to develop human-friendly humanoids, particularly ones with expressive faces to portray emotions. These robots, known as humanoid heads, can have an anthropomorphic (human-like) or iconic (cartoonish) look. Iconic humanoid heads are generally less prone to the anthropomorphism pitfalls of Mori's Uncanny Valley theory. However, in terms of facial elements, some iconic humanoid head designs tend to exclude mouth cues even though the mouth region can be effective in conveying emotional displays. In this thesis, a new humanoid head design with a unique mouth mechanism is presented. The humanoid head, named AMIR-III, is designed and developed based on the AMIR Model of Facial Expression (AMEr). AMIR-III possesses 18 DOFs in total and two eyes with built-in cameras. Together with other facial cues, its 3-DOF mouth forms the basic facial expressions of AMIR-III, namely happy, sad, angry and surprised. An expression recognition survey found the facial expressions of AMIR-III to be generally recognizable to human beings, with recognition rates on par with those of other humanoid heads apart from a slightly lower rate for its happy expression.


ABSTRACT IN ARABIC

Advances in robot technology have paved the way for the innovation of sophisticated robots with improved capacity and capability to perform tasks, such as humanoids. As these robots are gradually adapted into human environments, interaction with humans is inevitable, and there is therefore a need to develop human-friendly robots, particularly with expressive faces to portray emotions. These robots, known as humanoid heads, may have an anthropomorphic (human-like) or iconic (cartoonish) appearance. Iconic humanoid heads are generally less prone to the anthropomorphism pitfalls of Mori's Uncanny Valley theory. However, in terms of facial elements, some iconic humanoid head designs tend to exclude mouth cues, even though the mouth region can be effective in conveying emotional displays. In this thesis, a new humanoid head design with a unique mouth mechanism is presented. The humanoid head, named AMIR-III, possesses 18 DOFs in total and two eyes with built-in cameras. Together with the other facial cues, its 3-DOF mouth forms the basic facial expressions of AMIR-III, namely happy, sad, angry and surprised. Based on an expression recognition survey, the facial expressions of AMIR-III are generally recognizable to humans, with recognition rates on par with those of other humanoid heads, except for its happy expression, which had a relatively lower recognition rate of 60 percent. This is presumed to be due to the mouth contour of the real happy expression compared with its simulated version, which possibly generated differing interpretations among the respondents.


APPROVAL PAGE

I certify that I have supervised and read this study and that in my opinion, it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Master of Science in Mechatronics Engineering.

………………………………... Amir Akramin Shafie Supervisor

………………………………... Nahrul Khair Alang Md. Rashid Co-Supervisor

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Master of Science in Mechatronics Engineering.

……………………………….. Md. Raisuddin Khan Examiner

This dissertation was submitted to the Department of Mechatronics and is accepted as a fulfilment of the requirement for the degree of Master of Science in Mechatronics Engineering.

……………………………….. Asan Gani Abdul Muthalif Head, Department of Mechatronics

This dissertation was submitted to the Kulliyyah of Engineering and is accepted as a fulfilment of the requirement for the degree of Master of Science in Mechatronics Engineering.

……………………………… Amir Akramin Shafie Dean, Kulliyyah of Engineering


DECLARATION

I hereby declare that this dissertation is the result of my own investigations, except where otherwise stated. I also declare that it has not been previously or concurrently submitted as a whole for any other degree at IIUM or other institutions.

Mohd Farid Bin Md Alias

Signature …………………………………… Date ………….……………..


INTERNATIONAL ISLAMIC UNIVERSITY MALAYSIA

DECLARATION OF COPYRIGHT AND AFFIRMATION OF FAIR USE OF UNPUBLISHED RESEARCH

Copyright © 2012 by International Islamic University Malaysia. All rights reserved.

DEVELOPMENT OF EMOTION EXPRESSIVE HUMANOID HEAD

I hereby affirm that the International Islamic University Malaysia (IIUM) holds all rights in the copyright of this Work, and henceforth any reproduction or use in any form or by any means whatsoever is prohibited without the written consent of IIUM. No part of this unpublished research may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission of the copyright holder.

Affirmed by MOHD FARID BIN MD ALIAS

……………………………. ...……………….. Signature Date


To my beloved wife Nurul Izzah, my daughter Nurul Farihah and my son Ammar Faiz


ACKNOWLEDGEMENTS

All praise to Allah SWT, who has given me the strength, wisdom and patience to finally complete this dissertation, alhamdulillah. Here I wish to acknowledge the people who have contributed their valuable assistance in the preparation and completion of this dissertation. First and foremost, I am heartily thankful to my main supervisor, Assoc. Prof. Dr. Amir Akramin Shafie, for your valuable guidance and support from the start of this research until its completion, and to my co-supervisor, Assoc. Prof. Dr. Nahrul Khair Alang Md Rashid, for your encouragement and understanding. To my wife Nurul Izzah Sidek, my daughter Nurul Farihah and my son Ammar Faiz, I really appreciate your love and understanding during my busy research activities. To my scholarship sponsor, the Ministry of Higher Education (MoHE), I really appreciate the financial support which helped sustain my living as a student. To my research teammates, Aseef Iqbal, Jamil Radhi, Mehdi and Hamizah, it was indeed an enjoyable experience working together with all of you. I owe my gratitude to Brother Shahlan Dalil of the Mechatronics Workshop, Brother Ramly of the CNC Milling Lab and Brother Hafizul Zikri of the Software Lab for allowing and assisting me to use your lab facilities in the course of this research. Special thanks to Prof. Momoh Jimoh E. Salami for his kind review of my final dissertation draft, particularly in terms of formatting. I would like to thank Asst. Prof. Dr. Aisha Hassan Abdalla Hashim for helping me to write my abstract in Arabic. I also wish to express my deepest gratitude to my parents and parents-in-law for your continuous support, dedication, comprehension and love. Lastly, thank you to all of those who supported me in any respect during the completion of this dissertation.


TABLE OF CONTENTS

Abstract
Abstract in Arabic
Approval Page
Declaration Page
Copyright Page
Dedication
Acknowledgements
List of Tables
List of Figures
List of Abbreviations
List of Symbols

CHAPTER 1: INTRODUCTION
    1.1 Overview
    1.2 Problem Statement and Its Significance
    1.3 Research Objectives
    1.4 Scope of Research
    1.5 Research Methodology
    1.6 Dissertation Outline

CHAPTER 2: LITERATURE REVIEW
    2.1 Introduction
    2.2 Anatomy of Human Head and Neck
        2.2.1 The Skeletal System of Human Head and Neck
        2.2.2 The Muscular System of Human Head and Neck
        2.2.3 Human Vision
    2.3 Anthropometric Data
        2.3.1 Static Anthropometric Data
        2.3.2 Dynamic Anthropometric Data
    2.4 Facial Action Coding System (FACS)
    2.5 HRI Experiment
    2.6 Humanoid Heads
        2.6.1 WE-4RII Humanoid Head
        2.6.2 iCub Humanoid Head
        2.6.3 Nexi MDS Humanoid Head
        2.6.4 Flobi Humanoid Head
        2.6.5 Meka S1 Humanoid Head
        2.6.6 BERT2 Humanoid Head
        2.6.7 AMIR-I Humanoid Head
        2.6.8 AMIR-II Humanoid Head
    2.7 Comparison of Humanoid Heads
    2.8 Summary

CHAPTER 3: MODELLING AND DESIGN OF HUMANOID HEAD
    3.1 Introduction
    3.2 AMIR Model of Facial Expression (AMEr)
        3.2.1 Eyebrow Movement Codes
        3.2.2 Eyelid Movement Codes
        3.2.3 Mouth Movement Codes
        3.2.4 Overall Facial Movement Codes
        3.2.5 Emotional Interpretation of Facial Expression
    3.3 DOF Configuration
        3.3.1 Eyebrow DOFs
        3.3.2 Eyelid DOFs
        3.3.3 Eye DOFs
        3.3.4 Mouth DOFs
        3.3.5 Neck DOFs
        3.3.6 Overall DOF Configuration
    3.4 Mechanical Design
        3.4.1 Actuator Selection
        3.4.2 CAD Software
        3.4.3 Face Casing Design
        3.4.4 Eyebrow Design
        3.4.5 Eyelid Design
        3.4.6 Mouth Design
        3.4.7 Eye Design
        3.4.8 Neck Design
        3.4.9 Complete 3D Model
        3.4.10 Fabrication and Assembly Process
    3.5 Electrical and Electronics Design
        3.5.1 Central Processing Unit (CPU)
        3.5.2 Power Supply
        3.5.3 Communication
    3.6 Software Design
        3.6.1 Commanding the Servos
        3.6.2 Face Detection
        3.6.3 Control System
        3.6.4 Program
    3.7 Kinematic Model
        3.7.1 2-DOF Head-Camera (Eye Restricted)
    3.8 Summary

CHAPTER 4: RESULTS AND DISCUSSION
    4.1 Introduction
    4.2 Expression Recognition Assessment Setup
    4.3 Results and Discussion
        4.3.1 Recognition Rate for Simulated and Real Facial Expressions
        4.3.2 Respondent Agreements toward Suggested Expression Labels
        4.3.3 Comparison of Recognition Rates with Other Humanoid Heads
    4.4 Summary

CHAPTER 5: CONCLUSION AND RECOMMENDATION
    5.1 Conclusion
    5.2 Recommendation

BIBLIOGRAPHY

LIST OF PUBLICATIONS

APPENDIX A: DYNAMIXEL AX-12+ DATA SHEET
APPENDIX B: MICROSOFT LIFECAM CINEMA DATA SHEET
APPENDIX C: DYNAMIXEL RX-64 DATA SHEET
APPENDIX D: EXPRESSION RECOGNITION SURVEY QUESTIONNAIRE


LIST OF TABLES

2.1 Definition and male percentile values of human's head characteristics according to the assigned numbers

2.2 Neck movement ranges

2.3 List of FACS AUs with descriptions and muscular basis

2.4 Comparisons of detailed specifications and expression recognition rates among selected humanoid heads

3.1 Corresponding AUs between FACS and AMEr

3.2 AMEr emotional interpretations with their corresponding AU combinations

3.3 Joint names with their respective labels, servo IDs and soft operating ranges

3.4 D-H parameters for 2-DOF neck-camera model

4.1 Comparison of expression recognition rates among WE-4RII, Flobi, BERT2 and AMIR-III


LIST OF FIGURES

1.1 Current humanoid robots

1.2 Earlier humanoid heads

1.3 Examples of humanoid heads with different appearance themes

1.4 Uncanny Valley Theory

1.5 Recent humanoid heads

1.6 Research methodology flowchart

2.1 Bones of human skull

2.2 Cervical vertebrae from lateral view

2.3 Human face muscles from anterior view

2.4 Illustrated head characteristics of a male with their assigned numbers

2.5 Illustrated neck movements

2.6 Some of FACS AU locations on human face

2.7 WE-4RII humanoid head

2.8 WE-4RII's four basic facial expressions plus the neutral expression

2.9 iCub humanoid head

2.10 Various iCub facial expressions with LED-projected eyebrows and mouth

2.11 Nexi MDS humanoid head

2.12 Flobi humanoid head

2.13 Flobi's five basic facial expressions plus the neutral expression

2.14 Meka S1 humanoid head

2.15 BERT2 humanoid head

2.16 AMIR-I humanoid head

2.17 AMIR-II humanoid head

2.18 AMIR-II expressing four basic facial expressions plus the neutral one

3.1 Five-step morphology from a human face to the conceptual iconic AMIR-III

3.2 AMIR-III conceptual face design

3.3 Corresponding eyebrow AUs between FACS and AMEr

3.4 Corresponding eyelid AUs between FACS and AMEr

3.5 Corresponding mouth AUs between FACS and AMEr

3.6 Whole AMEr AU assignments on AMIR-III's face

3.7 Resultant AMIR-III facial expressions as respective AU combinations are simulated on its face

3.8 Dynamixel AX-12+ servo

3.9 AMIR-III face casing with its dimensions

3.10 AMIR-III eyebrow mechanism

3.11 Rear view of eyebrow tilting mechanism and the parameters involved in its torque calculation

3.12 AMIR-III eyelid part consisting of four four-bar linkages

3.13 AMIR-III eyelid tilting mechanism

3.14 AMIR-III eyelid rolling mechanism

3.15 AMIR-III mouth mechanism

3.16 AMIR-III side lip submechanism

3.17 Rear view of AMIR-III side lip

3.18 AMIR-III jaw mechanism

3.19 Microsoft LifeCam Cinema

3.20 Arrangement of all AMIR-III head parts excluding the eye part

3.21 AMIR-III eye mechanism

3.22 AMIR-III serial neck mechanism

3.23 Head weight acting on neck tilt joint

3.24 AMIR-III complete 3D model

3.25 Painted face casing with smooth and glossy finish

3.26 Photos of AMIR-III prototype

3.27 SMPS2Dynamixel module

3.28 Close-up view of USB2Dynamixel with parts labelled

3.29 Conceptual block diagram of AMIR-III face tracking system

3.30 AMIR-III Matlab-based graphical user interface

3.31 AMIR-III DOF configuration

3.32 Reference frames for the neck-eye-camera model

4.1 Pictures of AMIR-III simulated facial expressions shown sequentially to the respondents from left to right

4.2 AMIR-III facial expressions portrayed sequentially to the respondents from left to right

4.3 Respondent perceptions toward simulated AMIR-III facial expressions

4.4 Respondent perceptions toward real AMIR-III facial expressions

4.5 Respondent agreements toward suggested expression labels


LIST OF ABBREVIATIONS

AI      Artificial intelligence
AMEr    AMIR Model of Expression
AU      Action unit
COG     Centre of gravity
DOF     Degree of freedom
FACS    Facial Action Coding System
FOV     Field of view
FPS     Frames per second
GUIDE   Graphical User Interface Development Environment
HRI     Human-robot interaction
KAIST   Korea Advanced Institute of Science and Technology
LCD     Liquid crystal display
LED     Light emitting diode
MIT     Massachusetts Institute of Technology
OIML    International Organization of Legal Metrology
WE      Waseda Eye
WU      Waseda University


LIST OF SYMBOLS

iTj     Total transformation from joint i to joint j
Ai      Transformation matrix i
ai      Translation distance i along x-axis
C       Cosine function
di      Translation distance i along z-axis
Fi      Force i
Ji      Jacobian i
S       Sine function
α       Rotation about x-axis
θi      Angle i
τi      Torque i
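For reference, these symbols combine according to the standard Denavit-Hartenberg (D-H) convention; the textbook form of the link transformation is shown below as a reminder (the thesis's own 2-DOF neck-camera derivation appears in Chapter 3 and may order or label terms slightly differently):

```latex
% Standard D-H link transformation (textbook form)
A_i = \mathrm{Rot}_{z}(\theta_i)\,\mathrm{Trans}_{z}(d_i)\,
      \mathrm{Trans}_{x}(a_i)\,\mathrm{Rot}_{x}(\alpha_i)
    = \begin{bmatrix}
        C\theta_i & -S\theta_i C\alpha_i &  S\theta_i S\alpha_i & a_i C\theta_i \\
        S\theta_i &  C\theta_i C\alpha_i & -C\theta_i S\alpha_i & a_i S\theta_i \\
        0         &  S\alpha_i           &  C\alpha_i           & d_i           \\
        0         &  0                   &  0                   & 1
      \end{bmatrix},
\qquad
{}^{i}T_{j} = A_{i+1} A_{i+2} \cdots A_{j}
```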


CHAPTER 1

INTRODUCTION

1.1 OVERVIEW

Advances in robotics technology have paved the way for the innovation of sophisticated robots with improved capacity and capability to perform tasks. One such complex robotic system is the humanoid robot: a kind of robot which possesses human-inspired physicality and cognition.

Currently, there is much research on the development of humanoid robots by organizations and institutions worldwide. For example, Honda Motor Corporation (2010) has committed its resources to developing its own humanoid robot since the 1980s. Its latest complete humanoid robot, known as ASIMO and shown in Figure 1.1(a), is equipped with advanced physical mobility, environment adaptability and artificial intelligence. Honda ASIMO has now acquired the ability to walk at 2.7 km/h, run at 6 km/h, climb stairs, carry items, recognize people and avoid obstacles. Other notable full-body humanoid robots are KHR-3 (Park, Kim, Lee & Oh, 2005) of the Korea Advanced Institute of Science and Technology (KAIST), SURENA-II (Guizzo, 2010) of Tehran University, and WABIAN-2R (Omer, Ghorbani, Lim & Takanishi, 2009) of Waseda University (WU), as shown in Figure 1.1(b), (c) and (d) respectively.

Due to their human-like physical embodiments, humanoid robots could possibly be developed to produce human-like movements such as walking, running, handling objects and jumping (Nunez & Nelly, 2006). With such physical mobility, humanoid robots could assist humans in various tasks, especially those that are dangerous, repetitive, tedious, or precise in nature, thus extending human potential to achieve better work performance. Furthermore, humanoid robots possess the capacity to fit into many workspaces and tools designed for human use, hence fewer modifications to current human working environments are required for humanoid robots to operate.


Figure 1.1: Current humanoid robots. From left to right: (a) ASIMO (Honda Corporation, 2010) (b) KHR-3 (Park et al., 2005) (c) SURENA-II (Guizzo, 2010) (d) WABIAN-2R (Omer et al., 2009)

As humanoid robots are gradually adapted into human environments, interactions with humans are inevitable and important, as is an understanding of human needs and feedback. This has led researchers to study human-robot interaction (HRI) in order to generate safe, effective, efficient, and friendly interaction. Human safety must be guaranteed if humanoid robots are to operate in close proximity to humans: the robots should not pose any injury risk to the people around them and should possess their own fail-safe mechanisms in case of any system failure. They should also be capable of executing their task objectives effectively and efficiently as commanded by humans. Besides that, the robots are also expected to be human-friendly, which means that they are capable of "easily receiving commands from humans and reporting the execution information in a proper human way" (Galindo, Fernandez-Madrigal & Gonzalez, 2007: 120).

To be human-friendly, humanoid robots must be endowed with human-inspired cognitive capacities to adapt themselves to the dynamics of their tasks, situations and the humans involved. Moreover, the robots also require user interfaces to communicate their thoughts and emotions to humans. Such a purpose can be fulfilled by the head part of the humanoid robot, equipped with an expressive face and human-like modalities such as seeing (vision) and hearing (audio). By having an expressive face, a robot could interact more effectively with humans, since facial expression accounts for 55% of communication effect compared to the purely verbal component (7%), as suggested by Mehrabian and Friar (1969).

A humanoid head is a robotic head which possesses human-like physicality, modalities and cognition. Active development of modern humanoid heads started in the 1990s. Among the pioneering humanoid heads are KISMET (Breazeal & Scassellati, 1999) of the Massachusetts Institute of Technology (MIT) and Waseda Eye (WE) No. 2, or WE-2 (Takanishi, Matsuno & Kato, 1997), of WU. KISMET, as shown in Figure 1.2(a), could display 9 basic facial expressions, while WE-2, as shown in Figure 1.2(b), could only realize coordinated head-eye motion using the vestibulo-ocular reflex (Takanishi Lab of WU, 2006). The internal mechanisms of those early robots were visible and without any face casings, since the development of the mechanical system was the main focus at that time. Gradually, more sensors and actuators were integrated into newer humanoid heads. WE-3RIII of the WE robot series, as shown in Figure 1.2(c), for example, has been equipped with auditory, cutaneous, tactile and temperature sensors (Takanishi, Sato, Segawa, Takanobu & Miwa, 2000). Presently, the designs of outer appearance have become more realistic and aesthetic with the use of elastic artificial skins and 3D printers. For instance, Hanson Robotics uses its own novel porous skin rubber material, called Frubber, for the skin of its humanoid heads (Hanson, Olney, Perreira & Zielke, 2005).


Figure 1.2: Earlier humanoid heads. From left to right: (a) KISMET (Bryant, 2010) (b) WE-2 (Takanishi Lab of WU, 2006) (c) WE-3RIII (Takanishi Lab of WU, 2006)

In general, there are two themes of humanoid head appearance, namely the anthropomorphic (human-like) theme and the iconic (cartoonish) theme. An anthropomorphic humanoid head, for example Albert HUBO (Oh et al., 2006) of KAIST as shown in Figure 1.3(a), is generally characterized by the presence of artificial hair and elastic skin, which makes it appear very much like a living human head. On the other hand, an iconic humanoid head, for instance Flobi (Lutkebohle et al., 2010) of Bielefeld University as shown in Figure 1.3(b), basically has a fixed face casing of cartoonish look.



Figure 1.3: Examples of humanoid heads with different appearance themes. (a) Albert HUBO (anthropomorphic) (Oh et al., 2006) (b) Flobi (iconic) (Lutkebohle et al., 2010)

One preliminary design issue that needs to be addressed is which appearance theme to select: anthropomorphic or iconic? To decide on this matter, it is useful to consider the Uncanny Valley theory by Mori (1970), as graphed by him and translated by MacDorman and Minato (2005), shown in Figure 1.4. The theory is summarized by Bartneck, Kanda, Ishiguro and Hagita (2007) as follows:

The more human-like robots become in appearance and motion, the more positive the humans' emotional reactions towards them become. This trend continues until a certain point is reached beyond which the emotional responses quickly become negative. As the appearance and motion become indistinguishable from humans the emotional reactions also become similar to the ones towards real humans. When the emotional reaction is plotted against the robots' level of anthropomorphism, a negative valley becomes visible and is commonly referred to as the uncanny valley.

From the aforementioned statement, the negative valley can be explained as the eerie sensation that a human feels towards robots with anthropomorphic looks. As MacDorman and Ishiguro (2006) explained, such robots act as a reminder of mortality. Even though the theory continues to be a subject of discussion and research, the anthropomorphism pitfall it suggests is a consequence to be avoided, and thus the cartoonish theme is a safe option for designing a humanoid head.


Figure 1.4: Uncanny Valley Theory (Mori (1970), translated by MacDorman and Minato (2005))

1.2 PROBLEM STATEMENT AND ITS SIGNIFICANCE

To be effectively expressive, that is, to have facial expressions which are recognizable from human perception, the facial characteristics of a humanoid robot are the key factor. Typically, a humanoid head with more complete facial cues, such as WE-4RII (Itoh et al., 2004) of WU as shown in Figure 1.5(a) and Nexi MDS (Lee & Breazeal, 2010) of MIT as shown in Figure 1.5(b), which possess eyebrows, eyelids, eyes and a mouth, would provide more clues to human beings about its emotional states compared to a humanoid head with only a pair of eyes and eyelids, like Meka S1 (Meka Robotics, 2010) as shown in Figure 1.5(c). Therefore, having more complete facial cues would increase the likelihood of an expressive humanoid head being correctly recognized by humans.



Figure 1.5: Recent humanoid heads. From left to right: (a) WE-4RII (Takanishi Lab of WU, 2006) (b) Nexi MDS (MIT Media Lab, n.d.) (c) Meka S1 (Meka Robotics, 2010)

The mouth in particular is an important facial cue for an expressive humanoid head. According to Koda, Nakagawa, Tabuchi and Ruttkay (2010), the mouth region is more effective in conveying the emotions of facial expressions than the eye region. However, designing the mouth of an iconic humanoid head so that it can morph into different shapes as a human mouth does is essentially complicated. Mouth lips made from flexible materials such as rubbers or silicones can form a wide range of mouth shapes; however, the required number of mouth actuators is considerably high (more than 3), as in WE-4RII (5 DOFs) (Takanishi Lab of WU, 2006) and Flobi (6 DOFs) (Lutkebohle et al., 2010). The 3-DOF movable jaw of Nexi MDS (Lee & Breazeal, 2010) is a fairly simple mouth mechanism; however, when the jaw is tilted to the left or right, it is unclear whether the mouth forms a smiling or a frowning look, which could confuse humans trying to interpret its expressions. While more DOFs could mean a more expressive mouth, such a design would consequently require more actuators and thus more weight to be loaded onto the humanoid head.
