Context-Based Innovative Mobile User Interfaces
Yun Zhou


To cite this version: Yun Zhou. Context-based Innovative Mobile User Interfaces. Human-Computer Interaction [cs.HC]. Ecole Centrale de Lyon, 2012. English. tel-00813532.

HAL Id: tel-00813532
https://tel.archives-ouvertes.fr/tel-00813532
Submitted on 15 Apr 2013

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

Order No. 2012-29, Year 2012

Thesis presented at Ecole Centrale de Lyon to obtain the degree of Doctor
Ecole Doctorale Informatique et Information pour la Société (EDIIS)
By Zhou Yun
Defended on 8 October 2012 before the examination committee

Jury:
Christophe KOLSKI, Pr. (UVHC), Reviewer
José ROUILLARD, MdC HDR (Univ. Lille 1), Reviewer
Patrick GIRARD, Pr. (Univ. de Poitiers), Examiner
Franck TARPIN-BERNARD, Pr. (UJF Grenoble), Examiner
Bertrand DAVID, Pr. (ECL), Supervisor
René CHALON, MdC (ECL), Co-supervisor

Acknowledgements

During the three years I spent working on this thesis, many people most kindly helped me in my everyday life and in the success of my research work here in France. First, I have been greatly fortunate to work with my supervisor, Professor Bertrand David, who always provided me with insightful comments, patiently answered all my questions, and guided me step by step in my research work.
He also provided me with opportunities to get in touch with other researchers at international conferences, kindly supporting me in many ways during my study period. It would have been impossible for me to enjoy my interesting research and achieve the goal of my thesis without his help. I would also like to thank my co-supervisor René Chalon, who kindly answered all my questions concerning programming, research problems, and other technical problems, always providing me with useful solutions. I always obtained inspiration and found the right solution after my discussions with him. My two supervisors are both excellent researchers in our field, and I hope that my graduation will not mean the end of our collaboration and cooperation. Second, I would like to heartily thank my colleagues Xu Tao, Florent Delomier, Cyril Bénazeth, Zhang Bingxue, and Jin Huiliang, who always exchanged their interesting research ideas with me and generously gave me their time to help me prepare every evaluation in my thesis. I would especially like to thank Tao, my dear friend, who always encouraged me when I was frustrated. I also want to thank the many Chinese students at ECL for their invaluable contribution to my evaluations: Zhu Chao, Li Huibin, Zhang Taiping, Feng Zhenfu, Zhu Nanhao, Ye Fanbin, Lv Ying, and others. Third, I would like to specially thank Professor Christophe Kolski and Associate Professor José Rouillard for having agreed to be reviewers of this thesis. They helped me improve the manuscript and gave me many helpful comments. I also thank the other members of the jury, Professor Patrick Girard and Professor Franck Tarpin-Bernard, for having accepted to participate in the defense and evaluate the work in this thesis. Finally, I really want to thank my dear parents, who live in China but always gave me their selfless love, immeasurable sacrifices, and incessant encouragement.
I also appreciate all the support and encouragement from my relatives, French teachers, and friends in France, in China, and all over the world. Without them, I could not have made the progress I have.

Abstract

With the development of a wide variety of sensors and devices, computing is no longer limited to the desktop mode. However, the traditional user interface used on the desktop computer is no longer appropriate for ubiquitous computing. A sophisticated mobile environment requires dedicated interface design, involving input and output techniques with new emerging features that go far beyond the capacities of traditional techniques. One solution to enable ubiquitous interaction and overcome the limitations of the desktop mode is nomadism; another is mobility. We propose three interfaces related to these two solutions: the In-environment Interface (IEI), the Environment Dependent Interface (EDI), and the Environment Independent Interface (EII). We exclude IEI and mainly focus on wearable interaction. This thesis aims to investigate the research issues involved in the design, implementation, and evaluation of EDI and EII. It presents our design approach to these three innovative interfaces (IEI, EDI, and EII), their wearable configurations (a camera-glasses device unit and a camera-projector device unit), real examples of use (including the Research Team Interaction Scenario), and both quantitative and qualitative user studies and evaluations to prove the feasibility and usability of our prototypes. Our work is a many-sided investigation of innovative wearable interfaces, as well as input and output techniques, which will pave the way for future research into wearable interfaces.
Table of Contents

1 Introduction
  1.1 Introduction
  1.2 Motivation and Contributions
  1.3 Thesis Structure
2 Related Work
  2.1 Introduction
  2.2 Wearable Interaction
    2.2.1 History of Wearable Computing
    2.2.2 Wearable Computing Applications and Scenarios
    2.2.3 New Advances in Wearable Computing
  2.3 Vision-based Hand Input Techniques
    2.3.1 Hand Gesture Recognition Technology
    2.3.2 Vision-based Hand-gesture Interaction
    2.3.3 Applications and Requirements of Design
  2.4 Visual Output Techniques
    2.4.1 Head-worn Display (HMD)
    2.4.2 The Personal Projector
      2.4.2.1 Miniaturization of Projection
      2.4.2.2 Personal Projector Interaction
      2.4.2.3 Property of Scalability
      2.4.2.4 Social Issues on Projection
  2.5 Related Research Areas
    2.5.1 Augmented Reality (AR) and Mobile AR
    2.5.2 Ubiquitous Computing and Always-Available Mobile Interaction
    2.5.3 Tangible User Interface (TUI) and Marker-based Interaction
  2.6 Summary
3 Innovative User Interfaces in Augmented Environment
  3.1 Introduction
  3.2 Overview of Innovative User Interfaces
  3.3 Design of EDI and EII
    3.3.1 Environment Dependent Interface
    3.3.2 Environment Independent Interface
  3.4 Light Mobile User Interfaces
    3.4.1 An In-environment and Fixed Interaction Support
    3.4.2 Mobile Environment Dependent Interaction
  3.5 Scenarios and Applications
    3.5.1 Environment Dependent Interface Applications
    3.5.2 Environment Independent Interface Applications
  3.6 Evaluation of Selection Techniques
  3.7 Summary
4 Paper-based Interfaces for Mobile Interactions
  4.1 Introduction
  4.2 MobilePaperAccess
    4.2.1 Input Techniques
    4.2.2 Paper Surface
  4.3 Implementation
    4.3.1 Augmented Paper
    4.3.2 Goggle with Small Screen
    4.3.3 Motion of Finger and Mask
  4.4 Research Team Management Application (RTMA)
  4.5 User Study
    4.5.1 Participants
    4.5.2 Procedure
    4.5.3 Variables
  4.6 Main Results
    4.6.1 Interaction Time
    4.6.2 Access Time
    4.6.3 Interaction Errors
    4.6.4 User Satisfaction
    4.6.5 User Comments
  4.7 Discussions
  4.8 Summary
5 Wearable One-hand Gesture Input and Scalable Projected Interface
  5.1 Introduction
  5.2 Overview of Camera-projector Interaction Techniques
    5.2.1 Scalable Projected Interfaces
      5.2.1.1 Reference-cell and Scalability Threshold
      5.2.1.2 Process from Application Tasks to Scalable Interfaces
    5.2.2 Hover Input Technique
  5.3 Implementation
    5.3.1 Wearable Configuration
    5.3.2 Recognition of Hover Gesture
    5.3.3 Auto-calibration of Projector and Camera Coordinates
    5.3.4 Depth Sensing via ARToolKit Tags
  5.4 Research Team Interaction System (RTIS)
    5.4.1 RTIS Scenario
    5.4.2 Scalable Interface Creation Process
  5.5 User Study
    5.5.1 Questions for Hover Gesture
    5.5.2 Questions for Scalable Interface
    5.5.3 Participants and Procedure
  5.6 Results and Findings
    5.6.1 Results on Hover Gesture
    5.6.2 Results on Scalable Interface
      5.6.2.1 Interaction Time
      5.6.2.2 User Preference
      5.6.2.3 User Comments on the Situations
  5.7 Discussions
  5.8 Summary
6 Wearable Interaction Using Hand Gestures
  6.1 Introduction
  6.2 Interaction Techniques
    6.2.1 Pinch-gesture-based Interaction
    6.2.2 Hand-gesture-based Interaction
    6.2.3 Items in Projected Interface
  6.3 Wearable Configuration and Implementation
    6.3.1 Head-worn Configuration
    6.3.2 Auto-calibration of Projector and Camera Coordinates
    6.3.3 Recognition of Hover Gesture and Pinch Gesture
    6.3.4 Recognition of Fist-palm Gesture
  6.4 Prototype of Research Team Interaction System (RTIS)
  6.5 User Study
    6.5.1 Participants
    6.5.2 Procedure
    6.5.3 Variables
    6.5.4 Errors
  6.6 Study Results
    6.6.1 Interaction Time
    6.6.2 Average Interaction Time
    6.6.3 Task Completion Time
    6.6.4 Errors
    6.6.5 User Satisfaction and Preference
  6.7 Discussions
  6.8 Summary
7 Physical-digital Mixed Interface
  7.1 Introduction
  7.2 Basic Continuum