Context-based Innovative Mobile User Interfaces
Yun Zhou

To cite this version:
Yun Zhou. Context-based Innovative Mobile User Interfaces. Human-Computer Interaction [cs.HC]. Ecole Centrale de Lyon, 2012. English. tel-00813532

HAL Id: tel-00813532
https://tel.archives-ouvertes.fr/tel-00813532
Submitted on 15 Apr 2013

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

Order No. 2012-29, Year 2012

Thesis: Context-based Innovative Mobile User Interfaces

Presented at Ecole Centrale de Lyon to obtain the degree of Doctor, Ecole Doctorale Informatique et Information pour la Société (EDIIS), by Zhou Yun. Defended on 8 October 2012 before the examination committee.

Jury:
Christophe KOLSKI, Pr. (UVHC), Reviewer
José ROUILLARD, MdC HDR (Univ. Lille 1), Reviewer
Patrick GIRARD, Pr. (Univ. de Poitiers), Examiner
Franck TARPIN-BERNARD, Pr. (UJF Grenoble), Examiner
Bertrand DAVID, Pr. (ECL), Supervisor
René CHALON, MdC (ECL), Co-supervisor

Acknowledgements

During the three years I spent working on this thesis, many people most kindly helped me in my everyday life and in the success of my research work here in France.

First, I have been greatly fortunate to be a student working with my supervisor Professor Bertrand David, who always provided me with insightful comments, patiently answered all my questions, and guided me step by step in my research work. He also provided me with opportunities to get in touch with other researchers at international conferences, kindly supporting me in many ways during my study period. It would have been impossible for me to enjoy my interesting research and achieve the goal of my thesis without his help. I would also like to thank my co-supervisor René Chalon, who kindly answered all my questions concerning programming, research problems, and other technical problems, always providing me with useful solutions. I always obtained inspiration and found the right solution after my discussions with him. My two supervisors are both excellent researchers in our field, and I hope that my graduation will not mean the end of our collaboration and cooperation.

Second, I would like to heartily thank my colleagues Xu Tao, Florent Delomier, Cyril Bénazeth, Zhang Bingxue, and Jin Huiliang, who always exchanged their interesting research ideas with me and generously gave me their time to help me prepare every evaluation in my thesis. I especially would like to thank Tao, my dear friend, who always encouraged me when I got frustrated. I also want to thank the many Chinese students at ECL for their invaluable contribution to my evaluations: Zhu Chao, Li Huibin, Zhang Taiping, Feng Zhenfu, Zhu Nanhao, Ye Fanbin, Lv Ying, etc.

Third, I would like to specially thank Professor Christophe Kolski and Associate Professor José Rouillard for having agreed to be reviewers of this thesis. They helped me improve the manuscript and gave me many helpful comments.
I also thank the other members of the jury, Professor Patrick Girard and Professor Franck Tarpin-Bernard, for having accepted to participate in the defense and to evaluate the work in this thesis.

Finally, I really want to thank my dear parents, who live in China but always gave me their selfless love, immeasurable sacrifices, and incessant encouragement. I also appreciate all the support and encouragement from my relatives, French teachers, and friends in France, in China, and all over the world. Without them, I could not have made the progress I have.

Abstract

With the development of a wide variety of sensors and devices, computing is no longer limited to the desktop mode. However, the traditional user interface, designed for the desktop computer, is no longer appropriate for ubiquitous computing. A sophisticated mobile environment requires dedicated design of interfaces, involving input and output techniques with new emerging features that go far beyond the capacities of traditional techniques. One solution to enable ubiquitous interaction and overcome the limitations of the desktop mode is nomadism; another is mobility. We propose three interfaces related to these two solutions: the In-Environment Interface (IEI), the Environment Dependent Interface (EDI), and the Environment Independent Interface (EII). We exclude IEI and mainly focus on wearable interaction.

This thesis aims to investigate the research issues involved in the design, implementation, and evaluation of EDI and EII. It presents our design approach to these three innovative interfaces (IEI, EDI, and EII), their wearable configurations (camera-glasses device unit and camera-projector device unit), real examples of use (including the Research Team Interaction Scenario), and both quantitative and qualitative user studies and evaluations to prove the feasibility and usability of our prototypes. Our work is a many-sided investigation of innovative wearable interfaces, as well as input and output techniques, which will pave the way for future research into wearable interfaces.
Table of Contents

1 Introduction 19
  1.1 Introduction 19
  1.2 Motivation and Contributions 19
  1.3 Thesis Structure 23
2 Related Work 25
  2.1 Introduction 25
  2.2 Wearable Interaction 26
    2.2.1 History of Wearable Computing 26
    2.2.2 Wearable Computing Applications and Scenarios 29
    2.2.3 New Advances in Wearable Computing 30
  2.3 Vision-based Hand Input Techniques 30
    2.3.1 Hand Gesture Recognition Technology 31
    2.3.2 Vision-based Hand-gesture Interaction 32
    2.3.3 Applications and Requirements of Design 34
  2.4 Visual Output Techniques 34
    2.4.1 Head-worn Display (HMD) 34
    2.4.2 The Personal Projector 37
      2.4.2.1 Miniaturization of Projection 37
      2.4.2.2 Personal Projector Interaction 38
      2.4.2.3 Property of Scalability 42
      2.4.2.4 Social Issues on Projection 43
  2.5 Related Research Areas 43
    2.5.1 Augmented Reality (AR) and Mobile AR 44
    2.5.2 Ubiquitous Computing and Always-Available Mobile Interaction 45
    2.5.3 Tangible User Interface (TUI) and Marker-based Interaction 46
  2.6 Summary 48
3 Innovative User Interfaces in Augmented Environment 49
  3.1 Introduction 49
  3.2 Overview of Innovative User Interfaces 50
  3.3 Design of EDI and EII 52
    3.3.1 Environment Dependent Interface 52
    3.3.2 Environment Independent Interface 54
  3.4 Light Mobile User Interfaces 55
    3.4.1 An In-environment and Fixed Interaction Support 56
    3.4.2 Mobile Environment Dependent Interaction 57
  3.5 Scenarios and Applications 60
    3.5.1 Environment Dependent Interface Applications 61
    3.5.2 Environment Independent Interface Applications 62
  3.6 Evaluation of Selection Techniques 63
  3.7 Summary 65
4 Paper-based Interfaces for Mobile Interactions 67
  4.1 Introduction 67
  4.2 MobilePaperAccess 68
    4.2.1 Input Techniques 68
    4.2.2 Paper Surface 69
  4.3 Implementation 70
    4.3.1 Augmented Paper 71
    4.3.2 Goggle with Small Screen 72
    4.3.3 Motion of Finger and Mask 72
  4.4 Research Team Management Application (RTMA) 73
  4.5 User Study 73
    4.5.1 Participants 74
    4.5.2 Procedure 74
    4.5.3 Variables 75
  4.6 Main Results 75
    4.6.1 Interaction Time 75
    4.6.2 Access Time 76
    4.6.3 Interaction Errors 77
    4.6.4 User Satisfaction 77
    4.6.5 User Comments 78
  4.7 Discussions 78
  4.8 Summary 79
5 Wearable One-hand Gesture Input and Scalable Projected Interface 81
  5.1 Introduction 81
  5.2 Overview of Camera-projector Interaction Techniques 82
    5.2.1 Scalable Projected Interfaces 83
      5.2.1.1 Reference-cell and Scalability Threshold 83
      5.2.1.2 Process from Application Tasks to Scalable Interfaces 85
    5.2.2 Hover Input Technique 86
  5.3 Implementation 87
    5.3.1 Wearable Configuration 87
    5.3.2 Recognition of Hover Gesture 88
    5.3.3 Auto-calibration of Projector and Camera Coordinates 89
    5.3.4 Depth Sensing via ARToolKit Tags 90
  5.4 Research Team Interaction System (RTIS) 91
    5.4.1 RTIS Scenario 91
    5.4.2 Scalable Interface Creation Process 92
  5.5 User Study 95
    5.5.1 Questions for Hover Gesture 95
    5.5.2 Questions for Scalable Interface 96
    5.5.3 Participants and Procedure 96
  5.6 Results and Findings 98
    5.6.1 Results on Hover Gesture 98
    5.6.2 Results on Scalable Interface 99
      5.6.2.1 Interaction Time 99
      5.6.2.2 User Preference 99
      5.6.2.3 User Comments on the Situations 100
  5.7 Discussions 102
  5.8 Summary 103
6 Wearable Interaction Using Hand Gestures 105
  6.1 Introduction 105
  6.2 Interaction Techniques 106
    6.2.1 Pinch-gesture-based Interaction 106
    6.2.2 Hand-gesture-based Interaction 108
    6.2.3 Items in Projected Interface 109
  6.3 Wearable Configuration and Implementation 109
    6.3.1 Head-worn Configuration 109
    6.3.2 Auto-calibration of Projector and Camera Coordinates 110
    6.3.3 Recognition of Hover Gesture and Pinch Gesture 111
    6.3.4 Recognition of Fist-palm Gesture 113
  6.4 Prototype of Research Team Interaction System (RTIS) 114
  6.5 User Study 115
    6.5.1 Participants 117
    6.5.2 Procedure 117
    6.5.3 Variables 118
    6.5.4 Errors 118
  6.6 Study Results 119
    6.6.1 Interaction Time 119
    6.6.2 Average Interaction Time 120
    6.6.3 Task Completion Time 120
    6.6.4 Errors 121
    6.6.5 User Satisfaction and Preference 123
  6.7 Discussions 127
  6.8 Summary 129
7 Physical-digital Mixed Interface 131
  7.1 Introduction 131
  7.2 Basic Continuum
