Motion-Based Interaction for Head-Mounted Displays
Thesis submitted in accordance with the requirements of the University of Liverpool for the degree of Doctor in Philosophy

By Wenge Xu

May 2021

PGR Declaration of Academic Honesty

NAME (Print): Wenge Xu
STUDENT NUMBER: 201324739
SCHOOL/INSTITUTE: University of Liverpool
TITLE OF WORK: Motion-based Interaction for Head-Mounted Displays

This form should be completed by the student and appended to any piece of work that is submitted for examination. Submission by the student of the form by electronic means constitutes their confirmation of the terms of the declaration. Students should familiarise themselves with Appendix 4 of the PGR Code of Practice: PGR Policy on Plagiarism and Dishonest Use of Data, which provides the definitions of academic malpractice and the policies and procedures that apply to the investigation of alleged incidents. Students found to have committed academic malpractice will receive penalties in accordance with the Policy, which in the most severe cases might include termination of studies.

STUDENT DECLARATION

I confirm that:
• I have read and understood the University’s PGR Policy on Plagiarism and Dishonest Use of Data.
• I have acted honestly, ethically and professionally in conduct leading to assessment for the programme of study.
• I have not copied material from another source nor committed plagiarism nor fabricated, falsified or embellished data when completing the attached material.
• I have not copied material from another source, nor colluded with any other student in the preparation and production of this material.
• If an allegation of suspected academic malpractice is made, I give permission to the University to use source-matching software to ensure that the submitted material is all my own work.

SIGNATURE:
DATE:

Abstract

Recent advances in affordable sensing technologies have enabled motion-based interaction (MbI) for head-mounted displays (HMDs). Unlike traditional input devices such as the mouse and keyboard, which often offer comparatively limited interaction possibilities (e.g., single-touch interaction), MbI does not have these constraints and is more natural because it reflects more closely how people do things in real life. However, several issues exist in MbI for HMDs due to the technical limitations of sensing and tracking devices, the higher degrees of freedom afforded to users, and the limited research in the area resulting from the rapid advancement of HMDs and tracking technologies.

This thesis first outlines four core challenges in the design space of MbI for HMDs: (1) boundary awareness for hand-based interaction, (2) efficient hands-free head-based interfaces for HMDs, (3) efficient and feasible full-body interaction for general tasks with HMDs, and (4) accessible full-body interaction for applications in HMDs. It then presents investigations that address these challenges. The first challenge is addressed by providing visual feedback during interaction tailored for such technologies. The second challenge is addressed by using a circular layout with a go-and-hit selection style for head-based interaction, with text entry as the scenario.
In addition, this thesis explores further interaction mechanisms that leverage the affordances of these techniques; in doing so, and to address the third challenge, we propose directional full-body motions as an approach to performing general tasks with HMDs. The last challenge is addressed by (1) exploring the differences between performing full-body interaction for HMDs and for common displays (i.e., TVs) and (2) providing a set of design guidelines that are specific to current and future HMDs.

The results of this thesis show that: (1) visual methods for boundary awareness can help with mid-air hand-based interaction in HMDs; (2) head-based interaction and interfaces that take advantage of MbI, such as a circular interface, can provide a very efficient and low-error hands-free input method for HMDs; (3) directional full-body interaction can be a feasible and efficient interaction approach for general tasks involving HMDs; and (4) full-body interaction for applications in HMDs should be designed differently than for traditional displays. In addition to these results, this thesis provides a set of design recommendations and takeaway messages for MbI for HMDs.

Acknowledgements

First and foremost, I would like to thank my supervisor Dr. Hai-Ning Liang for his support, encouragement, and remarkable patience. I never expected to learn so much as a PhD student, and I could not have asked for a better person to work with. Without his guidance and constant feedback, this thesis would not have been possible.

Secondly, I sincerely extend my gratitude to my supervisory team (i.e., Prof. Yong Yue, Dr. Bing Wu Berberich, and Mr. Phil Jimmieson) and advisory members (i.e., Dr. Terry Pane, Dr. Xing Huang, and Dr. Jie Zhang) for their feedback and insightful comments on my work.

Thirdly, I would like to thank everybody at the X-CHI lab for being great friends, colleagues, and collaborators. Lei, Rongkai, Yiming, and Mengqi, I wish you an enjoyable PhD journey. Special thanks to my PhD colleague Diego Monteiro: thank you for everything in the past three years; I cannot wait to see you again in the UK.

Fourthly, I am very grateful and owe a lot to my family. I would like to thank my parents, Yunming Xu and Haiyan Tang, for their support, care, and love, and my brother Junkai Xu for having fun with me in my leisure time and holidays. Finally, special thanks to my fiancée Xiaolin Zhao for understanding my choice of doing a PhD and for her encouragement and support throughout my PhD journey.

Lastly, thank you very much to everyone who helped me during my PhD studies.

Publications and Author’s Contribution Statements

Materials from this dissertation have been previously published in the journals and conference proceedings listed below. The publication from which each chapter is derived is noted in parentheses.

1. Wenge Xu et al., “Pointing and Selection Methods for Text Entry in Augmented Reality Head Mounted Displays,” in Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 2019, pp. 279-288, doi: 10.1109/ISMAR.2019.00026. (Chapter 3)

2. Wenge Xu et al., “Exploring Visual Techniques for Boundary Awareness During Interaction in Augmented Reality Head-Mounted Displays,” in Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 2020, pp. 204-211, doi: 10.1109/VR46266.2020.00039. (Chapter 5)
3. Wenge Xu et al., “RingText: Dwell-free and Hands-free Text Entry for Mobile Head-Mounted Displays Using Head Motions,” in IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 5, pp. 1991-2001, May 2019, doi: 10.1109/TVCG.2019.2898736. (Chapter 6)

4. Wenge Xu et al., “DMove: Directional Motion-based Interaction for Augmented Reality Head-Mounted Displays,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 2019, paper 444, pp. 1-14, doi: 10.1145/3290605.3300674. (Chapter 7)

5. Wenge Xu et al., “Assessing the Effects of a Full-body Motion-based Exergame in Virtual Reality,” in Proceedings of the Seventh International Symposium of Chinese CHI, Xiamen, China, 2019, pp. 1-6, doi: 10.1145/3332169.3333574. (Chapter 8)

6. Wenge Xu et al., “Studying the Effect of Display Type and Viewing Perspective on User Experience in Virtual Reality Exergames,” in Games for Health Journal, 2020, ahead of print, doi: 10.1089/g4h.2019.0102. (Chapter 9)

Materials that contribute to this dissertation but are not used as a chapter:

1. Wenge Xu et al., “Directional Motion-based Interfaces for Virtual and Augmented Reality Head-mounted Displays,” in Proceedings of the 2018 International Computers, Signals and Systems Conference (ICOMSSC), Dalian, China, 2018, pp. 1-5, doi: 10.1109/ICOMSSC45026.2018.8942021.

2. Difeng Yu et al., “PizzaText: Text Entry for Virtual Reality Systems Using Dual Thumbsticks,” in IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 11, pp. 2927-2935, Nov. 2018, doi: 10.1109/TVCG.2018.2868581.

3. Difeng Yu et al., “DepthMove: Leveraging Head Motions in the Depth Dimension to Interact with Virtual Reality Head-Worn Displays,” in Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 2019, pp. 103-114, doi: 10.1109/ISMAR.2019.00-20.

4. Wenge Xu et al., “Health Benefits of Digital Videogames for the Aging Population: A Systematic Review,” in Games for Health Journal, 2020, ahead of print, doi: 10.1089/g4h.2019.0130.

5. Wenge Xu et al., “Results and Guidelines From a Repeated-Measures Design Experiment Comparing Standing and Seated Full-Body Gesture-Based Immersive Virtual Reality Exergames: Within-Subjects Evaluation,” in JMIR Serious Games, vol. 8, no. 3, p. e17972, 2020, doi: 10.2196/17972.

6. Wenge Xu et al., “VirusBoxing: A HIIT-based VR Boxing Game,” in Extended Abstracts of the 2020 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’20 EA), Virtual Event, Canada, 2020, pp. 1-5, doi: 10.1145/3383668.3419958.

7. Xueshi Lu et al., “Exploration of Hands-free Text Entry Techniques For Virtual Reality,” in Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Virtual, Brazil, 2020, pp. 1-6, doi: 10.1109/ISMAR50242.2020.00061.

8. Wenge Xu et al., “Effect of Gameplay