An Investigation of Mid-Air Gesture Interaction for Older Adults
Arthur Theil Cabreira

This thesis is submitted for the degree of Doctor of Philosophy
Cybernetics Programme – University of Reading
June 2019

Declaration

I confirm that this is my own work and that the use of all material from other sources has been properly and fully acknowledged.

Arthur Theil Cabreira
Reading, UK
June 2019

Abstract

Older adults (60+) face a natural and gradual decline in cognitive, sensory and motor functions, which is often the reason for the difficulties older users encounter when interacting with computers. For that reason, the investigation and design of age-inclusive input methods for computer interaction is much needed and relevant in the context of an ageing population. Advances in motion sensing technology and mid-air gesture interaction have reinvented how individuals can interact with computer interfaces, and this input modality is often deemed more "natural" and "intuitive" than purely traditional input devices such as the mouse. Although explored in gaming and entertainment, the suitability of mid-air gesture interaction for older users in particular is still little understood. The purpose of this research is to investigate the potential of mid-air gesture interaction to facilitate computer use for older users, and to address the challenges that older adults may face when interacting with gestures in mid-air. This doctoral research is presented as a collection of papers that, together, develop the topic of ageing and computer interaction through mid-air gestures. The starting point for this research was to establish how older users differ from younger users and to focus on the challenges faced by older adults when interacting through mid-air gestures. Once these challenges were identified, this work aimed to explore a series of usability challenges and opportunities to further develop age-inclusive interfaces based on mid-air gesture interaction.
Through a series of empirical studies, this research intends to provide recommendations for designing mid-air gesture interaction that better take into consideration the needs and skills of the older population, and it aims to contribute to the advancement of age-friendly interfaces.

Acknowledgements

I thank all the volunteers for their participation throughout the research, and the University of Reading's Hugh Sinclair Unit and Psychology Department for their help with recruitment. I would also like to thank my supervisor, Dr. Faustina Hwang, for her guidance, support and patience during the course of my doctoral research. The research is funded by the Brazilian Science Without Borders Programme, an initiative under the Ministry of Education and the CAPES Foundation (project number BEX 13037-13-7).

Table of Contents

1 Introduction
  1.1 Motivation
  1.2 Research Questions
  1.3 Methods and Materials
  1.4 Thesis Structure
  References
2 General Literature Review
  2.1 Population Ageing
  2.2 Older Adults and Computers
  2.3 Gestures
  2.4 Gesture Recognition Technologies
  2.5 Research Agenda
  References
3 Investigating Age-Related Differences in How Novice Users Perform and Perceive Mid-Air Gestures as an Input Method for Computer Interaction
  Abstract
  1 Introduction
  2 Related Work
    2.1 Older adults, gestures and motion control
      2.1.1 Touchscreen gestures and older users
      2.1.2 Motion control for TV and gaming
    2.2 Age-related declines and their potential challenges for mid-air gesture interaction
      2.2.1 Motor Decline and Fatigue
      2.2.2 Cognitive Decline, Learnability and Memorability of Gestures
      2.2.3 Feedback, Sensory Decline and (lack of) Haptics
  3 Methods
    3.1 Selecting the gesture vocabulary
    3.2 Participants
    3.3 Apparatus
    3.4 Procedure
      3.4.1 Part 1: Guessability Study
      3.4.2 Part 2: Task-based Study
    3.5 Analysis
      3.5.1 Part 1: Guessability Study
      3.5.2 Part 2: Task-based Study
  4 Results – Part 1 (Guessability Study)
    4.1 Agreement Scores
    4.2 Perceived Easiness
    4.3 Guessability vs Perceived Easiness
  5 Results – Part 2 (Task-Based Study)
    5.1 Task completion times
    5.2 Number of attempts per task
    5.3 Help needed per task
    5.4 User Experience and Subjective Evaluation
      5.4.1 Perceived easiness of each task
      5.4.2 Experience ratings
      5.4.3 Positive aspects of mid-air gesture interaction
      5.4.4 Negative aspects of mid-air gesture interaction
    5.5 Observational findings
      5.5.1 Presence of upper limb fatigue
      5.5.2 Understanding the sensor's field of view and range of motion
      5.5.3 Learnability
      5.5.4 Gesture recognition accuracy and consistency
    5.6 Technology Acceptance and Intent-to-Use
  6 Discussion
    6.1 Part 1 – Guessability of mid-air gestures
    6.2 Part 2 – Performance and acceptance of mid-air gestures in a task-based context
  7 Design Implications
    7.1 Designing Suitable Gesture Sets
    7.2 Physical Interactions and Ergonomics
    7.3 User Interface Design
  8 Limitations and Future Work
  9 Conclusion
  References
4 Text or Image? Investigating the Effects of Instruction Type on Mid-Air Gesture Making with Novice Older Adults
  1 Introduction
  2 Background
    2.1 Older adults and unfamiliar interfaces
    2.2 Theoretical background of pictorial interfaces
    2.3 Empirical studies and pictorial interfaces
    2.4 Learnability of gestures for novice users
  3 Methods
  4 Results
  5 Discussion
  6 Future Work
  7 Conclusion
5 Movement Characteristics and Effects of GUI Design on How Older Adults Swipe in Mid-Air
  1 Introduction and Background
  2 Experiment
  3 Results
  4 Discussion and Conclusion
6 Evaluating the Effects of Feedback Type on Older Adults' Performance in Mid-Air Pointing and Target Selection
  1 Introduction
  2 Related Work
  3 Experiment
  4 Results
7 Investigating the Suitability and Effectiveness of Mid-Air Gestures Co-Designed By and For Older Adults
  1 Introduction
  2 Related Work
  3 Study 1 – Co-Designing Mid-Air Gestures for Microinteractions with Older Adults
  4 Study 2 – Comparing the Suitability and Effectiveness of Co-Designed Mid-Air Gesture Input with Standard Interaction Methods
  5 Discussion
  6 Conclusion
8 General Discussion and Conclusions
  8.1 Summary of Findings
  8.2 Research Questions Revisited
  8.3 General Guidelines for Designing Age-Inclusive Gesture Interaction
  8.4 Theoretical and Practical Implications
  8.5 Summary of Contributions
  8.6 Limitations and Future Work
Full List of References

List of Figures

Chapter 1
Figure 1 Thesis structure and list of studies.

Chapter 2
Figure 1 The Microsoft Kinect.
Figure 2 The Leap Motion.
Figure 3 Myo Armband and Microsoft HoloLens.

Chapter 3
Figure 1 Study settings. Participants interacting with the Leap Motion sensor.
Figure 2 Graphic representation of the most recurrent gesture forms elicited by 42 participants for each of the 15 mid-air gesture referents.
Figure 3 Group means of perceived easiness ratings for each mid-air gesture in part 1 of the study (5 = Very easy, 1 = Very difficult).
Figure 4 Group means of perceived easiness ratings for each task in part 2 of the study (5 = Very easy, 1 = Very difficult).
Figure 5 Number of participants responding either "Strongly agree" or "Agree".
Figure 6 Number of participants responding either "Strongly agree" or "Agree".

Chapter 4
Figure 1 Examples of on-screen instruction types for the "finger rotation" gesture: (a) descriptive (text-based), (b) pictorial (static), and (c) pictorial (animated).
Figure 2 Study design diagram: gesture list and counterbalancing system.
Figures 3a–3c Participant making a "swipe" gesture based on a descriptive (text-based) instruction [top], a "finger rotation" gesture based on a pictorial (static) instruction [middle], and a "grab" gesture based on a pictorial (animated) instruction [bottom].
Figure 4 Average percentage of mid-air gestures made correctly at first attempt for each of the three instruction types. Error bars indicate standard errors.
Figure 5 Average time in milliseconds taken to make gestures correctly for each of the three instruction types. Error bars indicate standard errors.
Figure 6 Average perceived easiness ratings for each of the three instruction types (1 = Very difficult; 5 = Very easy).
Figure 7 Number of participants rating each instruction type as (1) most preferred, (2) neither most nor least preferred, or (3) least preferred for making mid-air gestures.

Chapter 5
Figure 1 Carousel navigation and experimental set-up.
Figure 2 Swipe start and end positions (left to right) in mid-air for (a) large, (b) medium, and (c) small on-screen carousel-style menus. Arrows represent an average of 10 swipes per participant. Units are in mm within the Leap Motion's coordinate frame.

Chapter 6
Figure 1 Point-to-select interaction paradigm example.
Figure 2 Participant pointing in mid-air and selecting an on-screen target by making a pinch gesture while receiving haptic feedback through a wearable wristband.
Figure 3 Target selection and feedback type scheme.
Figure 4 Average time (ms) for selecting on-screen targets (n = 20) by making a pinch gesture in mid-air. Error bars represent S.E.
Figure 5 Average number of gesture attempts participants had to make to select a target successfully.