Gesture-Based Head-Mounted Augmented Reality Game Development Using Leap Motion and Usability Evaluation
International Conferences Interfaces and Human Computer Interaction 2021; and Game and Entertainment Technologies 2021

Pu-Hsuan Chien and Yang-Cheng Lin
Department of Industrial Design, National Cheng Kung University
No. 1, University Road, Tainan, Taiwan

ABSTRACT

The augmented reality environment created by head-mounted displays can provide players with an immersive gaming experience, and appropriate gestures can enable the integration of real-world objects with the virtual environment. This study explores the usability of gesture operations applied to games and tries to find new applications for current, mature gesture sensors. It develops a head-mounted augmented reality game based on gesture recognition and evaluates its usability and its impact on the user's interaction with the virtual interface. The game, an adventure game containing elements of a first-person shooter, was built with the Unity engine and the open-source glasses designed by Leap Motion. No joystick or touch screen is needed to control the game: the player operates it entirely by hand through a controller-less interface. The results show a system usability score of 79.4, which indicates a good usability level. This study may help clarify problems that may be encountered in the development of interactive-gesture head-mounted display augmented reality games. Finally, it offers suggestions on the challenges and future of head-mounted augmented reality games.

KEYWORDS

Natural User Interface, Gesture, Leap Motion, Head-Mounted Displays, Augmented Reality Game

1. INTRODUCTION

Games are a daily pastime for many people, and different ways of playing can affect the mental state of players (Shin, 2019).
In the last decade, human-computer interaction methods ranging from traditional keyboards to motion sensors have provided new ways of interacting. As this technology advances, games can bring players an immersive experience, and head-mounted displays (HMDs) in particular can impart one. The technology behind head-mounted virtual reality (VR) games is very mature, with several such devices available on the market. However, HMD technology is not used as much in augmented reality (AR) games: currently, most AR games are on mobile platforms and often require holding a phone to play. In recent years, research interest in AR has been increasing (Gugenheimer et al., 2019), and most research on the use of HMDs in AR games has focused either on solving technical problems (Hua, 2017; Xiao & Benko, 2016) or on improving input factors (Xu et al., 2019). In this study, a head-mounted display AR game that can be controlled with both hands was developed to bring players a new gaming experience.

There are many gesture recognition sensors available. Among them, Microsoft's Kinect and Leap Motion have attracted much related research (Cabreira & Hwang, 2015; Vokorokos et al., 2016). Leap Motion allows users to control operations with gestures and is currently widely used in sign language recognition, rehabilitation, and interactive game development (Khademi et al., 2014; Potter et al., 2013). To realize gesture recognition and AR at the same time, we use the AR HMD with open-source glasses that Leap Motion released in 2018. Most of its components are 3D printed, so its price is lower than that of other head-mounted AR glasses. This approach breaks away from the original form of operation and broadens the applications of AR, opening up more possibilities for human-computer interaction.
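As an aside on how such controller-less input works in practice: gesture sensors like Leap Motion report 3-D hand and fingertip positions every frame, and a game can derive trigger gestures from simple geometry on those points. The sketch below illustrates a pinch trigger computed from the thumb-index distance; the `Hand` type and its field names are hypothetical stand-ins for a tracker's frame data, not the actual Leap Motion SDK API.

```python
from dataclasses import dataclass
import math


@dataclass
class Hand:
    # Hypothetical per-frame tracking data: 3-D fingertip
    # positions (x, y, z) in millimeters.
    thumb_tip: tuple
    index_tip: tuple


def distance(a, b):
    # Euclidean distance between two 3-D points.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))


def is_pinching(hand: Hand, threshold_mm: float = 25.0) -> bool:
    # Treat a thumb-index distance under the threshold as a pinch,
    # a common controller-free trigger gesture.
    return distance(hand.thumb_tip, hand.index_tip) < threshold_mm


open_hand = Hand(thumb_tip=(0, 0, 0), index_tip=(60, 0, 0))
pinched = Hand(thumb_tip=(0, 0, 0), index_tip=(10, 5, 0))
```

In a real game loop, `is_pinching` would be evaluated on each tracking frame, with some hysteresis or debouncing added so that jitter near the threshold does not toggle the trigger.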
Unlike the usual Leap Motion setup, this device places the sensor on top of the head, so it can effectively recognize gestures while the user moves.

ISBN: 978-989-8704-31-3 © 2021

To study the potential of this device, we developed an AR game, considering the player's operation process and the game interface. The major purpose of this study was to assess the usability of this kind of prototype game, so participants were asked to rate its usability. Finally, we put forward some problems that may be encountered in the development of this type of game.

2. RELATED WORKS

The game development in this study involves many areas, such as the choice of gesture sensor, how gestures are used in games, and the design of the game interface. We consider that the application of gestures to game operation has not been fully studied, so the literature available to inform our game development is limited.

2.1 Gesture Sensor

To date, there have been many studies on the application of gesture sensors, and previous studies have compared the advantages and disadvantages of these sensors. Some researchers have used Leap Motion in various fields. For example, Cabreira and Hwang conducted experiments on three sensors and analyzed 250 applications, of which more than half were games. Their analysis of 15 gestures on each platform indicated that Kinect is more limited than the other devices for finger and hand tracking (Cabreira & Hwang, 2015). Leap Motion recognized a wider range of gestures. The most commonly used gestures for Myo were rotation and waving, which could be used consistently across the various programs. Vokorokos et al. (2016) also conducted experiments on three sensors, using pointing, waving, rotating, and fist gestures to determine the gesture accuracy of the devices. Many tests were conducted on the sensors of the three platforms for each movement. Kinect performed well in waving, while Leap Motion performed well in pointing, rotating, and punching.
Leap Motion was fast and intuitive, but its sensing distance was limited. Myo was also very good at rotation, with the advantage that users can lower their hands and remain comfortable; because it relies on muscle detection, motion can be detected even if the hands are covered or the device is in a pocket. Based on the above research, Kinect is very smooth for large-scale gestures such as waving, but less accurate for other gestures; its advantages are that it can detect the face and full-body skeleton and that its detection range is wider. Leap Motion's work on AR applications is far more advanced than that of other manufacturers (Katahira & Soga, 2015; Wozniak et al., 2016). Leap Motion provides open-source AR HMD glasses in which the sensor is placed on the head-mounted device, which solves the original sensing distance limitation, but no research on these glasses has yet appeared. Based on the above research and viewpoints, and because of Leap Motion's good accuracy and commercial availability, this study uses Leap Motion as the sensing device for the AR glasses. Although the sensing range is somewhat limited, mounting the Leap Motion device on top of the HMD effectively addresses this issue.

2.2 Gesture Interaction Applied to Games

Existing research on the application of gestures in games can, based on the recognition method, be broadly categorized into two types: visual recognition and motion sensor recognition.
Visual recognition primarily relies on Kinect, Leap Motion, and mobile phone cameras for gesture analysis (Pirker et al., 2017; Yeo et al., 2015), while motion sensor recognition uses six-axis sensors or electromyographic signals and usually requires the user to wear additional devices (Esfahlani et al., 2018; Lee et al., 2017).

Regarding the use of gesture interaction in games, Silpasuwanchai and Ren (2015) held that in gesture interaction games, not all operational methods suit all players. They therefore used user feedback and user elicitation to understand which actions players felt were most intuitive, and found that people experience a game differently and differ in their actions, but that players agreed on some gestures. For example, the preferred gesture for shooting was drawing a pistol with one hand.

Several studies have investigated the application of Leap Motion in the medical, entertainment, and educational fields, such as two-handed drone operation (Fernandez et al., 2016) and hand rehabilitation exercises (Alimanova et al., 2017). These studies point to the potential of Leap Motion in natural user interfaces. However, little research has been done on its application in games. Overall, we believe it has great potential in gaming and entertainment, and we use Leap Motion with an AR HMD to design and develop a game, which will help researchers study human-computer interaction in related game fields in the future.

2.3 Diegetic Interface

When only gestures are used, the interface plays a primary role in games. Previous studies have pointed out that the design of the interface affects the player's immersion (Brown & Cairns, 2004; Qin et al., 2009).
The word “diegesis” originated as a film term referring to everything within the characters' narrative world. Galloway applied this concept to games: actions within the game's narrative world are diegetic, while non-diegetic elements lie outside it. He also distinguished between the behavior of the player and that of the character, dividing player behavior into diegetic and non-diegetic elements; interface elements can then be classified according to the behavior they correspond to, such as the heads-up display (HUD). The HUD is meant mainly for the player to see, so it is an interface superimposed on the game screen (Galloway, 2006).
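For reference, the usability result quoted in the abstract (79.4) is a System Usability Scale (SUS) score, computed from ten 1–5 Likert items. The standard scoring rule can be sketched as follows; the example responses below are illustrative only, not the study's actual data.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert
    responses: odd-numbered items contribute (r - 1), even-numbered
    items contribute (5 - r); the sum is scaled by 2.5 to 0-100."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5


# Illustrative responses from one hypothetical participant:
example = [4, 2, 4, 1, 4, 2, 5, 2, 4, 2]
```

A score in the high 70s, like the 79.4 reported here, falls in the range conventionally interpreted as good usability.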