
WatchOut: Extending Interactions on a Smartwatch with Inertial Sensing

Cheng Zhang, Georgia Institute of Technology, [email protected]
Junrui Yang, Peking University, [email protected]
Caleb Southern, Georgia Institute of Technology, [email protected]
Thad E. Starner, Georgia Institute of Technology, [email protected]
Gregory D. Abowd, Georgia Institute of Technology, [email protected]

This is the author version of the paper.

ABSTRACT
Current interactions on a smartwatch are generally limited to a tiny touchscreen, physical buttons or knobs, and speech. We present WatchOut, a suite of interaction techniques that includes three families of tap and swipe gestures which extend input modalities to the watch's case, bezel, and band. We describe the implementation of a user-independent gesture recognition pipeline based on data from the watch's embedded inertial sensors. In a study with 12 participants using both a round- and a square-screen watch, the average gesture classification accuracies ranged from 88.7% to 99.4%. We demonstrate applications of this richer interaction capability, and discuss the strengths, limitations, and future potential of this work.

[Figure 1: WatchOut Interaction Families. (a) SideTap; (b) BezelButtons; (c) BandSwipe]

Author Keywords
Smartwatch; mobile interactions; inertial sensing; machine learning

ACM Classification Keywords
H.5.m. Information Interfaces and Presentation: User interfaces; Input devices and strategies

INTRODUCTION
The richness of touchscreen interactions on a smartphone cannot be easily replicated on a smartwatch. The relatively tiny screen exaggerates issues that already exist with smartphone interactions, such as the fat-finger problem and occlusion [21]. There is thus even greater motivation to explore interaction techniques away from the touchscreen for these wrist-mounted devices.

We present WatchOut, which uses inertial sensors to provide a variety of tapping and sliding gestures on the side, bezel, and band of a smartwatch. WatchOut offers the following research contributions:

• The design and implementation of three gesture families that extend smartwatch interactions to the side, bezel, and band of the watch.

• An evaluation of the interaction performance with user-independent machine learning models in a laboratory environment.

• A demonstration of applications of WatchOut, and the practical challenges to improving and deploying the WatchOut interactions in real-world scenarios.

RELATED WORK

Novel interactions on mobile phones
Smartphones and wearable devices share many common challenges of mobile interaction design. Since Hinckley et al. first demonstrated the possibility of extending interactions on the phone in 2001 [8], much research has explored novel mobile interaction techniques by designing novel gestures on the touchscreen [6, 15] or beyond it [23, 12, 2].

Novel interactions to support wearable devices
Compared with smartphones, the input for wearable devices is even more limited. To improve the input experience on wearable devices, researchers have proposed various novel input techniques. These involve the user wearing additional devices on the hands, wrist, or arms [19, 4, 5] to detect hand gestures.

Novel interactions on smartwatches
As the smartwatch has become more popular, HCI researchers have explored how to improve the user experience in spite of the inherent limits of the small touchscreen and form factor. Approaches include increasing the size of the screen area [14], reducing the size of the touch area [21], or applying carefully designed touchscreen gestures [16, 22]. Another approach is to make other parts of the watch interactive [1, 3], including the band [18], or an additional arm band [10].
Others have considered extending the interaction area to a larger space around the device; for example, recognizing 3D gestures in the space above [11] or around [13, 17] the watch, or expanding the perception space by using a dynamic peephole [9]. Most of these solutions require either some additional sensing infrastructure on the device or a completely new device to be worn by the user. Research [15, 23, 24] has shown the possibility of detecting taps on the side and back of a phone with built-in sensors. However, the shape, the position, and the size of a watch are different from those of a smartphone. Therefore, how to design the gesture families and situate them in watch applications presents new challenges. Compared with recent work [20] that recognizes finger gestures by using built-in sensors on a smartwatch, WatchOut demonstrates a comparable variety of input gestures on the watch case, with the advantage of using only existing inertial sensing common on smartwatches today.

[Figure 2: Sensor data collected when various gestures are performed. (a) The coordinate system on the watch; (b) tap on the side of the watch; (c) tap on the watch bezel; (d) swipe on the watchband]

GESTURE DESIGN AND DETECTION
Smartwatches today commonly include two inertial sensors, an accelerometer and a gyroscope. We describe the response of these sensors to various physical stimuli and then define three families of interaction gestures based on these stimuli. Finally, we describe the data processing pipeline for gesture recognition.

Theory of Operation
In Figure 2, we show the raw gyroscope data generated by tapping on the top of the watch bezel, tapping on the side of the watch bezel (what we call the case), and swiping on the watchband. We also observed similar patterns in the accelerometer data.

In Figure 2b, we can see that a tap on the left will generate a positive spike along the x-axis of the gyroscope data, while a tap on the right will generate a negative spike. The lower-frequency and lower-intensity sensor data for an arm movement is also visually distinct from the harsh, high-frequency data from a tap gesture.

We also observed similarly clear data from taps on the top face of the watch bezel. Figure 2c shows four taps performed on each side of the watch bezel face (North, East, South, and West). Taps in the North and South will have relatively larger x-axis gyroscope readings, and taps in the East and West will have relatively larger y-axis readings. Interestingly, the readings of the y-axis gyroscope generally appear to be smaller than the readings on the other two axes. This is because the worn watch has more freedom to move along the x-axis due to the way a watch is worn.

Swipes on the watchband can also be detected through the gyroscope data (Figure 2d). Swiping up and down will tilt the watch positively and negatively along the y-axis, and swiping left and right will tilt the watch positively and negatively along the z-axis. These four gestures are visually apparent when comparing the strength of the signal for each axis and the polarity of the peak.

Gesture Families
Based on our observations of the raw sensor data from various tap and swipe gestures around the case and band of a watch, we designed three families of gestures: SideTap, BezelButtons, and BandSwipe (see Figure 1).

SideTap
The user can tap on either the left or the right side of the watch case. These interactions can be performed eyes-free, and are appropriate for performing simple acknowledge or dismiss actions, such as rejecting a phone call.

BezelButtons
A user can tap the bezel area around the outside of the screen, as Figure 1b shows. We can recognize up to eight tap locations, whose positions are equally distributed around the watch case. Intuitively, these eight tap gestures can be used for navigating directionally, as with a D-pad. Potentially, BezelButtons can also help facilitate richer menus on the watch. For instance, most watch applications can only display a limited number of menu choices (usually around three items) due to the limited screen real estate.

BandSwipe
We can turn the band of a watch into an interactive surface by recognizing four different directions (up, down, left, right) of a sliding gesture, as Figure 1c shows. These four gestures can be naturally used in applications that require directional controls. The BandSwipe gestures can also be used in combination with the touchscreen to enhance interaction.

Data Processing
To recognize a gesture, we first segment the sensor data for event detection, and then classify the gesture with pre-built machine learning models. We use the sequential minimal optimization (SMO) implementation of the support vector machine (SVM) provided by Weka [7] to build the models, which are also used for real-time event detection and classification in our interactive prototype.

To recognize the gestures, we built two SVM models. The first model identifies sensor data as one of two classes: gesture or noise. If the data is classified as a gesture event, it is passed to the second model for classifying which gesture it is.

The features are used to train two support vector machines (SVM) for gesture classification. The first classifier distinguishes the noise instances from the gesture instances. If a gesture is detected, the second classifier then identifies which gesture is being performed, and we move to the next window without overlap.

Choice of Hardware
We implemented our interaction techniques on two Android Wear smartwatches: 1) the LG G Watch Urbane; and 2) the Sony Smartwatch 3. We chose these two watches because they exhibit different physical characteristics. The LG watch has a round screen and a leather watchband; the Sony watch has a square screen and a rubber watchband. We offload most computation work to a paired Google Nexus 6 smartphone,
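To make the tap-polarity pattern described under Theory of Operation concrete, the following sketch separates left and right side taps by the sign of the dominant x-axis gyroscope peak. The function name and threshold value are illustrative assumptions, not the paper's implementation, which uses trained SVM models rather than a fixed threshold.

```python
import numpy as np

def classify_side_tap(gyro_x, threshold=2.0):
    # A left tap produces a positive x-axis gyroscope spike and a
    # right tap a negative one; lower-intensity arm motion stays
    # below the (illustrative) threshold and is treated as noise.
    peak = gyro_x[np.argmax(np.abs(gyro_x))]
    if abs(peak) < threshold:
        return "noise"
    return "tap_left" if peak > 0 else "tap_right"
```

For example, a window whose strongest x-axis sample is +3.2 rad/s would be labeled a left tap, while one peaking at -4.1 rad/s would be labeled a right tap.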
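The BandSwipe observation, that the dominant gyroscope axis and the polarity of its peak jointly determine the swipe direction, can be sketched the same way. The threshold and the sign conventions (up/left positive) follow the description above but are illustrative assumptions rather than the trained classifiers the system actually uses.

```python
import numpy as np

def classify_band_swipe(gyro_y, gyro_z, threshold=1.0):
    # Up/down swipes tilt the watch along the y-axis; left/right
    # swipes tilt it along the z-axis. The stronger axis and the
    # polarity of its peak select one of the four directions.
    py = gyro_y[np.argmax(np.abs(gyro_y))]
    pz = gyro_z[np.argmax(np.abs(gyro_z))]
    if max(abs(py), abs(pz)) < threshold:
        return "noise"
    if abs(py) >= abs(pz):
        return "swipe_up" if py > 0 else "swipe_down"
    return "swipe_left" if pz > 0 else "swipe_right"
```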
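The two-stage pipeline described under Data Processing, a noise-vs-gesture detector followed by a gesture classifier, can be sketched as follows. This is a minimal illustration in which scikit-learn's SVC stands in for the Weka SMO implementation the authors used, and the per-window features (per-axis mean, standard deviation, and absolute peak) are simplified assumptions rather than the paper's exact feature set.

```python
import numpy as np
from sklearn.svm import SVC

def window_features(window):
    # Per-axis statistics over a sensor window of shape
    # (n_samples, n_axes), e.g. gyroscope x/y/z.
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.abs(window).max(axis=0)])

class TwoStageRecognizer:
    # Stage 1 separates gesture windows from noise; stage 2 labels
    # which gesture was performed in windows that pass stage 1.
    def __init__(self):
        self.detector = SVC(kernel="linear")
        self.classifier = SVC(kernel="linear")

    def fit(self, windows, labels):
        X = np.array([window_features(w) for w in windows])
        y = np.array(labels)
        is_gesture = y != "noise"
        self.detector.fit(X, is_gesture)
        self.classifier.fit(X[is_gesture], y[is_gesture])
        return self

    def predict(self, window):
        x = window_features(window).reshape(1, -1)
        if not self.detector.predict(x)[0]:
            return "noise"
        return self.classifier.predict(x)[0]
```

Training such a recognizer on labeled windows of noise, left-tap, and right-tap data, and then calling `predict` on each incoming window, mirrors the real-time detect-then-classify flow described above.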