PreSense: Interaction Techniques for Finger Sensing Input Devices
Jun Rekimoto, Takaaki Ishizawa, Carsten Schwesig, Haruo Oba

Interaction Laboratory, Sony Computer Science Laboratories, Inc.
3-14-13 Higashigotanda, Shinagawa-ku, Tokyo 141-0022 Japan

Graduate School of Media and Governance, Keio University
Shonan Fujisawa Campus, 5322 Endo, Fujisawa 252-8520 Japan

Phone: +81 3 5448 4380, Fax: +81 3 5448 4273
[email protected], {rekimoto,oba}@csl.sony.co.jp
http://www.csl.sony.co.jp/person/rekimoto.html

ABSTRACT
Although graphical user interfaces started as imitations of the physical world, many interaction techniques have since been invented that are not available in the real world. This paper focuses on one of these, “previewing”, and on how a sensory enhanced input device called the “PreSense Keypad” can provide a preview for users before they actually execute commands. Preview is important in the real world because it is often not possible to undo an action. This previewable feature helps users to see what will occur next. It is also helpful when the command assignment of the keypad changes dynamically, such as for universal commanders. We present several interaction techniques based on this input device, including menu and map browsing systems and a text input system. We also discuss finger gesture recognition for the PreSense Keypad.

Figure 1: PreSense Keypad senses finger contact (proximity) and position before the key press.

KEYWORDS: input devices, previewable user interfaces, gesture sensing

INTRODUCTION
As mobile devices become more common, designing effective interaction techniques for these devices becomes more important. Unlike desktop computers, input devices for mobile computers are often restricted by their limited size, and thus multiple functions must be assigned to the same buttons. This factor makes user interfaces complicated and unpredictable.

This paper proposes a new input device for mobile devices, called PreSense. PreSense is a keypad that is enhanced by touch (or proximity) sensors based on capacitive sensing (Figure 1). Since each keytop behaves as an independent touch sensor, finger motions on the keypad are recognized by integrating each keytop's information.

This sensor and keytop combination makes it possible to recognize “which button is about to be pressed”, and the system can provide appropriate information to users. This feature becomes effective when users are uncertain about what will occur as a result of a key press (i.e., command execution). We call this the “previewable user interface” because users can see the effect of a command in advance. Users can quickly browse multiple pieces of preview information by sliding their fingers (or thumbs) over the keypad.
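The paper gives no code, but the idea of integrating per-keytop readings to decide which button is about to be pressed can be illustrated with a minimal sketch. Everything below (the 3x4 phone-style layout, the normalized threshold, and the read_capacitance() helper) is an assumption for illustration, not part of the PreSense implementation.

# Sketch (not from the paper): picking the key a finger hovers over
# from per-keytop capacitive readings. The layout, threshold, and
# read_capacitance() are hypothetical.

KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]

TOUCH_THRESHOLD = 0.3   # assumed normalized capacitance level for "touch"

def read_capacitance():
    """Hypothetical driver call: one normalized reading per keytop."""
    raise NotImplementedError

def key_about_to_be_pressed(readings):
    """Integrate per-keytop readings: the strongest reading above the
    threshold is taken as the key the finger is resting on."""
    best_index, best_value = max(enumerate(readings), key=lambda kv: kv[1])
    if best_value < TOUCH_THRESHOLD:
        return None                  # no finger near the keypad
    return KEYS[best_index]

def touched_keys(readings):
    """Keys currently touched; counting them allows simple
    multi-finger inputs, as the paper suggests."""
    return [KEYS[i] for i, v in enumerate(readings) if v >= TOUCH_THRESHOLD]

A finer-grained finger position could presumably also be estimated, for example as a weighted centroid of neighbouring keytop readings, which is one way the sliding-browsing behaviour described above could be tracked.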
The other potential benefit of the PreSense input device is gesture recognition. By estimating finger motion, or by counting the number of finger contacts, it is possible to recognize several command inputs other than normal typing.

This paper presents various kinds of interaction techniques using the PreSense Keypad. We first discuss the “preview” user interface features and then look at several concrete examples, followed by a discussion of the sensor architecture.

RELATED WORK
Our work in this paper is an extension of our previous SmartPad system [11]. This paper further enhances the concept and realizes several application examples. It also introduces different sensor designs.

Hinckley et al. proposed “touch-sensing user interfaces” [4] and developed touch-sensing input devices such as the touch-mouse. Our PreSense keypad is also a kind of touch-sensing input device, and it also senses finger motion on the device's surface by using multiple touch sensors instead of a single sensor that covers the entire device. ThumbSense also senses finger contact on the touch-pad and uses this information to automatically change operation modes [10]. For example, when a user touches the touch-pad, some of the keyboard keys act as mouse buttons, so the user can smoothly shift between text input and touch-pad (mouse) operation modes without removing their hands from the keyboard's home position.

The layered touch panel [16] is a touch-panel enhanced by an infrared-grid sensor. It can be used as a previewable input device because it can detect finger position before the finger makes contact with the touch panel. The pointing-keyboard [17] integrates a keyboard and an infrared-grid sensor. Our sensor is based on capacitive sensing, and the entire sensor can be embedded within a normal keypad without requiring projected sensors. This feature is suitable for integration with mobile devices.

Behind touch [5] is an input method for mobile devices that uses a keypad installed on the back of the device. Users manipulate it with their fingers, and their finger position appears on the front screen.

Finally, our work was also inspired by other sensory enhanced mobile device research [8, 2, 3, 12]. While these devices use additional sensors as new (separate) input modes, we are more interested in enhancing existing input devices, such as a keypad, with sensors.

Figure 3: Three input states of PreSense: the “touch” state can be used to provide preview information before command execution. Finger movement without pressing the buttons can be treated as gesture commands. (Diagram: State 1, out of range; State 2, touch; State 3, pressed; touch/lift transitions between States 1 and 2, press/release transitions between States 2 and 3, and finger movement sensed while touching.)
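Figure 3 suggests a simple way to structure the event handling: preview while a key is merely touched, execute only when it is pressed. The following sketch is one possible reading of that state model; the event names and callbacks are assumptions, not taken from the paper.

# Illustrative sketch of the three input states of Figure 3.
# Event names and callbacks are assumed, not the paper's API.

OUT_OF_RANGE, TOUCH, PRESSED = "out of range", "touch", "pressed"

class PreSenseKeypad:
    def __init__(self, show_preview, hide_preview, execute):
        self.state = OUT_OF_RANGE
        self.show_preview = show_preview   # proactive feedback on touch
        self.hide_preview = hide_preview   # clean up on lift
        self.execute = execute             # command execution on press

    def handle(self, event, key):
        if self.state == OUT_OF_RANGE and event == "touch":
            self.state = TOUCH
            self.show_preview(key)         # preview before any command runs
        elif self.state == TOUCH and event == "move":
            self.show_preview(key)         # sliding updates the preview
        elif self.state == TOUCH and event == "lift":
            self.state = OUT_OF_RANGE
            self.hide_preview(key)
        elif self.state == TOUCH and event == "press":
            self.state = PRESSED
            self.execute(key)
        elif self.state == PRESSED and event == "release":
            self.state = TOUCH

# Example: resting a thumb on key "5" shows its preview; pressing executes it.
pad = PreSenseKeypad(show_preview=lambda k: print("preview", k),
                     hide_preview=lambda k: print("hide", k),
                     execute=lambda k: print("execute", k))
pad.handle("touch", "5")
pad.handle("press", "5")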
PREVIEWABLE USER INTERFACES
As Shneiderman clarified [14], the essence of direct manipulation user interfaces can be summarized as the following three features:

- Continuous representation of the object of interest.
- Physical actions or labeled button presses instead of complex syntax.
- Rapid incremental reversible operations whose impact on the object of interest is immediately visible.

The third of these defines user interface feedback. More precisely, there are two types of feedback. One is information delivered as a result of command execution; Sellen et al. call this type reactive feedback [13]. The other type is feedback information given prior to command execution. This is either called proactive feedback [13] or, more popularly, preview [15].

Table 1: Feedback styles (based on Sellen et al.) and PreSense.

    Sensor modality         Visual / Auditory / Kinesthetic
    Timing of delivery *1   Reactive / Proactive (previewable)
    Duration *2             Transient / Sustained
    Monitoring *3           Demanding / Avoidable
    Mode owner *4           User-maintained / System-maintained

    *1: Does feedback occur only when an action is executed?
    *2: Is the feedback sustained throughout a particular mode?
    *3: Can the user choose not to monitor the feedback?
    *4: Does the user actively maintain the mode?

A simple example of preview information is a command tooltip (Figure 2). When the cursor stops at one of the tool button icons, more detailed information than the simple button label pops up on the screen. Users know which button must be pressed without executing all the buttons. The tooltip information automatically disappears after the cursor goes away.

Figure 2: Preview information example: a tool tip automatically pops up when the cursor stays over an icon.

Although (reactive) feedback and preview are complementary concepts and can be used in combination, “preview” has several features that are not available with normal feedback, as demonstrated in this simple tooltip example. These features can be summarized as follows:

Transparent operations: In many cases, interacting with preview information, such as tooltips, is transparent. It does not force users to change their operation. Experienced users can simply bypass previews and directly perform commands, while the same system can guide novice users.

Rapid inspection of possibilities: When users do not understand the command results well, one possible solution is to execute one of them and undo it if the result does not match their expectations (and then try another). However, this approach is often ineffective because of the cognitive overload of repeating “do-undo” operations, and also because not all commands are undoable.

Preview information assists users in selecting the appropriate command before actually executing it. It can be either real (e.g., the result of a font style change can be previewed in a popup window) or modeled (e.g., showing a print image before sending a document to the printer). This “undo-less” feature is particularly beneficial when users' tasks affect real world situations (e.g., printing out a document), because the results of such executions are often difficult or impossible to cancel.

Efficient use of screen space: Techniques such as tooltips can be regarded as an effective way of using (limited) screen space, because guide information is selectively presented according to the user's current focus. If all the information appeared on the screen at once, it would become too complicated and would also consume a large amount of screen area. When users are dealing with mobile devices, this “context sensitive information presentation” strategy becomes very important because the screen size is more restrictive than in desktop computing.

Figure 4: Map navigation with the PreSense Keypad. The area corresponding to the user's thumb position becomes slightly larger than other areas, and the property information also becomes visible. Pressing the key zooms up the corresponding area.

Previewable Interfaces with Physical Keypads
For these reasons, “preview” techniques effectively help users in various GUI applications [15]. However, when dealing with physical input devices, such as the keypad of a cellular phone, “preview” has not been effectively provided. One reason for this is that while the mouse (and electromagnetically traced stylus devices, such as those made by

When a user first puts his/her finger on the keypad, preview information is provided on the screen. The information can be modified according to the finger position on the keypad.
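As a rough illustration of this behaviour, the map browser of Figure 4 can be read as: touching a key previews the corresponding map region (enlarging it and showing its property information), while pressing the key executes the zoom. All names below (REGIONS, highlight_region, show_properties, zoom_to) are hypothetical; the paper does not specify an API.

# Rough sketch of the Figure 4 map-browsing behaviour, under assumed names.

REGIONS = {key: f"map tile {key}" for key in "123456789*0#"}  # one tile per key

def highlight_region(region):   # enlarge the tile under the thumb slightly
    print(f"highlight {region}")

def show_properties(region):    # preview: property information becomes visible
    print(f"properties of {region}")

def zoom_to(region):            # command execution: zoom up the tile
    print(f"zoom into {region}")

def on_key_touched(key):
    """Proactive feedback while the thumb merely rests on a key."""
    region = REGIONS[key]
    highlight_region(region)
    show_properties(region)

def on_key_pressed(key):
    """The actual command, executed only when the key is pressed."""
    zoom_to(REGIONS[key])

# Example: sliding the thumb from key "5" to "6", then pressing "6".
on_key_touched("5")
on_key_touched("6")
on_key_pressed("6")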