Ultra-Low Power Gaze Tracking for Virtual Reality

Tianxing Li, Qiang Liu, and Xia Zhou
Department of Computer Science, Dartmouth College, Hanover, NH
{tianxing,qliu,xia}@cs.dartmouth.edu

ABSTRACT

Tracking the user's eye fixation direction is crucial to virtual reality (VR): it eases the user's interaction with the virtual scene and enables intelligent rendering to improve the user's visual experience and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light-absorption property. The core of LiGaze is to deal with screen light dynamics and extract the changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791 µW in total and thus can be completely powered by a credit-card-sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.

Figure 1: LiGaze integrated into a FOVE VR headset. LiGaze uses only passive photodiodes on a light-sensing unit to track user gaze. It can be powered by a credit-card-sized solar cell (atop the headset) harvesting energy from indoor lighting.

CCS CONCEPTS

• Human-centered computing → Ubiquitous and mobile devices; Virtual reality; • Computer systems organization → Sensors and actuators;

KEYWORDS

Gaze tracking, virtual reality, visible light sensing

ACM Reference format:
Tianxing Li, Qiang Liu, and Xia Zhou. 2017. Ultra-Low Power Gaze Tracking for Virtual Reality. In Proceedings of SenSys '17, Delft, Netherlands, November 6–8, 2017, 14 pages. DOI: 10.1145/3131672.3131682

1 INTRODUCTION

Virtual reality (VR) is emerging as a promising next computing platform. Taking the form of head-mounted displays (HMDs), modern VR headsets provide an immersive, realistic simulation of the 3D physical world and are poised to transform how we interact, entertain, and learn. With the advent of affordable, accessible VR headsets (e.g., Google Daydream, Cardboard, Samsung Gear VR), VR is gaining popularity and is projected to be a multi-billion-dollar market by 2025 [2].

Our work in this paper focuses on a feature crucial to VR: gaze tracking, i.e., determining the user's eye fixation direction. Not only does gaze tracking allow users to interact with content just by glances, it can also greatly improve the user's visual experience, reduce VR sickness, and save system (display) energy. The energy saving can be achieved by foveated rendering [23, 52], which progressively reduces image detail outside the eye fixation region. Such energy saving is particularly beneficial for mobile VR headsets without external power cords.

The dominant methodology for gaze tracking relies on (infrared) cameras, often accompanied by infrared (IR) LEDs to illuminate the eyes. The combination of IR LEDs and cameras raises concerns about energy consumption and form factor. Recent hardware advances have led to modular eye-tracker designs that can be integrated into mobile and wearable devices. These commercial eye trackers (e.g., Tobii, SMI), however, are still expensive (e.g., $10K for integrating a Tobii eye tracker into a VR headset) and power-hungry (requiring external battery packs). As a result, most VR headsets today shy away from integrating gaze tracking and resort to head direction as a coarse, often incorrect, estimate of the user's gaze.
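To make the foveated-rendering idea concrete, the toy sketch below maps a pixel ray's angular distance from the gaze direction to a rendering level of detail. The tier widths and the helper names (`angle_between_deg`, `foveated_lod`) are illustrative assumptions, not values or APIs from the paper or from [23, 52].

```python
import math

def angle_between_deg(v1, v2):
    """Angle in degrees between two 3D unit vectors (gaze vs. pixel ray)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def foveated_lod(gaze_vec, pixel_ray):
    """Pick a level of detail: 0 = full resolution near the fovea,
    coarser levels in the periphery. Tier widths are made-up numbers."""
    angle = angle_between_deg(gaze_vec, pixel_ray)
    if angle < 5.0:       # foveal region: render at full detail
        return 0
    elif angle < 15.0:    # near periphery: reduced detail
        return 1
    else:                 # far periphery: coarsest detail
        return 2
```

Because detail (and hence shading work) drops with eccentricity, the display pipeline does less work for most of the frame, which is where the energy saving comes from.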
In this paper, we seek a low-power, low-cost approach that can lower the barrier to integrating gaze tracking into various VR headsets, including those for mobile VR. We consider a minimalist sensing approach that eliminates active IR LEDs and replaces cameras with a few low-cost ($2 each), small (e.g., 2 mm²) photodiodes. Exploiting the fact that VR screen light is the sole and constant light source within the space of a VR headset, we reuse screen light to track user gaze. Specifically, we place a few (e.g., 16) photodiodes at the boundary of each VR lens (Figure 1), so that each photodiode senses screen light reflected by the user's eyeball in a certain direction. Given that the pupil is a hole in the center of the iris and absorbs incoming light, when screen light strikes the pupil region, most light rays are absorbed from the photodiode's point of view, weakening the reflected light perceived by photodiodes in that direction. Thus, pupil movement affects the spatial pattern of changes in reflected screen light. This pattern can be exploited to infer gaze direction.
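The sketch below illustrates this absorption principle under simplifying assumptions; it is a heuristic, not the LiGaze algorithm. It assumes per-photodiode incident-light readings are available to normalize away screen-content brightness, and that a baseline captured with a centered pupil is known; all names and array shapes are hypothetical.

```python
import numpy as np

def estimate_gaze_offset(reflected, incident, baseline, angles_rad):
    """Coarse 2D gaze offset from the lens center (illustrative only).

    All four arguments are length-N arrays: back-facing (reflected) and
    front-facing (incident) photodiode readings, a centered-pupil baseline
    of normalized reflections, and each diode's angle around the lens.
    """
    # Normalize by incident light so screen-content brightness cancels.
    normalized = reflected / np.maximum(incident, 1e-9)
    # The pupil absorbs light, so diodes facing it see the largest drop
    # relative to the centered-pupil baseline.
    drop = np.clip(baseline - normalized, 0.0, None)
    if drop.sum() == 0.0:
        return np.zeros(2)  # no measurable pupil movement
    weights = drop / drop.sum()
    # Weighted centroid of diode directions approximates the pupil shift.
    directions = np.stack([np.cos(angles_rad), np.sin(angles_rad)], axis=1)
    return weights @ directions

# Example: 16 diodes evenly spaced around the lens boundary.
angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
```

This centroid heuristic assumes the screen light is stable; the paragraphs that follow explain why that assumption breaks down in VR and why a learned model is needed instead.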
Although the pupil's light absorption has been exploited by prior methods [19, 42], we face unique challenges in VR, mainly because of the uncontrolled nature of our light source (screen light). VR content is colorful and dynamic, resulting in screen light, and hence reflected screen light, that is spatially nonuniform and time-varying. Such inherent variations make it hard to identify the changes in reflected light caused by pupil movement. Furthermore, user diversity presents additional challenges in inferring gaze: our experiments show that eye and skin characteristics all affect the actual impact of the pupil's light absorption on the sensed reflected light, so the gaze tracking algorithm needs to be customized for individual users. Finally, blinks can interfere with sensing reflected light and tracking gaze.

To address these challenges, we design LiGaze. The core of LiGaze is to estimate the reflected-light changes associated with screen content and then identify the changes caused by pupil movement. To do so, we design a circuit board with photodiodes embedded back to back on both sides: one side faces the screen, sensing incoming screen light in different directions, while the other faces the user's eyeball, sensing reflected screen light. We characterize the relationship between the front and back sensors' readings assuming a centered pupil. This characterization is then used to derive features related to the changes of reflected light caused by pupil movement. We apply supervised learning (boosted trees [12, 21]) to learn the correlation between our features and the 3D gaze vector, and infer gaze vectors on the fly using regression (see the training sketch after Section 2.1). To handle user diversity, we calibrate our characterization and apply bootstrap aggregating [10] to fine-tune the trained model for a specific user. Furthermore, LiGaze detects blinks in a parallel process. Given that a blink causes similar […]

Recent works have studied the tradeoff between performance and energy consumption of gaze tracking in the context of wearable devices. Our work follows the spirit of these works [40, 41] and yet differs in that we focus on the VR scenario and exploit the VR setup to further drive down the energy consumption. The simplicity and ultra-low power of our approach can help bring gaze tracking to a wide range of VR headsets and better unleash VR's potential.

2 LIGAZE: RATIONALE AND CHALLENGES

Tailored to wearable VR, LiGaze reuses screen light inside a VR headset and senses the screen light reflected by the eyes. Next, we first describe the inner structure of modern VR headsets; we then introduce LiGaze's design rationale and challenges.

2.1 Inner Structure of VR Headsets

Modern VR headsets are classified into two types based on their head-mounted displays (HMDs): 1) tethered HMDs (e.g., HTC Vive, Oculus Rift), with displays built into the headset and connected to powerful desktop servers for rendering, and 2) mobile HMDs, which reuse the screens of mobile phones slotted into the VR headset (e.g., Google Cardboard, Daydream, Samsung Gear VR). Tethered HMDs offer better visual quality, thanks to the computation power of external servers allowing more advanced rendering; however, they are constrained in mobility by their tethered cords. Mobile VR, on the other hand, is self-contained and offers full mobility, yet suffers from relatively lower visual quality and limited battery life.

Despite the differences in HMDs, VR headsets share a similar inner structure: with a display/screen in the front, screen light passes through a pair of lenses (36 – 43 mm in diameter) positioned very closely (1 – 2 cm) to the eyes. The lenses divide the screen content into two slightly different 2D images tailored to the left and right eye. By angling the 2D images, the pair of lenses helps create a 3D virtual scene perceived by the user.
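As referenced in the introduction, here is a minimal sketch of how photodiode-derived features could be regressed to a 3D gaze vector with boosted trees, mirroring the supervised-learning step described there. It uses scikit-learn for brevity; the feature layout, array sizes, and random stand-in data are assumptions, not the paper's implementation or training set.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

# Stand-in training data: one row of photodiode-derived features per
# sample, labeled with 3D gaze vectors from a reference eye tracker.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 16))   # e.g., 16 reflected-light features
y_train = rng.normal(size=(1000, 3))    # 3D gaze vectors (x, y, z)

# A boosted-tree regressor predicts a single value, so wrap one per axis
# to produce a full 3D gaze vector.
model = MultiOutputRegressor(
    GradientBoostingRegressor(n_estimators=100, max_depth=3))
model.fit(X_train, y_train)

gaze_vector = model.predict(X_train[:1])[0]   # one 3D gaze estimate
```

For the per-user fine-tuning step, an ensemble along the lines of scikit-learn's BaggingRegressor trained on short calibration sessions could play the role of bootstrap aggregating, though the paper's exact personalization scheme is not reproduced here.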
