Accommodation-invariant Computational Near-eye Displays
ROBERT KONRAD, Stanford University
NITISH PADMANABAN, Stanford University
KEENAN MOLNER, Stanford University
EMILY A. COOPER, Dartmouth College
GORDON WETZSTEIN, Stanford University
Fig. 1. Conventional near-eye displays present a user with images that are perceived to be in focus at only one distance (top left, 1 m). If the eye accommodates to a different distance, the image is blurred (top left, 0.3 m and ∞). A point spread function (PSF) illustrates the blur introduced for a single point of light at each distance (insets). The fact that conventional near-eye displays have a single sharp focus distance can be problematic, because it produces focus cues that are inconsistent with a natural 3D environment. We propose a computational display system that uses PSF engineering to create a visual stimulus that does not change with the eye's accommodation distance (bottom left). This accommodation-invariant display mode tailors depth-invariant PSFs to near-eye display applications, allowing the eye to accommodate to arbitrary distances without changes in image sharpness. To assess the proposed display mode, we build a benchtop prototype near-eye display that allows for stereoscopic image presentation (right). An autorefractor is integrated into the prototype to validate the accommodation-invariant display principle with human subjects.
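The intuition behind a depth-invariant PSF can be illustrated with a toy geometric-optics simulation. The sketch below is not the authors' implementation: it assumes a simplified reduced-eye model in which the angular diameter of the defocus blur disk is approximately the pupil diameter times the defocus in diopters, uses a focal sweep (one known way to engineer depth invariance) as the stand-in display mechanism, and all constants (pupil size, sweep range, blur floor) are illustrative choices, not values from the paper. It compares a conventional fixed-focus display, whose PSF changes strongly with accommodation, against a time-averaged swept-focus PSF, which stays nearly constant.

```python
import numpy as np

# Hypothetical parameters (illustrative, not from the paper).
PUPIL_M = 4e-3        # 4 mm pupil diameter
MIN_BLUR_RAD = 2.9e-4 # ~1 arcmin blur floor standing in for diffraction/aberrations

def psf_profile(defocus_diopters, radii):
    """Radial cross-section of a uniform blur disk for a given defocus.

    Angular blur-disk diameter ~ pupil diameter * |defocus| (small-angle
    geometric optics); intensity is constant inside the disk.
    """
    r = max(PUPIL_M * abs(defocus_diopters) / 2.0, MIN_BLUR_RAD / 2.0)
    return (radii <= r).astype(float) / (np.pi * r**2)

def swept_psf(accommodation_d, sweep_diopters, radii, n=200):
    """Time-averaged PSF for an eye accommodated at `accommodation_d` (in
    diopters) while the display's focal power sweeps over `sweep_diopters`."""
    lo, hi = sweep_diopters
    acc = np.zeros_like(radii)
    for v in np.linspace(lo, hi, n):
        acc += psf_profile(v - accommodation_d, radii)
    return acc / n

def rel_diff(a, b):
    """Relative L2 difference between two PSF profiles."""
    return np.linalg.norm(a - b) / np.linalg.norm(b)

radii = np.linspace(0.0, 6e-3, 2000)  # angular radius grid, radians

# Conventional display focused at 1 m (1 D): the retinal PSF differs
# drastically between accommodation at 0.5 m (2 D) and at 1 m (1 D).
conv = [psf_profile(1.0 - a, radii) for a in (2.0, 1.0)]

# Swept-focus display covering 0-3 D: the time-averaged PSFs for the
# same two accommodation states are nearly identical.
ai = [swept_psf(a, (0.0, 3.0), radii) for a in (2.0, 1.0)]

print("conventional:", rel_diff(conv[0], conv[1]))  # large (~1)
print("swept focus: ", rel_diff(ai[0], ai[1]))      # near zero
```

The conventional PSFs are a sharp spike versus a wide disk, so their relative difference is large, while the sweep integrates the same family of blur disks for either accommodation state, so the averaged PSFs nearly coincide.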
Although emerging virtual and augmented reality (VR/AR) systems can produce highly immersive experiences, they can also cause visual discomfort, eyestrain, and nausea. One of the sources of these symptoms is a mismatch between vergence and focus cues. In current VR/AR near-eye displays, a stereoscopic image pair drives the vergence state of the human visual system to arbitrary distances, but the accommodation, or focus, state of the eyes is optically driven towards a fixed distance. In this work, we introduce a new display technology, dubbed accommodation-invariant (AI) near-eye displays, to improve the consistency of depth cues in near-eye displays. Rather than producing correct focus cues, AI displays are optically engineered to produce visual stimuli that are invariant to the accommodation state of the eye. The accommodation system can then be driven by stereoscopic cues, and the mismatch between vergence and accommodation state of the eyes is significantly reduced. We validate the principle of operation of AI displays using a prototype display that allows for the accommodation state of users to be measured while they view visual stimuli using multiple different display modes.

CCS Concepts: • Computing methodologies → Perception; Virtual reality; Mixed / augmented reality; Image processing;

Additional Key Words and Phrases: vergence–accommodation conflict, computational displays

ACM Reference format:
Robert Konrad, Nitish Padmanaban, Keenan Molner, Emily A. Cooper, and Gordon Wetzstein. 2017. Accommodation-invariant Computational Near-eye Displays. ACM Trans. Graph. 36, 4, Article 88 (July 2017), 12 pages. https://doi.org/10.1145/3072959.3073594

1 INTRODUCTION AND MOTIVATION
Emerging virtual and augmented reality (VR/AR) systems offer unprecedented user experiences. Applications of these systems include entertainment, education, collaborative work, training, telesurgery, and basic vision research. In all of these applications, a near-eye display is the primary interface between the user and the digital

© 2017 Copyright held by the owner/author(s). Publication rights licensed to Association for Computing Machinery.
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Transactions on Graphics, https://doi.org/10.1145/3072959.3073594.
ACM Transactions on Graphics, Vol. 36, No. 4, Article 88. Publication date: July 2017.