Abstract
In emerging VR/AR applications, the primary interface between the user and the digital world is the near-eye display. However, today’s VR/AR systems struggle to provide natural and comfortable viewing experiences, partly due to their fixed focal-plane designs. In this paper, we discuss gaze-contingent varifocal display designs for next-generation computational near-eye displays. Moreover, we discuss how the same technology components, eye tracking and focus-tunable optics, can help presbyopes see the real world better. As humans age, they gradually lose the ability to accommodate, or refocus, to near distances because of the stiffening of the crystalline lens. This condition, known as presbyopia, affects nearly 20% of people worldwide. Here, we discuss the design and implementation of a new presbyopia correction, autofocals, which externally mimics the natural accommodation response by combining eye-tracker and depth-sensor data to automatically drive focus-tunable lenses. Through extensive user studies, we demonstrate that autofocals improve visual acuity over conventional monovision eyeglasses and task performance over progressives.
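For illustration only, the autofocal control loop implied by the abstract can be sketched as: read the gaze point, look up scene depth around it, convert fixation distance to lens power, and drive the focus-tunable lens. The sketch below is a hypothetical Python mock-up under these assumptions, not the authors' implementation; the names `fixation_distance_m`, `lens_power_diopters`, and `set_lens_power` are invented for this example.

```python
# Hypothetical sketch of an autofocal-style control loop (not the paper's code):
# gaze point + depth map -> fixation distance -> focus-tunable lens power.
import numpy as np

def fixation_distance_m(gaze_xy, depth_map, window=5):
    """Median scene depth (meters) in a small window around the gaze point."""
    x, y = int(gaze_xy[0]), int(gaze_xy[1])
    h, w = depth_map.shape
    x0, x1 = max(0, x - window), min(w, x + window + 1)
    y0, y1 = max(0, y - window), min(h, y + window + 1)
    patch = depth_map[y0:y1, x0:x1]
    valid = patch[patch > 0]          # discard invalid (zero) depth samples
    return float(np.median(valid)) if valid.size else None

def lens_power_diopters(distance_m, add_power=0.0):
    """Focus power for a target distance: P = 1/d, plus any fixed add."""
    return 1.0 / max(distance_m, 0.1) + add_power

def control_step(gaze_xy, depth_map, set_lens_power):
    """One iteration: estimate fixation depth and command the tunable lens."""
    d = fixation_distance_m(gaze_xy, depth_map)
    if d is not None:
        set_lens_power(lens_power_diopters(d))

if __name__ == "__main__":
    # Synthetic scene 0.5 m away: the loop should request ~2 D of focus power.
    depth = np.full((480, 640), 0.5)
    control_step((320, 240), depth, lambda p: print(f"lens power: {p:.2f} D"))
```

In a real system the lens command would go to focus-tunable optics hardware and the loop would run continuously at the eye tracker's frame rate; the print callback here stands in for that actuator interface.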
| Original language | English (US) |
| --- | --- |
| Title of host publication | SID Symposium Digest of Technical Papers |
| Publisher | Wiley |
| Pages | 41-44 |
| Number of pages | 4 |
| DOIs | |
| State | Published - Sep 25 2020 |
| Externally published | Yes |
Bibliographical note
KAUST Repository Item: Exported on 2022-06-30
Acknowledgements: G.W. was supported by an NSF CAREER Award (IIS 1553333), a Sloan Fellowship, the KAUST Office of Sponsored Research through the Visual Computing Center CCF grant, and a PECASE award from the U.S. Army Research Office.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.