Novel system mimics focus activity of the human eye

Shown: Range of focus for various eye refractive conditions. Myopia shifts the focal range closer to the eye, causing poor distance vision. Hyperopia shifts the focal range farther away (the relaxed eye focuses "beyond infinity"), causing poor near vision and lens fatigue, since the crystalline lens cannot fully relax even when looking at infinity. Presbyopia narrows the focal range and moves the nearest plane of focus away from the eye. Credit: Nitish Padmanaban/Stanford

With aging comes deteriorating vision. At SIGGRAPH 2018, attendees will have the chance to test a new computational system that effectively mimics the way the human eye naturally adjusts its focus, specifically when viewing nearby rather than distant objects.

This inability to focus clearly on nearby objects is called presbyopia, and nearly everyone experiences it to some degree as they age. A team of Stanford University researchers has developed a system that automatically compensates for this lost focusing ability. Dubbed Autofocals, the system externally mimics the eye's natural accommodation response by combining data from eye trackers and a depth sensor and then automatically driving focus-tunable lenses. While a variety of vision correction options address this problem, most solutions to date fall short of restoring the natural vision quality users enjoyed in their youth.
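The core control idea is straightforward in outline: estimate the distance at which the wearer is fixating, convert that distance into the lens power needed to focus there, and drive the focus-tunable lenses to that power. The sketch below is a minimal illustration of that loop based only on the description above; the lens interface and function names are hypothetical stand-ins, not the Stanford prototype's actual API.

```python
# Minimal, illustrative sketch of autofocal-style focusing (hypothetical interfaces,
# not the actual Autofocals code): map an estimated fixation distance to the lens
# power, in diopters (1/meters), required to bring that distance into focus.

def distance_to_diopters(distance_m: float, min_distance_m: float = 0.1) -> float:
    """Lens power needed to focus at a given distance, clamped at a near limit."""
    return 1.0 / max(distance_m, min_distance_m)

def update_focus(lens, fixation_distance_m: float) -> None:
    """One autofocus step: drive the tunable lens toward the estimated fixation distance."""
    lens.set_power_diopters(distance_to_diopters(fixation_distance_m))  # hypothetical lens call

# Reading at 40 cm needs about 2.5 D of focusing power; a sign at 5 m needs about 0.2 D.
print(distance_to_diopters(0.4), distance_to_diopters(5.0))
```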

"A lot of presbyopes have had time to get used to their corrections, through progressive lenses, monovision, etc., but they still spent the majority of their lives being able to refocus their eyes," says Nitish Padmanaban, lead author of the study and electrical engineering Ph.D. candidate at Stanford. "We want to restore that experience."

Padmanaban and collaborators, including Robert Konrad, Ph.D. candidate at Stanford, and Gordon Wetzstein, assistant professor of electrical engineering and of computer science at Stanford, will demonstrate Autofocals at SIGGRAPH 2018, held 12-16 August in Vancouver, British Columbia. The annual conference and exhibition showcases the world's leading professionals, academics, and creative minds at the forefront of computer graphics and interactive techniques.

Building on previous advances in automated vision correction, Autofocals' focus-tunable eyewear for presbyopia correction pairs eye trackers and a depth camera with a sensor fusion algorithm designed to make joint, effective use of both signals in one complete system.
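The article does not detail the fusion algorithm itself, but one plausible illustration of the idea is to triangulate a fixation distance from the binocular vergence of the eyes and blend it with the depth camera's reading at the gaze point, falling back to the vergence cue when the camera has no valid depth. The names and weighting below are assumptions made for illustration, not the paper's method.

```python
# Illustrative sensor-fusion sketch (assumed, not the paper's algorithm): combine a
# vergence-based fixation-depth estimate from the eye trackers with the depth-camera
# reading at the gaze point.

import math
from typing import Optional

def vergence_depth(ipd_m: float, vergence_angle_rad: float) -> float:
    """Fixation depth implied by the binocular vergence angle and interpupillary distance."""
    return ipd_m / (2.0 * math.tan(max(vergence_angle_rad, 1e-4) / 2.0))

def fuse_depth_estimates(gaze_depth_m: float,
                         camera_depth_m: Optional[float],
                         gaze_weight: float = 0.3) -> float:
    """Blend the two depth cues; fall back to gaze alone if the camera reading is missing."""
    if camera_depth_m is None or camera_depth_m <= 0.0:
        return gaze_depth_m
    return gaze_weight * gaze_depth_m + (1.0 - gaze_weight) * camera_depth_m

# Example: eyes converged by about 2.3 degrees with a 63 mm interpupillary distance,
# while the depth camera reports 1.5 m at the gaze point.
gaze_d = vergence_depth(0.063, math.radians(2.3))   # roughly 1.57 m
print(fuse_depth_estimates(gaze_d, 1.5))            # fused estimate, roughly 1.52 m
```

Weighting toward the camera is one reasonable choice, since vergence-based depth estimates become increasingly noisy at larger distances; the prototype's actual weighting is not described here.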

The team evaluated Autofocals on 24 users, ages 51 to 81, across a set of visual performance tasks: visual acuity (sharpness of eyesight), contrast sensitivity, and letter matching. In this introductory study, users experienced better visual acuity at near distances than with monovision or progressive lenses, without sacrificing 20/20 visual acuity at any distance. Monovision uses one contact lens that corrects only distance vision in one eye and another lens that corrects only near vision in the other eye; progressive lenses are popular eyewear lenses that offer focus correction at varying strengths and distances. Preliminary results also showed no sizable difference in contrast sensitivity or letter matching, and the majority of users felt the prototype worked more effectively than their own corrective lenses.

"While the technology is still in its early stages, several of the presbyopes who've tried our system genuinely wanted to have a working version for themselves," notes Padmanaban. "Vision is such an important sense for all of us that every improvement counts for a lot in terms of quality of life."

At the SIGGRAPH demo, attendees with presbyopia will begin by calibrating the eye trackers on the Autofocals device, after which they should be able to simply look around and have objects at different distances automatically pop into focus. Younger attendees can try a separate focus-tunable lens to experience firsthand some sense of the problem that Autofocals is attempting to address.

In future work, the researchers might explore improving the eye-tracking component, specifically by developing calibration-free or calibrate-on-the-fly eye tracking. Currently, the eye trackers must be recalibrated each time the device is put on or whenever the glasses slip on the wearer's face, for example. Other improvements could be made to the device itself, with respect to design aesthetics and practical use.

"One could imagine that as this technology improves," adds Padmanaban, "you could have a single pair of glasses for your entire life."
