Since January 2014, a few dozen people in the United States have had something partially restored to them that they thought they had lost forever: the power of sight. Fitted with a revolutionary retinal prosthesis system, these blind individuals can now see a doorway across a room or the shadowy outline of a loved one.
Their “sight” was returned via a cutting-edge prosthesis system called Argus II, which includes not just a surgical implant in the eye but also wraparound glasses fitted with a tiny video camera that “sees” the outside world. A holster unit worn by the patient interprets the video and sends signals back to the retinal implant. The image is not pristine: the 60-pixel grayscale view is akin to watching a black-and-white TV with the action obscured by screen snow. And while the system is life-changing, it is also somewhat cumbersome, and the visual quality varies from patient to patient.
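To get a feel for what a 60-pixel image involves, here is a minimal sketch of reducing a camera frame to a coarse grid of stimulation intensities by block averaging. It assumes a 6×10 electrode layout (matching the Argus II's 60-electrode array); the function name and the processing itself are illustrative, not Second Sight's actual, proprietary video processing.

```python
import numpy as np

def frame_to_electrode_grid(frame, rows=6, cols=10):
    """Downsample a grayscale camera frame to a coarse electrode grid
    by block-averaging. Illustrative only -- not the real Argus II
    video processing pipeline."""
    h, w = frame.shape
    # Crop so the frame divides evenly into rows x cols blocks.
    frame = frame[: h - h % rows, : w - w % cols]
    bh, bw = frame.shape[0] // rows, frame.shape[1] // cols
    # Average each block down to a single intensity value.
    grid = frame.reshape(rows, bh, cols, bw).mean(axis=(1, 3))
    return grid  # rows x cols array of per-electrode intensities

# Example: a 480x640 synthetic frame with a bright "doorway" on the right.
frame = np.zeros((480, 640))
frame[:, 480:] = 255.0
grid = frame_to_electrode_grid(frame)
print(grid.shape)  # (6, 10)
```

Even in this toy version, the bright doorway survives the reduction to 60 values, which is roughly why patients can locate a doorway but not read fine detail.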
Now the Johns Hopkins University Applied Physics Laboratory is working on a next-generation retinal prosthesis system, with the help of a $4 million grant from the Mann Fund and in collaboration with Second Sight Medical Products, a Sylmar, California–based company that develops, manufactures, and markets the Argus II for people who are blind. APL will develop glasses that use real-time computer vision and eye-tracking sensors, allowing the wearer to better identify features of the environment such as doorways, hallways, and household objects. The information will be distilled into a format that can be projected into the retinal prosthesis, bypassing the damaged rods and cones in the retina. While not approaching 20/20 vision, the system could allow a grandfather to recognize the face of a granddaughter across the room, for example. “With the current system there’s a camera situated on the head and you have to move your head, rather than just your eyes, to scan a scene,” says Scott Dunbar of Second Sight. “But APL’s work with eye tracking will allow a more natural scanning of the scene. That’s something people have been working on since the 1980s, but nobody has got it to work quite right.”
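One way to picture what eye tracking adds: instead of feeding the implant whatever the head-mounted camera points at, the system can select a region of the wide camera frame centered on where the eyes are looking. The sketch below is a hypothetical illustration of that idea, with made-up window sizes and normalized gaze coordinates, not APL's design.

```python
def gaze_window(frame_w, frame_h, gaze_x, gaze_y, win_w=160, win_h=120):
    """Pick a crop window inside the camera frame centered on the gaze
    point (gaze_x, gaze_y given as normalized 0..1 coordinates), clamped
    so the window stays within the frame. Hypothetical sketch of
    gaze-following region selection."""
    cx = int(gaze_x * frame_w)
    cy = int(gaze_y * frame_h)
    # Clamp the window so it never extends past the frame edges.
    left = min(max(cx - win_w // 2, 0), frame_w - win_w)
    top = min(max(cy - win_h // 2, 0), frame_h - win_h)
    return left, top, win_w, win_h

# Looking toward the right edge: the window clamps at the frame boundary.
print(gaze_window(640, 480, 0.95, 0.5))  # (480, 180, 160, 120)
```

With a scheme like this, glancing around moves the cropped view even while the head stays still, which is the "more natural scanning" Dunbar describes.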
The new retinal prosthesis will overlap with the vision system in Harmonie, APL’s existing prosthetic limb control prototype. The Hybrid Augmented Reality Multimodal Operation Neural Integration Environment uses off-the-shelf components such as Microsoft Kinect (the motion-sensing input device for the Xbox) to detect a graspable object in the scene, and then directs the prosthetic arm to move toward it. “Our approach is to leverage this work to allow blind people to better perform basic visual navigations and tasks like following a crosswalk, finding a doorway, and reading using 3-D sensing technologies,” says APL project manager Kapil Katyal.
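A toy example of the kind of 3-D sensing a Kinect-style depth camera makes possible: given a depth image, pick out the pixels belonging to the nearest surface. This is a simplified stand-in for Harmonie's object detection, with invented names and thresholds, shown only to convey the idea of segmenting by depth rather than by color.

```python
import numpy as np

def nearest_object_mask(depth, margin_mm=100):
    """From a depth image (values in mm; 0 = no reading), mask the
    pixels belonging to the nearest surface. A toy stand-in for the
    kind of 3-D segmentation a Kinect-style sensor enables."""
    valid = depth > 0
    nearest = depth[valid].min()
    # Keep everything within margin_mm of the closest reading.
    return valid & (depth <= nearest + margin_mm)

# Toy depth map: background at 3000 mm, a cup-sized object at 800 mm.
depth = np.full((120, 160), 3000)
depth[50:70, 70:90] = 800
mask = nearest_object_mask(depth)
print(mask.sum())  # 400 pixels belong to the near object
```

Because depth separates the object from the background directly, a segmentation like this works even when lighting or contrast would defeat an ordinary camera, which is what makes it useful both for guiding a prosthetic arm and for flagging obstacles to a blind wearer.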
The ultimate goal is to create a more advanced video data system housed in a smaller, more aesthetically pleasing pair of glasses. Katyal says that the developers hope to have hardware ready for market in three and a half years.
Illustration by Carlo Giambarresi; portrait by Caroline Andrieu