To Aid The Blind, An Assist From Cameras
Note: this was originally published elsewhere in 2014.
In two labs some 50 miles apart in Israel, computer scientists and engineers are refining devices that employ tiny cameras as translators of sorts. For both teams, the goal is to give blind people a form of sight — or at least an experience analogous to sight.
At Bar-Ilan University near Tel Aviv, where Zeev Zalevsky is head of the electro-optics program, these efforts have taken shape in the form of a smart contact lens. The device begins with a camera mounted on a pair of glasses; the contact lens, Dr. Zalevsky explained, is embedded with electrodes that reproduce, directly on the cornea, an image of what is before the camera. The image would be experienced in one of two ways: If an apple is placed before the camera, it could be “seen” either as the contour of an apple or as a Braille-like shape that a trained user would recognize as a representation of an apple.
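Neither lab has released code for its system, but the core idea of reducing a camera frame to a coarse pattern of raised points can be sketched in a few lines of Python. Everything in the sketch below, the grid size, the brightness threshold, the function names, is an assumption made for illustration, not a detail of Dr. Zalevsky’s device.

```python
# Purely illustrative: reduce a grayscale camera frame to a coarse grid of
# on/off "dots," the kind of Braille-like pattern a trained wearer might
# learn to read. Grid size and threshold are invented for this sketch and
# are not details of the Bar-Ilan hardware.
def to_tactile_grid(frame, rows=6, cols=4, threshold=0.5):
    """frame: 2-D list of brightness values in [0, 1], at least rows x cols."""
    h, w = len(frame), len(frame[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the block of pixels that falls inside this grid cell.
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            mean = sum(frame[y][x] for y in ys for x in xs) / (len(ys) * len(xs))
            row.append(mean > threshold)  # True = "raised" dot
        grid.append(row)
    return grid

# Demo on a fake 12x8 frame with a bright blob in the middle.
frame = [[1.0 if 3 <= y <= 8 and 2 <= x <= 5 else 0.0 for x in range(8)]
         for y in range(12)]
for row in to_tactile_grid(frame):
    print("".join("o" if dot else "." for dot in row))
```

Run as written, the demo prints a 6-by-4 grid in which the bright blob at the center of the fake frame becomes a small block of raised dots, roughly the way an apple in front of the camera might become a shape a trained wearer learns to recognize.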
Yevgeny Beiderman, a graduate student who worked with Dr. Zalevsky in testing the prototype, said: “The first time, the usage of the glasses feels strange. It takes at least a few attempts to start using it.”
The image captured by Dr. Zalevsky’s device is 110 by 110 pixels, hardly photograph-quality resolution, but Dr. Zalevsky said by email that the camera captures a sequence of images over time, and the compressed and encoded result “is enough to allow functionality to the blind person (for example: Braille contains only six points and is enough for reading.)” Six on-or-off dots, after all, yield 64 distinct patterns, more than enough to encode an alphabet.
Dr. Zalevsky is awaiting permission from a hospital to test the electrode lens on people, so in the meantime he has conducted preliminary trials using lenses that apply air pressure to the cornea instead. He has also conducted tests in which participants identified various shapes based on electrical stimulation of the tongue, after the same sort of training that would let someone wearing his lens “see” an apple as a Braille-like pattern.
Because the electrodes are clear and the lenses are standard soft polymers, Dr. Zalevsky said, the lens should be virtually unnoticeable to the wearer. “Soft contact lenses you put in in the morning and take out in the evening, if all is O.K.,” he said. “I guess the same will be true here.”
Dr. Jason Moss, an ophthalmologist at SUNY Downstate Medical Center in Brooklyn who was not involved in the research, said he found the idea of a smart contact lens interesting, but expressed caution about the limitations of resolution and the effects of the lens rubbing on the eye. “The cornea is a very delicate structure that is very susceptible to injury,” he said.
Such concerns are certain to arise as technology-equipped contact lenses find a broader audience. Google recently filed a patent for a camera to be built into a smart contact lens, including a proposed feature in which data gathered by the lens could indicate, via a voice warning, whether a crosswalk is safe to cross, for example.
A separate technique meant to assist the blind, developed at the Hebrew University of Jerusalem, spares the cornea entirely, instead recruiting the ears as a substitute sense. The device, like Dr. Zalevsky’s, incorporates a camera mounted on a pair of glasses, but in this case the images are translated into sound in a free iPhone app.
Known as EyeMusic, the app is the work of Professor Amir Amedi, who demonstrated at a TEDx Talk in 2012 that, with sounds assigned to specific images, people with visual impairments could “see” blue squares, lean over and grab shoes on the floor, identify individuals by the sounds corresponding to their eyes, noses and eyebrows, and tell whether someone was smiling or frowning.
EyeMusic works by assigning distinctive sounds to certain visual properties, and like Dr. Zalevsky’s lens, it requires practice to use. At its simplest, if a dog walked through your field of vision — low to the ground, moving in a straight line — the app would create a low, steady sound. A bird flying in a straight line through your field of vision would create a high, steady sound. A baseball rising up into the air — creating a diagonal line — would create a rising sound.
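Those examples suggest the general shape of such an encoding: height in the frame maps to pitch, and horizontal position maps to time. The Python sketch below implements that idea under stated assumptions; the frequency range, sweep duration, and function names are inventions for illustration, not EyeMusic’s actual algorithm.

```python
# Illustrative sketch of the mapping described above, not EyeMusic's actual
# encoding: an object's height in the frame picks the pitch, and its
# horizontal position picks when the tone sounds during a left-to-right sweep.
LOW_HZ, HIGH_HZ = 220.0, 1760.0  # assumed three-octave pitch range
SCAN_SECONDS = 2.0               # assumed time to sweep one frame

def tone_for(x, y):
    """x, y in [0, 1], with y = 0 at the bottom of the frame.
    Returns (onset_time_s, frequency_hz) for one point on an object."""
    # Interpolate pitch geometrically so equal steps in height sound like
    # equal musical intervals.
    freq = LOW_HZ * (HIGH_HZ / LOW_HZ) ** y
    return x * SCAN_SECONDS, freq

# A dog crossing low and level: steady low tones across the sweep.
dog_path = [(x / 10, 0.1) for x in range(11)]
# A baseball rising on a diagonal: tones that climb as the sweep advances.
ball_path = [(x / 10, x / 10) for x in range(11)]

for label, path in (("dog", dog_path), ("ball", ball_path)):
    print(label, [round(tone_for(x, y)[1]) for x, y in path])
```

Run as-is, the dog’s tones hover around 270 Hz for the whole sweep while the ball’s climb from 220 Hz to 1,760 Hz, mirroring the low steady sound and the rising sound described above.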
But firsthand use of the app is slightly more complicated than that. The program bases its audio output on a kind of infrared approximation of what’s before the camera, and when the scene is complex, the output can sound like someone dragging both hands across an organ. Listen to it often enough, though, to footage of someone walking down the street, say, and you begin to notice subtle shifts and variations, pockets of sound changing as the image changes.
“One user blind from birth used the EyeMusic to look at the sky and was astounded by the shape and presence of clouds,” Dr. Amedi said by email. “He thought the sky would be blue and didn’t understand initially what are the white things that disrupt the image. Then he learned these are clouds — that create rain — and loved it.”
Dr. Amedi said he is working on a new iteration of the app that will add games to teach geometry and math, and he is pursuing several other sensory substitution projects to benefit people with impaired vision.
Devices like the EyeMusic app and Dr. Zalevsky’s smart contact lens have the potential to open up possibilities for people like Tanja Milojevic, a graduate student at the University of Massachusetts-Boston who is studying to be a Braille teacher.
Ms. Milojevic, 24, who developed glaucoma at age 5, has 20/800 sight within a narrow field of vision. She has worked as a teaching assistant, and said in a telephone interview that among those she has worked with, “the biggest reaction would be from students losing their vision quickly or are in the process of losing it, because they’re so used to the visual world.”
“With people losing their vision,” she continued, “they want to be in the sighted world as much as they can for as long as they can.”