Apps are linking visually impaired people to sighted volunteers as assistive technology enters a new era of connectivity
“Connected to other part,” my iPhone says to me as I stand somewhere in London’s Soho, trying to decipher the letter on the top of a bus stop.
“Hello?” says an American woman, reminding me of Scarlett Johansson’s disembodied artificially intelligent character from the sci-fi film Her.
“Hey, er … can you give me a hand by reading the letter on the bus stop?” I ask.
“Sure … can you move your phone a bit more up, and to the left … Ya! It says … F.”
Result. I thank her, end the session, pull up Citymapper and navigate my way onto the 453 going to New Cross.
I have a little vision, but only enough to perceive movement.
I am using Be My Eyes, an app that connects blind and visually impaired people to sighted volunteers via a remote video connection. Through the phone’s camera, the blind person can show the sighted volunteer what they are looking at in the real world, allowing the volunteer to assist with any vision-related problem.
I began to lose my sight in the summer of 2013 to a rare genetic mitochondrial disease called Leber’s hereditary optic neuropathy and was soon registered blind. I consequently found myself relying on an assortment of assistive technologies to do the simplest of tasks.
Be My Eyes has just over 35,000 visually impaired users registered for the app and more than half a million volunteers. Whenever a visually impaired user requests assistance, a sighted volunteer receives a notification and a video connection is established.
Its benefits are obvious. Jose Ranola, a 55-year-old from the Philippines who works in construction and has retinitis pigmentosa, said: “I use it to help me identify medicine and read printed materials and also to describe places and objects.” He adds: “All my experiences were good. The volunteers were very helpful.”
James Frank, a 49-year-old counsellor in Minnesota, US, who has severely damaged optic nerves, is also a fan. “The response has been favourable and the volunteers are always polite,” he says. “The longest I have waited is maybe a minute.”
Brenda Smith, 51, who lives in Brisbane, Australia, has the same condition as I do. She says she uses Be My Eyes for day-to-day tasks such as reading instructions on food and telling apart the white bread her son eats from the brown bread she does. She also used it recently to work out which switch had tripped in her electricity box.
In the UK more than 2 million people have some form of sight loss, and an estimated 285 million people worldwide are registered blind or visually impaired. Technology has long played a role in improving their lives. In the mid-1970s Ray Kurzweil, a pioneer in optical character recognition (OCR) – software that can recognise printed text – founded Kurzweil Computer Products and developed omni-font OCR, the first program able to recognise any kind of print style. He went on to make the Kurzweil Reading Machine, the first ever print-to-speech reading machine for blind people.
Now there is a new boom in the field of accessibility, driven in part by smartphones and high-speed connectivity. Screen readers have developed to such an extent that braille is no longer necessarily taught to people who lose their sight later in life.
Companies are constantly finding new ways to improve accessibility, and Be My Eyes isn’t the only assistive technology company taking advantage of the real-time human element, building products around dialogue with their users.
In May, the startup Aira, the first product out of AT&T’s Foundry for Connected Health, raised $12m in funding from venture capital firms including Jazz Venture Partners and Lux Capital. Aira’s platform takes advantage of pre-existing wearable smart glasses, such as Google Glass, using the mounted camera. Where Aira differs from Be My Eyes is that it employs remote human agents through the gig economy, connecting trained, paid, independent contractors with blind people to assist them in day-to-day tasks in real time. The glasses stream everything the user is seeing to an agent who, sitting in front of a dashboard, can help with everything from reading signs to shopping, navigating and the numerous other mundane tasks that sighted people take for granted. Through the glasses, the agent can talk to the user and give them detailed information about their surroundings. The hope is that, through machine learning, the agents’ work will eventually teach an AI to guide users through certain tasks. For now, the service is only available in the United States.
Earlier this year, Aira helped Erich Manser, who has retinitis pigmentosa, run the Boston marathon. Through the glasses, Aira’s agent, Jessica, was able to give him all the information he needed about his surroundings. The two had been working together since Jessica first became an Aira agent the previous summer. By developing code words and short commands, Jessica, with the assistance of a sighted guide, was able to direct Erich past obstacles, along specific routes and safely across the finish line. It was Manser’s eighth Boston marathon, but his first with the assistive technology.
It’s not just in linking sighted people with visually impaired users that technology is able to help. The Sunu band, partially funded through Indiegogo, aims to improve people’s ability to perceive their surroundings. Based in Boston and Mexico, Sunu is a technology startup creating a bracelet that uses ultrasonic sonar to detect the user’s surroundings and send haptic feedback whenever an obstacle comes into proximity. The ultrasonic waves emitted from the band’s transducer bounce off obstacles and are translated into vibrations that become more frequent the closer the user gets to the obstacle.
The next generation of tech advancements can go even further to help blind people. Autonomous vehicles, if built with the kind of intuitive, voice-enabled AI assistants – such as Amazon’s Alexa or Apple’s Siri – that are already helping in the home, will give blind people increased independence. It is just a matter of making these solutions integral to the design of the vehicles.
Smith tells me: “It just blows me away to the extent that gadgets have grown. I was so terrified when I got my first mobile phone, can’t even remember when it was, it was so long ago. Maybe 15 or 16 years. No speech though, had to use it by memory and hope for the best that you were turning it on and off correctly. And there was no way of texting. Then when Nokias came on the scene, then the iPhone, just unbelievable.” She adds: “It’s honestly fantastic some of the things that have been developed – although there is always room for improvement and advancement.”
Frank feels similarly: “I think it is all great. Compared to where we were 30 years ago there is no comparison. If there is any good time to be blind, it is now, because of all of the advancements there have been with technology.”
It’s not just for the blind. Autonomous vehicles will have the capability to revolutionise access and liberate people who have limited mobility, while assistive technologies are being developed for all kinds of other impairments. From the stair-climbing Scewo wheelchair, to grip-adjusting bionic arms, technology is offering the biggest leaps forward in accessibility for years and has the ability to significantly improve the lives of so many.
- Some names have been changed.
Source: https://www.theguardian.com