SignAloud Glove, Google and Waverly Labs all look to make communication easier.
According to the 2010 U.S. Census, nearly one in five Americans has a disability. And according to Steven Camarota, director of research for the Center for Immigration Studies, one in five Americans doesn't speak English at home, and fewer than half of those who don't speak English at home speak it proficiently. Innovators are increasingly looking for ways to help these groups communicate, and often the solutions they find benefit everyone, not just the target audience.
Navid Azodi and Thomas Pryor, two undergraduates at the University of Washington, were awarded a $10,000 Lemelson-MIT prize for their work on the SignAloud Glove. The SignAloud Glove, still in its early stages, translates American Sign Language (ASL) to English, transmitting data via Bluetooth. Each of the glove's fingers includes specially embedded sensors that measure precise movements. Azodi was inspired to make the product because he didn't speak for the first seven years of his life and wanted to find a way to help others communicate. Anyone can benefit from using the SignAloud Glove because it gives immediate feedback to students learning to sign, and the pair eventually plans to have it translate ASL into other languages.
Those struggling to communicate with people in other languages will soon be able to via Waverly Labs' Pilot, a new type of wearable technology that translates languages through an earpiece. The device comes with an additional earpiece for streaming music and an app that lets users toggle between languages. Perfect for those struggling to communicate abroad — or with a colleague in their own office — the device keeps guesswork at bay and limits the amount of communication lost in translation.
Last year, Google launched the Google Impact Challenge to help those with disabilities. The company worked with nonprofits to raise money and awareness and to identify solutions for those living with disabilities. Beyond its work with other organizations, Google has worked hard to make Android completely usable by voice, and has worked with developers to create accessibility guidelines for contrast, helping those with sight problems (and everyone squinting at their phone in the sun). Next on the company's agenda: collecting more microdata to help those with sight problems navigate better indoors (think locating stores in a mall rather than buildings on a street) and non-language audio processing (such as recognizing laughter, sirens and the sound of rain for the hearing-impaired).
—Learn More: http://www.washington.edu/news/2016/04/12/uw-undergraduate-team-wins-10000-lemelson-mit-student-prize-for-gloves-that-translate-sign-language/