


Kinect Sign Language Translator Expands Communication Possibilities for the Deaf


Worldwide, an estimated 360 million people are deaf or hard of hearing. Because most hearing individuals do not understand sign language, people who are deaf often have difficulty communicating with those around them. Interpreters and written notes can bridge the gap, but researchers hope to make translation even easier with a cost-effective, efficient prototype that translates sign language into spoken language, and spoken language into sign language, in real time.

Asia Answers the Call

Dedicated researchers in China have created the Kinect Sign Language Translator, a prototype system that understands the gestures of sign language and converts them to spoken and written language, and vice versa. The system captures a conversation from both sides: the deaf person is shown signing, with a written and spoken translation rendered in real time, while the system takes the hearing person's spoken words and turns them into accurate, understandable signs.
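To picture that two-way flow, the sketch below outlines the two translation directions in Python. Every component name here (sensor, recognizer, tts, transcriber, avatar) is a hypothetical placeholder chosen for this article; it is not the project's actual code or API.

```python
# Conceptual sketch of the two translation directions described above.
# All component names are hypothetical stand-ins, not the project's real API.

def sign_to_speech(sensor, recognizer, tts):
    """Deaf-to-hearing direction: gestures -> written words -> speech."""
    trajectory = sensor.capture_gesture()       # 3D joint positions over time
    words = recognizer.recognize(trajectory)    # gestures -> written words
    tts.speak(" ".join(words))                  # written words -> spoken audio
    return words

def speech_to_sign(microphone, transcriber, avatar):
    """Hearing-to-deaf direction: speech -> written text -> signing avatar."""
    audio = microphone.record_utterance()
    text = transcriber.transcribe(audio)        # spoken words -> written text
    avatar.render_signs(text.split())           # written text -> animated signs
    return text
```

In the prototype's user interface, both directions run at once, so each side of the conversation sees the other's words in the form they understand best.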

Kinect Sign Language Translator user interface.

This project was a result of collaboration, facilitated by Microsoft Research Connections, between the Chinese Academy of Sciences, Beijing Union University, and Microsoft Research Asia, each of which made crucial contributions.

Professor Xilin Chen, deputy director of the Institute of Computing Technology at the Chinese Academy of Sciences, has spent much of the past decade studying sign language recognition, hoping to devise a way to enable signed communication between people with hearing loss and their hearing neighbors. “We knew that information technology, especially computer technology, has grown up very fast. So from my point of view, I thought this is the right time to develop some technology to help [the deaf community]. That’s the motivation,” Chen explained.

Enter Microsoft Kinect

Motivation met action when Kinect for Xbox came on the scene. Originally developed for gaming, the Kinect's sensors read a user’s body position and movements and, with the help of a computer, translate them into commands. It thus has tremendous potential for understanding the complex gestures that make up sign language and for translating the signs into spoken or written words and sentences.
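To make the gesture-understanding step concrete, here is a minimal sketch of one common way to match a captured gesture against stored sign templates: dynamic time warping (DTW) over sequences of 3D joint positions. This is a generic gesture-matching technique offered purely for illustration, not necessarily the algorithm the team used, and the data layout is an assumption.

```python
import math

def dtw_distance(traj_a, traj_b):
    """Dynamic time warping distance between two gesture trajectories.

    Each trajectory is a list of (x, y, z) hand-joint positions sampled
    over time; DTW tolerates differences in signing speed.
    """
    n, m = len(traj_a), len(traj_b)
    # cost[i][j] = best cumulative cost aligning traj_a[:i] with traj_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])   # Euclidean distance
            cost[i][j] = d + min(cost[i - 1][j],          # skip a frame in a
                                 cost[i][j - 1],          # skip a frame in b
                                 cost[i - 1][j - 1])      # match frames
    return cost[n][m]

def recognize_sign(captured, templates):
    """Return the label of the stored sign template closest to the capture."""
    return min(templates, key=lambda label: dtw_distance(captured, templates[label]))
```

In this sketch, templates would map each sign label to a previously recorded joint trajectory, and the captured trajectory would come from the sensor's skeleton stream; a full system would combine many such cues with language models to produce whole sentences.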

The November 2010 release of Kinect stirred tremendous interest in the research community. That interest intensified with the June 2011 release of the Microsoft-supported Kinect for Windows software development kit (SDK), which helped make the technology broadly available for scientific use. Microsoft Research Connections was eager to encourage the most promising uses of Kinect, but with so much fervor over Kinect in the research world, it was hard to select which projects to support.

Stewart Tansley, director of Natural User Interface at Microsoft Research Connections, turned to Microsoft Research’s worldwide labs, asking them to submit the best Kinect academic collaborations they had under consideration. Microsoft Research Asia submitted the work of Principal Researcher Ming Zhou—who was heavily involved in natural language models and translation and had forged a tight collaboration with the Chinese Academy of Sciences. The project was just what Microsoft Research Connections was looking for.

Complementing Chen’s group at the Chinese Academy of Sciences were Zhou and other senior researchers from Microsoft Research Asia, where a great deal of automated translation work was already underway, including advanced research into real-time machine translations of English to Mandarin.

Students of the special education school at Beijing Union University try out the Kinect Sign Language Translator prototype.

Also essential to this project was the participation of the special education program at Beijing Union University. “One unique contribution of this project is that it is a joint effort between software researchers and the deaf and hard of hearing,” Zhou says. “A group of teachers and students from Beijing Union University joined this project, and this enabled tests of our algorithms to be conducted on real-world data.”

Testing the System

Among the student participants was Dandan Yin, a dynamic, accomplished young woman who is deaf. An especially proficient and graceful signer, Yin told the research team that working on this project was the fulfillment of her childhood dream “to create a machine for people with hearing impairments.”

Watching Yin sign back and forth with an avatar, you can see the potential future of communication between people who are deaf and those who can hear. And while all the collaborators are quick to stress that the Kinect Sign Language Translator is a prototype, not a finished product, all are equally vocal in expressing their belief that it has the potential to provide a cost-effective and efficient means of communication between those who are fluent in sign language and those whose signing is limited to crude gestures.

Tansley conjures up the scenario of a deaf person visiting a physician who doesn’t know sign language. While acknowledging that the patient could pre-schedule an interpreter or resort to communicating with paper and pen, he observes that such interactions “…would be very artificial. But with this technology, they could simply use their natural sign language.” Thus a signer would be empowered to communicate independently with a non-signer without scheduling an interpreter or resorting to other methods.

Tansley relays that the system could even open up new job opportunities for deaf people. “Imagine an information kiosk, say, at an airport, and rather than the person seeking information being deaf, imagine that the person staffing the information kiosk was deaf. Now, a hearing person could come to that kiosk and ask questions of the deaf person and wouldn’t have to understand or use sign language…the system could help them communicate.”

Those scenarios don’t seem too far off, thanks to the dedicated researchers and partners who are working to make the Kinect Sign Language Translator a reality—and, in the process, fulfilling the childhood dream of Dandan Yin and millions of other deaf and hard-of-hearing people in China and around the world.

Source: https://research.microsoft.com/en-us/collaboration/stories/kinect-sign-language-translator.aspx

For more information, you may also visit:

Microsoft Blog: https://blogs.technet.com/b/next/archive/2013/10/30/sign-language-translator-uses-kinect-as-a-bridge-between-the-deaf-and-hearing.aspx#.UnkUH73TmUl

YouTube: https://www.youtube.com/watch?v=HnkQyUo3134

