RADAR-BASED COMMUNICATION & INTERPRETATION OF AMERICAN SIGN LANGUAGE
Spreading Awareness & Understanding Of ASL Through Technology
THE PROBLEM
American Sign Language (ASL) is the predominant language of deaf communities in the United States and Canada, enabling approximately 500,000 people to communicate through hand motions and facial expressions. Unfortunately, most of the hearing population never learns ASL, creating comprehension and translation barriers in interactions with deaf individuals. This knowledge gap can lead to misunderstanding and frustration for all parties, especially within family units.
OUR SOLUTION
Researchers at The University of Alabama have developed a system that uses radar to recognize hand gestures and a 3D camera to interpret other forms of nonverbal communication, such as facial expressions. The collected data is then translated into text, symbols, and audio, enabling clearer communication between deaf and hearing populations. The radar portion of the system operates without additional equipment such as lighting or signing gloves, which preserves privacy and reduces logistical hassle. This novel invention has far-reaching applications in education and healthcare, strengthening interactions between deaf and hearing individuals.
THE ADVANTAGES
- Small size for increased convenience
- Works in the dark; no additional lighting is needed
- Non-contact; does not require signing gloves or touchscreens
- Increased accessibility due to longer range and tolerance of skewed aspect angles
- Better preserves privacy than traditional video methods