The last few years have seen a tremendous rise in translation tools, but sign language has largely remained a bridge too far. A team from the University of Arizona believes it may have cracked it, though.
They’ve developed a new product, called Sceptre, in which armbands communicate with a computer that translates sign language motions into words for those who don’t know the language.
Decoding sign language
The researchers used the armbands to train the system on a range of sign language gestures, each of which is then matched to the corresponding word in a database.
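The article doesn’t describe the matching step itself, but a minimal sketch of the idea, using hypothetical feature vectors and a simple nearest-neighbour lookup against stored gesture templates, might look like this (the sign names and numbers are purely illustrative):

```python
import math

# Hypothetical gesture "database": each sign is stored as a feature
# vector recorded during training (values are illustrative only).
templates = {
    "please":    [0.12, 0.80, 0.33, 0.05],
    "thank you": [0.90, 0.10, 0.45, 0.70],
    "help":      [0.40, 0.42, 0.95, 0.20],
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_gesture(features):
    """Return the word whose stored template is closest to the input."""
    return min(templates, key=lambda word: euclidean(features, templates[word]))

print(match_gesture([0.38, 0.45, 0.90, 0.22]))  # closest template is "help"
```

A real system would classify streams of sensor data rather than single vectors, but the lookup-against-trained-templates structure is the same.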
The hope is that the system will support communication, especially in emergency situations where prompt communication is so important.
The armbands, developed by Myo, track arm motion using an inertial measurement unit (IMU) and finger configuration using electromyography (EMG) sensors.
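One way such readings could be combined, assuming windowed IMU accelerometer samples and multi-channel EMG (the function names and feature choices here are assumptions, not the team’s published pipeline), is to summarise each window into a single flat feature vector:

```python
import math

def mean(samples):
    return sum(samples) / len(samples)

def rms(samples):
    """Root-mean-square amplitude, a common summary of an EMG channel."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def feature_vector(imu_window, emg_window):
    """Fuse a window of IMU readings (list of (ax, ay, az) tuples) with
    a window of EMG readings (one list of samples per electrode) into
    a single flat feature vector for classification."""
    ax, ay, az = zip(*imu_window)
    features = [mean(ax), mean(ay), mean(az)]             # arm motion/orientation
    features += [rms(channel) for channel in emg_window]  # muscle activity
    return features

imu = [(0.0, 0.1, 9.8), (0.1, 0.0, 9.7)]   # two accelerometer samples
emg = [[0.2, -0.2], [0.5, -0.5]]           # two EMG channels
print(feature_vector(imu, emg))
```

IMU means capture how the arm moves, while the EMG RMS values capture which forearm muscles (and hence which finger configurations) are active.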
The software was trained to accurately detect and decipher full sign language gestures, as well as the signs for individual letters and numbers.
The system was initially trained on 20 different gestures, which the software translated correctly 98% of the time.
You can see the system in action in the video below, where the authors highlight how the computer can detect the words being signed.
At the moment, the signs are simply shown as text on screen, but the team hopes that an app will eventually read the words out loud to support the flow of conversation.
“Ideally, the person can use this anywhere they go,” the researchers say.
Before the system is ready for market, there are a few issues to overcome, such as calibrating the sensors so that they work regardless of exactly where they’re placed on the arm.
The system also needs to be trained to cope with the variability of people’s signing: gestures differ not only between users but also for the same user in different circumstances.
It is nonetheless a promising technology that is making clear strides towards considerably easier communication for deaf people.