American Sign Language (ASL) has been a primary means of communication for the Deaf and hard-of-hearing community for more than two centuries, yet mastery of the language remains rare among the general public.
Despite claims that technology has become more accessible, we have yet to see a truly intuitive, widely available solution that allows ASL users to express themselves via a device with the same speed and efficiency as speech recognition software or language translators.
This is why a project by an Indian student has wowed the public.
Priyanjali Gupta, a third-year Computer Science student at Vellore Institute of Technology in Tamil Nadu, recently shared on her LinkedIn profile how she developed an artificial intelligence (AI) model that can translate ASL signs into English in real time.
She said she built the model with Google's TensorFlow Object Detection API, a software framework built on one of the world's most prominent machine-learning libraries, using transfer learning from a pre-trained model called SSD MobileNet. So far, the model can translate a handful of common signs.
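Her post doesn't include the training code itself, but as a rough sketch, transfer learning with the TensorFlow Object Detection API generally starts from a pipeline config and a pre-trained checkpoint. The paths and class count below are assumptions based on a standard SSD MobileNet setup from the TF2 Detection Model Zoo, not details from her post.

```python
# Hedged sketch of transfer learning with the TensorFlow Object Detection API.
# The config path, checkpoint path, and class count are assumptions.
import tensorflow as tf
from object_detection.utils import config_util
from object_detection.builders import model_builder

# Load the pipeline config that ships with the pre-trained SSD MobileNet
# model and point it at the six sign classes instead of the original ones.
configs = config_util.get_configs_from_pipeline_file("pipeline.config")
configs["model"].ssd.num_classes = 6  # Hello, I love you, Thank you, Please, Yes, No

# Build the detector and restore the pre-trained weights, so training only
# has to adapt the network to the new sign classes instead of starting cold.
detection_model = model_builder.build(model_config=configs["model"], is_training=True)
ckpt = tf.train.Checkpoint(model=detection_model)
ckpt.restore("pretrained/checkpoint/ckpt-0").expect_partial()
```

The appeal of this approach is that the pre-trained network already knows how to localize objects in an image; fine-tuning only has to teach it the handful of new sign classes, which is why a small hand-collected dataset can work at all.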
She demonstrated her model's skills in her post by hand-signing a few common ASL terms, which were picked up by the AI and translated into English words almost instantly.
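Her post shows this as a video. Purely as an illustration, a real-time loop over an exported detector of this kind typically looks like the sketch below, where the model path, label map, and confidence threshold are all assumptions rather than details from her demo.

```python
# Rough illustration of a real-time detection loop: grab webcam frames,
# run the fine-tuned detector, and overlay the top label on the frame.
import cv2
import numpy as np
import tensorflow as tf

LABELS = {1: "Hello", 2: "I love you", 3: "Thank you", 4: "Please", 5: "Yes", 6: "No"}

# Load a detector exported with the Object Detection API's exporter script.
detect_fn = tf.saved_model.load("exported_model/saved_model")
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # The API's exported models take a batched uint8 RGB tensor.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    detections = detect_fn(tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8))
    # Detections come back sorted by score, so index 0 is the best guess.
    score = float(detections["detection_scores"][0, 0])
    label = LABELS.get(int(detections["detection_classes"][0, 0]), "?")
    if score > 0.7:  # assumed confidence threshold
        cv2.putText(frame, f"{label} ({score:.2f})", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("ASL detector", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```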
"The dataset was created manually by running the Image Collection Python file, which captures photos from your webcam for all of the American Sign Language signs listed below: She explained, "Hello, I love you, Thank you, Please, Yes, and No."
Her LinkedIn post has drawn over 60,000 reactions so far, with many commenters asking about the model's design and methods.
In response to one critical comment, she noted that while she had used a pre-trained model to create her own, she was confident that the open-source community would eventually build on such ideas to construct AI better suited to more complex tasks.
"Building a deep-learning model from the ground up specifically for sign identification is a difficult but not impossible challenge," she wrote.
"And right now, I'm just an amateur student, but I'm learning, and I feel that our own open-source community, which is far more experienced and knowledgeable than I am, will find a solution sooner or later, and perhaps we can create deep-learning models specifically for sign languages."
While the design doesn't appear to be anywhere near the level required for mass adoption, it's nevertheless fascinating to see how young entrepreneurs are consciously catering to people with special needs.
Amazing!
By: Aishah Akashah Ahadiat