Imagine a language tutor who is always available to teach new words or check in on a student’s progress.
In collaboration with the American Society for Deaf Children and the creative firm Hello Monday, Nvidia on Thursday unveiled an artificial intelligence-based language-learning platform that aims to do exactly that for American Sign Language learners.
The Signs platform uses a three-dimensional avatar to demonstrate signs. Users keep their webcams on while using the platform, and an AI tool gives them feedback as they practice the signs. Nvidia plans to grow the platform’s vocabulary from an initial 100 unique signs to 1,000.
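Nvidia has not said how the feedback tool works under the hood. As a rough, hypothetical illustration of the general approach webcam-based sign-feedback tools often take, the sketch below uses the open-source MediaPipe Hands and OpenCV libraries (an assumption, not part of Nvidia’s announced stack) to read webcam frames and extract hand landmarks, the kind of signal a sign-recognition model could then compare against a reference sign.

```python
# Hypothetical sketch of webcam hand-landmark capture; MediaPipe and OpenCV
# are assumptions here, not what Signs is confirmed to use internally.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def capture_hand_landmarks(max_frames=100):
    """Read webcam frames and return per-frame lists of hand landmarks."""
    frames_landmarks = []
    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
        while len(frames_landmarks) < max_frames:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures frames in BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # Each landmark carries normalized x, y, z coordinates.
                frames_landmarks.append([
                    [(lm.x, lm.y, lm.z) for lm in hand.landmark]
                    for hand in results.multi_hand_landmarks
                ])
    cap.release()
    return frames_landmarks

if __name__ == "__main__":
    landmarks = capture_hand_landmarks()
    print(f"Captured hand landmarks for {len(landmarks)} frames")
```

A real system would feed sequences of such landmarks (or raw video) into a trained sign-recognition model and score them against the target sign; this snippet only shows the capture step.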
Signs is one of many ways AI is advancing work on assistive technology, tools designed to help elderly people, people with disabilities, and those who care for them. Apple has offered AI-enabled eye tracking to help physically impaired users navigate their iPhones, while Meta, Google, and OpenAI have used AI to expand functionality for blind and low-vision users. Blind users say these advances are already helping them navigate daily life and work.
According to the organizations that created Signs, American Sign Language is the third most widely used language in the US, behind English and Spanish.
The ASL-learning platform is also a reminder that Nvidia has been working to expand beyond the AI chips it is best known for. By building its own AI models and software platforms in addition to the chips most companies use to power the technology, Nvidia has become a major supplier to the AI industry. As AI companies touting the technology’s future promise buy up huge quantities of Nvidia’s processors, the company’s stock has climbed more than 100% over the past year, pushing its valuation to nearly $3.4 trillion.
Nvidia is committed to developing AI not only for its business customers but also to promote beneficial applications of the technology, according to Michael Boone, the company’s manager of trustworthy AI products. In an interview, Boone said it is important to build initiatives like Signs because the goal is to enable the ecosystem, not just one company or a handful of companies.
More people using AI in any capacity also benefits Nvidia’s core chip-making business. In recent months, some investors have raised concerns about whether tech companies are overspending on chips and other AI infrastructure, and about how long it could take for those investments to pay off.
To expand the vocabulary of Signs, which is free to use, ASL speakers will be able to upload videos of signs that aren’t yet on the site. That data could eventually help Nvidia develop new ASL-related products, such as better gesture control in cars or sign recognition for video conferencing software. The company says the data repository will also be made freely available to other developers.
According to Nvidia, the team behind Signs is exploring how future versions could incorporate the non-manual signals that are essential to ASL, such as facial expressions and head movements, as well as slang and regional variations in the language.
Most deaf children are born to hearing parents. Cheri Dowling, executive director of the American Society for Deaf Children, said in a statement about the new project, “Providing family members with easily accessible resources like Signs to begin learning ASL early allows them to open an effective communication channel with children as young as six to eight months old.”