
ASL: A Language Ready for Translation

Graduate student Nathan Browne dedicated his thesis to creating an AI model that translates ASL by looking at eyebrows. Now, he hopes his findings can help shape similar technologies to come.

Anyone who has ever learned a second language knows the power of Google Translate. However, for those who study sign language, translation tools are few and far between—and even then, very inaccurate. Graduate student Nathan Browne (Linguistics ’25) discovered this for himself while studying American Sign Language (ASL) at BYU and used his experience to motivate his master’s thesis, which focuses on improving AI translation by concentrating on eyebrow movement.

Speaking with Your Body


Improvements in AI have enabled more efficient translation technology across a variety of languages. When it comes to ASL, however, developing AI translation tools is far more difficult, in large part because signing relies on movements that can't be written down. Meaning is conveyed instead through hand and arm position relative to the torso as well as facial expressions, making ASL a difficult language for computers to comprehend.

Analyzing videos of sign language presents challenges of its own, as computers typically take in everything in the frame: camera angle, background distractions, and the signer's appearance. Browne says, "Unless you tell the computer what to ignore and what to pay attention to, it takes all [the information] in," even insignificant details.

To minimize distractions, Browne trained an AI model to focus on one specific part of the signer: the eyebrows. "When you're asking a question in ASL, if you raise your eyebrows, that indicates that it's a yes-or-no question," he notes. "If you furrow them, it's an open-ended question." Using a data set of 250 videos of ASL questions, Browne then compared his eyebrow-focused program's ability to correctly identify yes-or-no questions against that of models that look at the whole body.
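The article doesn't detail Browne's model or tooling, but the core idea can be sketched: track a handful of eyebrow landmarks across the frames of a video, reduce them to a few "how raised are the brows" numbers, and train a classifier to separate yes-or-no questions from open-ended ones. The sketch below is a hypothetical illustration using the MediaPipe Face Mesh and scikit-learn; the landmark indices and feature choices are assumptions, not Browne's actual pipeline.

```python
# Hypothetical sketch: classify ASL questions from eyebrow movement alone.
# Assumes mediapipe, opencv-python, numpy, and scikit-learn are installed;
# this illustrates the general idea, not Browne's actual model or data.
import cv2
import numpy as np
import mediapipe as mp
from sklearn.linear_model import LogisticRegression

# Approximate MediaPipe Face Mesh indices for the left eyebrow and left eye
# (illustrative; a real system would verify these and likely use both brows).
BROW_IDS = [70, 63, 105, 66, 107]
EYE_IDS = [33, 160, 158, 133]

def brow_heights(video_path):
    """Per-frame eyebrow-to-eye distance; larger values mean raised brows."""
    cap = cv2.VideoCapture(video_path)
    heights = []
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue  # no face detected in this frame
            lm = result.multi_face_landmarks[0].landmark
            brow_y = np.mean([lm[i].y for i in BROW_IDS])
            eye_y = np.mean([lm[i].y for i in EYE_IDS])
            heights.append(eye_y - brow_y)  # y grows downward in image coords
    cap.release()
    return np.array(heights)

def features(video_path):
    """Summarize a whole question clip as a tiny feature vector."""
    h = brow_heights(video_path)
    return [h.mean(), h.max(), h.min(), h.std()]

def train(videos, labels):
    """videos: list of file paths; labels: 1 = yes/no question, 0 = open-ended."""
    X = np.array([features(v) for v in videos])
    return LogisticRegression().fit(X, labels)
```

The point of a sketch like this is that the classifier sees only a few numbers per video rather than every pixel, which is essentially the "tell the computer what to pay attention to" move Browne describes.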

The results from the full-body and eyebrow-only analyses differed only marginally, with the full-body program outperforming his trained model ever so slightly. "The eyebrow-focused model only used 2% of the data that the full-body model used and still got almost the same performance," Browne says. The finding has encouraged him to continue training AI on the specific movements associated with sign language.


The Impacts of AI

Browne attributes the success of his study to the specific linguistic parameter, eyebrow raising with questions, that he so meticulously sought out. This process required him to manually check three thousand videos, deleting those with technical problems until only the usable ones remained. Though his final data set was significantly smaller and of lower quality than he had initially anticipated, it still showed that targeted training can make AI comprehension of ASL more efficient.

More broadly, Browne explains that this research taught him the importance of slowing down to find solutions to big or complex problems. "In the field of computer science and computational linguistics, everything being done is cutting edge, and people are trying to get better and better results," he concludes. Browne finds it more important, however, to take a step back and consider the problem at hand; then "we can move forward with more confidence, not just chasing better, but creating something that actually makes a difference."

Learn more about the linguistics master’s program here.