If you’ve ever read the work of William Shakespeare, you should be fairly familiar with the phenomenon of language change. Now, a recent study published in Cognition shows that artificial intelligence (AI) can help us better understand — and even predict — changes to a given language.
A team of researchers at Boston University is using AI to analyze the hand shapes used in American Sign Language (ASL) and how the language has evolved over time. The researchers found that the hand shapes used in common ASL signs tend to be simpler and easier to produce, while more complicated hand shapes appear in signs that are used less frequently. While there are clear differences between signed and spoken languages, the researchers believe these findings could tell us a lot about how both types of languages evolve over time.
“You can make predictions about how languages might change over time, and test those predictions with a current snapshot of the language,” said Naomi Caselli, a professor of education and human development at Boston University.
Caselli and her team of researchers used AI to analyze around 2,500 signs used in ASL. The researchers classified signs according to the placement of the signer’s hands, fingers, and arms, using computer vision technology to detect minute differences. Since simple signs tend to be more common in the language, the researchers believe that the language has evolved to be a bit easier to use as a communication tool.
AI research focusing on signed languages like ASL has often been overlooked, especially in comparison to research on spoken languages. However, the researchers believe that further study of signed languages can shed light on patterns in human cognition and language evolution.
“If all we study is spoken languages, it is hard to tease apart the things that are about language in general from the things that are particular to the auditory-oral modality. Sign languages offer a neat opportunity to learn about how all languages work,” Caselli said.