Following outcry, Google Translate removes offensive example phrase for Arabic entry

After outcry from Muslim advocates, Google recently updated a Google Translate entry for an Arabic word that reinforced offensive stereotypes about Arabs and the broader Muslim community.

While brushing up on Arabic vocabulary, one user pointed out that Google Translate’s entry for the Arabic word “تخطط,” meaning “plan,” was accompanied by an example phrase that means “planning to blow up the car.” Google addressed the issue shortly after the media and the Council on American-Islamic Relations (CAIR) contacted the company, noting that the error was a result of offensive human language being replicated by artificial intelligence (AI), a concern that has been raised about Google Translate and other AI programs in recent years.

“This is a trusted source of information casually reinforcing stereotypes and in this case it’s the particular stereotype that Arabs and Muslims are somehow more inclined toward violence than other communities, which is patently false,” said Corey Saylor, research and advocacy director at CAIR, in an interview with Newsweek.

(Image: A user posted this screenshot to Facebook, prompting outcry from groups like CAIR. Screenshot by Clayton Chiarelott via Facebook.)

This isn’t the first time Google Translate has been at the center of such controversy. Last year, one user took to Twitter to highlight gender bias in the system’s translations. Google has publicly addressed these concerns numerous times in the past couple of years, noting that it’s aware of the offensive bias in some of its translations and is working to reduce it.

Such bias, while concerning, is not entirely the developers’ fault: because machine translation systems are trained on huge datasets of human language, they can replicate the biases embedded in that language. A similar problem is present in large language models, which have been the subject of debate for their potential to amplify hate speech. Experts on the technology have stated that developers must consider the consequences of replicating human bias and work to reduce it as much as possible.
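To see how learned patterns can reproduce a skew no engineer ever wrote down, consider a deliberately simplified sketch of frequency-based suggestion. This is a toy stand-in, not Google’s actual system; the corpus, phrases, and function name here are hypothetical and chosen only to illustrate the mechanism.

```python
from collections import Counter

# Toy corpus standing in for the millions of human-written sentences a
# translation or autocomplete model learns from. The skew is deliberate:
# one continuation of "planning" dominates the data.
corpus = [
    "planning a trip",
    "planning a party",
    "planning an attack",
    "planning an attack",
    "planning an attack",
]

def suggest_continuation(prefix: str, lines: list[str]) -> str:
    """Return the most frequent continuation of `prefix` in the data."""
    continuations = Counter(
        line[len(prefix):].strip()
        for line in lines
        if line.startswith(prefix)
    )
    return continuations.most_common(1)[0][0]

print(suggest_continuation("planning", corpus))  # prints: an attack
```

Nothing in the code singles out violent language; the offensive suggestion is simply the most frequent pattern in the data, which is the dynamic Google’s statement below describes.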

“Google Translate is an automatic translator, using patterns from millions of existing translations as well as user queries to help decide on the best translation and autocomplete suggestions for our users,” Google said in a statement apologizing for the most recent error. “Unfortunately, some of those patterns can lead to unintentional autocomplete suggestions.”

Google removed the offensive phrase from the word’s entry shortly after receiving criticism from groups like CAIR.

“We welcome the swift resolution of this issue and hope measures will be implemented to ensure that translation services do not produce such stereotypical results for any language,” said Nihad Awad, the national executive director of CAIR. “Stereotyping often results in the type of bias that negatively impacts all minority communities.”

Andrew Warner
Andrew Warner is a writer from Sacramento. He received his B.A. in linguistics and English from UCLA and is currently working toward an M.A. in applied linguistics at Columbia University. His writing has been published in Language Magazine, Sactown Magazine, and The Takeout.
