Language Models for English, German, Hebrew, and More

For quite some time now, artificial intelligence (AI) researchers have been trying to figure out how — or perhaps if — computers can be trained to generate natural, coherent, human-like language.

And it looks like they finally can. Well, kind of.

A new report from WIRED explores the massive language models developed by companies like AI21 Labs, OpenAI, and Aleph Alpha, among others. These language models, led by OpenAI’s massive GPT-3 (the successor to GPT-2, which launched back in 2019), are capable of producing long strings of fairly complex text (think emails, recipes, even blog posts) on a given subject.
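The hosted models named in the report are accessed through their vendors’ own services, but the general idea of prompting a model to continue a topic can be tried with the smaller, openly released GPT-2. Below is a minimal sketch using the Hugging Face transformers library; it is an illustration of the technique, not the tooling any of these companies use, and the prompt is an invented example.

```python
# Minimal sketch: prompt an openly available model (GPT-2) to continue a topic.
# This only illustrates the kind of completion the article describes; the hosted
# GPT-3 / Jurassic-1 models are reached through their vendors' own services.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Here is a simple recipe for a summer pasta salad:"
outputs = generator(prompt, max_length=80, num_return_sequences=1)

print(outputs[0]["generated_text"])
```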

“Recent strides in AI show that machines can develop some notable language skills simply by reading the web,” writes WIRED’s Will Knight. These language models are trained on massive amounts of text drawn from the internet and other sources, from which they learn the statistical relationships between words, parts of speech, and other elements of the sentence structure of human language.

“But GPT and its ilk are essentially very talented statistical parrots,” Knight writes. “They learn how to recreate the patterns of words and grammar that are found in language. That means they can blurt out nonsense, wildly inaccurate facts, and hateful language scraped from the darker corners of the web.”
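As a very rough illustration of the “statistical parrot” idea, the toy sketch below counts which words follow which in a tiny made-up corpus and then generates text by sampling from those counts. This is not how GPT-3 or Jurassic-1 are actually built; real models learn far richer patterns over billions of words, but the basic principle of reproducing observed word statistics is the same.

```python
import random
from collections import defaultdict

# Toy "statistical parrot": learn which word tends to follow which
# in a tiny corpus, then generate text by sampling from those counts.
corpus = (
    "the model reads the web and the model learns the patterns "
    "of words and grammar found in language"
).split()

# Count bigram continuations: which words were observed after each word.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

# Generate by repeatedly picking a random observed continuation.
word = "the"
output = [word]
for _ in range(12):
    if word not in following:
        break
    word = random.choice(following[word])
    output.append(word)

print(" ".join(output))
```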

Last week, MultiLingual reported on AI21 Labs’ Jurassic-1 Jumbo language model, which has been described as the largest of these language models to date. It has a vocabulary of around 250,000 lexical items, and unlike some of its competitors, it is available for free to internet users around the world.

Language models like AI21 Labs’ and OpenAI’s are quite competent in English, though of course they do have moments when they fall short. After spending about half an hour exploring the AI21 Studio (where users can access Jurassic-1 Jumbo for free), we found that it sometimes spewed out rather confusing or ungrammatical phrases. For example, after generating numerous headlines for blog post ideas the user had entered, the language model eventually responded “Wake Up, Get A Job, and WRITE LIKE KAFKAESQUE OR ELSE.” All in all, though, the model generally produced quite coherent and interesting strings of text.

Now that the models appear to have developed a quite complex understanding of English, start-ups are moving on to other languages. WIRED’s piece notes that language models have been developed, or are currently being developed, for languages like Korean, Chinese, and German. Heidelberg-based Aleph Alpha’s language model, for example, is able to produce text in five languages: German, English, Spanish, French, and Italian.

Andrew Warner
Andrew Warner is a writer from Sacramento. He received his B.A. in linguistics and English from UCLA and is currently working toward an M.A. in applied linguistics at Columbia University. His writing has been published in Language Magazine, Sactown Magazine, and The Takeout.
