Machines Are Your Friends: The Future is Now

Jaime Punishill

Jaime Punishill is chief marketing officer at Lionbridge and is responsible for leading global marketing and proposition development efforts. He earned a Bachelor of Arts in history and political science from Stanford.

Will Rowlands-Rees

After more than 15 years of working in the research business, Will has been responsible for leading product and services development as chief product officer at Lionbridge since the beginning of 2021.

In the July/August issue of MultiLingual magazine, industry icon and TAUS founder Jaap van der Meer wrote a fascinating and insightful piece called “Translation Economics of the 2020s.” Jaap has been an industry veteran since its inception at the dawn of the PC era. As such, his perspective on its past, present, and future is born of a deep connection to the industry’s ebbs and flows.

The debate that followed was robust, at times contentious, and will no doubt rage for the foreseeable future, though we won’t weigh in here. We are pretty sure you will find others doing just that elsewhere in this magazine. In fact, we would argue that it’s hard, nay impossible, to predict the future of the localization industry — much less when, at whose expense, and so on. That said, there is no doubt that it’s undergoing yet another seismic shift. And like the many technology-driven shifts in the past, this one promises to help build bridges between companies and their global customers, while also posing major challenges to work procedures, value propositions for stakeholders, pricing, and the industry’s favorite topic of discussion — quality.

In its relatively short life, the language business has experienced several step-function advances in key technology capabilities that produced tectonic changes. Thanks to major advances in artificial intelligence (AI), we’ve arrived at the next key inflection point, one that will cause tremendous change: disruption for many, and tremendous opportunity for those who recognize and capitalize on it. We promise, this isn’t another “Why you should pay attention to artificial intelligence” article. From where we sit, that debate is over. An AI-powered future for localization isn’t just inevitable, it’s already here. We are well into this new age.

In imagining how AI will change the language business, it is instructive to consider how almost all technology cycles manifest. There are hundreds of business books, from Schumpeter to Christensen to Brooks to Naim, that chronicle this far better than we could. One of the consistent themes is how we view an emerging technology. Humans are fantastic at describing problems and highlighting pain. On balance, though, we’re simply not good at picturing a radically different future. We use existing frames of reference, which limit our understanding of new possibilities. Perhaps there is no better example than the Henry Ford aphorism, “If I had asked people what they wanted, they would have said faster horses.” People perfectly captured the need to cover distances more quickly. Horses were the dominant means of individual transportation, hence the request for a “faster horse.”

This is a critical lens for understanding how AI will change this industry — as well as the debate inferno stoked by Jaap’s article. It brings this new technology — and our organizational and strategic response as key players — into better focus.

The most common points for discussion, debate, and violent disagreements are the following:

  1. Can AI produce content or translations that are at parity with human translators? 
  2. Will word rates keep falling? How will translators make money? How will work get paid for, and what is a fair price for delivered value?
  3. Will translators/interpreters go extinct?

In our view, these are “faster horse” questions. They are about recognized pain and heightened concerns. They distract from the bigger picture. Instead, we should be discussing these questions:

  1. Is the demand for building language and cultural bridges being satisfied? Will demand continue to grow? 
  2. Depending on how you answer question number one, you then need to ask how the industry will adjust. How will we meet increasing demand? Alternatively, how will the ecosystem survive a flattening of the curve?
  3. How can we achieve our digital transformation moment? What will it take for businesses to measure, see, and operate as if world readiness wasn’t optional?

If you believe we are a long way from satisfying the demand curve, then how can we scale to meet that demand? How can we localize every step of the customer’s journey? How can we reverse the dynamic where most of the world’s content is produced only in English, locking out the growing number of non-English-speaking internet users? How will we help find the “next billion users?” How can we cost-effectively support low-resource languages, emerging economies and non-official and non-primary languages spoken in every country? 

The answer lies in challenging our assumptions, adopting and then optimizing new technologies in new ways for both transformative and additive purposes. A great place to start is to identify three lines of thinking that hold us back.

Quality is everything

We are an industry obsessed with translation quality. Some might argue we do this to a fault. All projects — in this industry or any other — must balance time, cost, and quality. Quality is no doubt important, but it’s a first-world business metric and highly subjective. There is no one “right way” to write a sentence, much less translate one. Our ISOs, factory mentality and linguistic purism stand in the way of making language more accessible and business more inclusive. 

To achieve “world readiness” as Jaap describes it, the industry needs to accept that cost, speed, and scale matter as much as — if not more than — quality. Market by market, use case by use case, culture by culture, risk tolerance by risk tolerance, quality must become flexibly defined to balance the growing importance of the other three dimensions. 

This is not unique to this industry. The tailors of yore protested, then lamented the loss of quality in clothing production when machines arrived. But if we had insisted on maintaining the quality standard they set, most of us would not be wearing the clothing we wear today. Not every outfit needs to impress the Queen.

Anyone who used the first few generations of iPhones knows they were not very good as phones. Network quality, telecommunications technology, and antenna design meant lots of poor reception and dropped calls. That didn’t change the seismic impact the iPhone and other smartphones have had on our lives. To this day, and likely forevermore, a mobile call isn’t as high-quality a connection as a landline call, but that matters little to people in countries without major landline infrastructure. It also matters little to most people who value the instant-on, always-there nature of cellular technology. Those new dimensions matter more than achieving parity with landline quality. Importantly, this assertion does not mean that call quality doesn’t matter. Some calls need to be perfect, such as when recording for use in a video or for discussions about strategic issues between two governments. Some calls need the security of dedicated telephony or encryption. Call quality acceptance had to become flexible to enable call ubiquity. And of course, the quality of even the poorest phones has increased to levels that allow these trade-offs to happen.

As the baseline quality of machine-only translation improves — whether it comes from translation memories (TMs) or neural machine translation (NMT) — and thus becomes a matter of fit and appropriateness, scale, speed, and cost will improve exponentially. My article “Future of Localization” in the January issue of MultiLingual discussed this in greater detail. Plenty of content will still need and merit the skills, experience, and wisdom of human translators, while much more content will be translated and localized for the first time, as entire categories move from human to machine-only or machine-mostly effort.

MT = AI

Much of the discourse around machine translation (MT) centers on parity of output with human translation. Putting the content quality debate above aside, if you consider the role of translation memories, other linguistic assets, and even unsanctioned translator use of MT engines, you understand that we aren’t talking about humans versus machines. We haven’t been for a long time. The real question is how much credit and value should be ascribed to the tools versus the person leveraging them.

Central to this debate is whether machines can only effectively produce puzzle pieces that may or may not be part of the same puzzle, or whether they will be smart enough to assemble the entire puzzle without human fit and finish. 

Lost down that rabbit hole of machine-versus-human value is the notion that while MT is AI, AI is much more than MT. Multimodal AI — blending image, video, and text analysis — promises to add entirely new dimensions of context. Even as we debate machine versus human translation, the T in MT is shifting from just translation to transcreation. Will machine transcreation achieve parity with human transcreation? Probably not, but that’s irrelevant. What it certainly will do is open the door to not merely translate, but transcreate almost everything. Whether that transcreation is delivered untouched by human hands is unclear. But go a little further down that hole, and we find machine transcription. Speech-to-text is already being blended with MT and then text-to-speech engines, which will open up the world of video and multimedia. We have only just begun to see the applications of AI to the work of transforming one source package into another.
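
To make that chain concrete, here is a minimal sketch of a speech-to-text, machine translation, and text-to-speech pipeline of the kind described above. The three engine functions are placeholder stubs invented for illustration, not any real vendor’s API.

```python
def speech_to_text(audio: bytes, source_lang: str) -> str:
    """Placeholder: a real pipeline would call an automatic speech recognition engine here."""
    return "transcribed source text"

def machine_translate(text: str, source_lang: str, target_lang: str) -> str:
    """Placeholder: a real pipeline would call an MT engine here."""
    return f"[{target_lang}] {text}"

def text_to_speech(text: str, target_lang: str) -> bytes:
    """Placeholder: a real pipeline would call a TTS engine here."""
    return text.encode("utf-8")

def localize_audio(audio: bytes, source_lang: str, target_lang: str) -> bytes:
    """Chain the three stages to turn spoken source into spoken target."""
    transcript = speech_to_text(audio, source_lang)
    translation = machine_translate(transcript, source_lang, target_lang)
    return text_to_speech(translation, target_lang)

dubbed_track = localize_audio(b"<audio bytes>", "en-US", "de-DE")
```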

Stepping back from the actual source transformation, there are many other applications for AI. The work and skills of linguists can be profiled and scored. AI will be used to identify the ideal linguist for a job based on experience. Perhaps you need the ideal translator within a time or budget constraint: AI can find that translator and automatically route the work to that person. In fact, AI will help us determine the best workflow to achieve the desired business outcomes for each content transformation.
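
As a rough illustration of that routing idea (not any vendor’s actual system), the sketch below scores a pool of linguists and picks the best fit within a time and budget constraint. The fields and weights are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Linguist:
    name: str
    domain_experience: float  # 0..1, track record in the content's subject domain
    quality_score: float      # 0..1, historical review scores
    rate_per_word: float      # cost per word
    turnaround_days: float    # typical delivery time

def route_job(linguists: List[Linguist], max_rate: float, max_days: float) -> Optional[Linguist]:
    """Return the best-scoring linguist who fits the budget and deadline, if any."""
    eligible = [l for l in linguists
                if l.rate_per_word <= max_rate and l.turnaround_days <= max_days]
    if not eligible:
        return None
    # Simple weighted score; a production system would learn these weights from outcomes.
    return max(eligible, key=lambda l: 0.6 * l.quality_score + 0.4 * l.domain_experience)

pool = [Linguist("A", 0.9, 0.85, 0.12, 3), Linguist("B", 0.6, 0.95, 0.08, 2)]
chosen = route_job(pool, max_rate=0.10, max_days=3)  # picks "B" under this budget
```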

“Let’s face it, we have all been blamed for poor translations
that were entirely the result of poor source content.”

Don’t touch the source

Virtually every aspect of this industry starts only after source content or a source file has been authored. Many of the challenges we struggle with every day emanate from poor source. Let’s face it, we have all been blamed for poor translations that were entirely the result of poor source content. Few localization teams, much less language services providers, feel empowered to push back on source quality. Like great chefs making gastronomic delights with inferior ingredients, the best translators, localization engineers and project managers often produce amazing target content that actually fixes flaws inherent in the source. Collectively we bear the burden of poor source.

Not only does this create some of the tension over word rates, turnaround times, and perceived quality issues, but it is also a major limiting factor in localizing everything. And most importantly, the more automation that is added to any process, the less forgiving that process is of imperfections. Put differently, if we don’t fix the source content, all AI will do is help us scale badness at heretofore impossible speeds.

Perhaps the most useful and impactful application of AI will be source optimization, and even source generation. For the first time in our industry’s history, we can examine source at scale then grade it, tag it, sort it, and enrich it. This will help us pick the right process for transformation. This will help human translators produce superior target content. This will generate the metadata needed to train and guide the machines. Finally, this will inevitably lead to a shift in our relationship with the creators of the source. 
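
As a toy illustration of what “grade it, tag it, sort it” could look like in practice, the sketch below attaches simple quality metadata to source segments. The heuristics and thresholds are invented for illustration, not a production source-quality model.

```python
import re

def grade_source(segment: str) -> dict:
    """Attach simple quality metadata to a source segment."""
    words = segment.split()
    sentences = [s for s in re.split(r"[.!?]+", segment) if s.strip()]
    avg_len = len(words) / max(len(sentences), 1)
    tags = []
    if avg_len > 20:
        tags.append("long-sentences")        # illustrative threshold; hard for MT and humans alike
    if re.search(r"\b(?:it|this|that)\b", segment, re.IGNORECASE):
        tags.append("ambiguous-reference")   # may need rewriting before machine translation
    return {"text": segment, "avg_sentence_length": round(avg_len, 1), "tags": tags}

segments = [
    "Click here. It works.",
    "The update, which the team that shipped the release that broke the build had promised, never arrived, so nothing was fixed and everyone waited.",
]
# Sort worst-first so the riskiest source gets attention (or a different workflow) first.
graded = sorted((grade_source(s) for s in segments), key=lambda g: g["avg_sentence_length"], reverse=True)
```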

The beginning of the source discussion will center on optimization. It will end with source generation. Large-scale natural language generation (NLG) engines like GPT-3 are improving output at staggering rates. Why transform something when you can create it natively? Isn’t that what marketers do today, choosing between translation, transcreation, and native content generation? Isn’t our industry perfectly positioned to train, tune, optimize, and operate NLG engines, working with language experts to tune the final output as needed?
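
A hedged sketch of that “generate natively, then have a language expert tune it” flow might look like the following. Both generate_copy and expert_tune are stand-ins we invented for an NLG engine and a human review step; no real API is assumed.

```python
def generate_copy(brief: str, language: str) -> str:
    """Placeholder for an NLG engine producing native-language copy from a brief."""
    return f"({language}) draft copy for: {brief}"

def expert_tune(draft: str, language: str) -> str:
    """Placeholder for the language expert's pass: brand voice, terminology, cultural fit."""
    return draft

brief = "Spring launch announcement for a waterproof hiking boot"
for lang in ["de-DE", "ja-JP", "pt-BR"]:
    final_copy = expert_tune(generate_copy(brief, lang), lang)
```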

This is merely the beginning of the AI epoch for our industry. We are not well served by fighting over yesterday’s value creation. No one buys translation. They buy understanding; they buy meaning; they buy increased sales. They buy better customer comprehension and better customer experiences. If we remember that the goal is to enable ALL businesses, governments, organizations, and people to connect, communicate, and buy and sell with all other people, then AI is hardly a threat. It’s the key enabling technology of our time.

This is the first in an article series on the future of localization. 
