Workflow

Component-Based Systems 

A cure for the common TMS cold

By Didzis Grauss

Over the past few years, we’ve observed a surge of translation management systems (TMSs) as the go-to, all-in-one solutions for global content management. Through massively financed sales and marketing teams, TMS companies have become trendsetters and thought leaders on the best ways to manage content and avoid landmines on the path to global greatness. On the financial side, they have become a very juicy fruit for investors that look to reap the benefits of their popularity. The narrative has gone as far as to question whether buyers even need to employ language service providers (LSPs) and freelancers when everything and anything could be done with a TMS super product.

The idea of a completely centralized global content hub sounds appealing on the surface, but what happens when that is not enough and the TMS itself becomes a liability? Organizations are generating more content than ever thanks to artificial intelligence (AI), and the sheer volume is becoming overwhelming. This shift demands a transformation in how our industry manages and delivers content to audiences around the globe.

Recently, discussions around the concept of a “post-TMS” world have taken place on social media, with industry experts sharing opinions on what that era might look like:

  • Spence Green of LILT proposed a few characteristics of a concept that would eliminate the need for TMSs altogether. Among others, he named “non-linear transformation workflows, real-time feedback loops for AI model fine-tuning, and hierarchical language data storage that supports retrieval for generation.”
  • Kirill Soloviev, co-founder and CEO of ContentQuo, shared a similar opinion that includes “accepting that quality issues will happen, and deciding to get better at tracking them and faster at learning from them.”
  • Jochen Hummel, co-founder and CEO of Coreon, introduced the idea of a “language factory.”
  • Bruno Bitter from Blackbird.io is building a conceptual backbone of localization that prioritizes interoperability of systems and decoupled linguistic asset management with services in the loop.
  • Finally, in its bi-weekly newsletter, CSA Research shared a graph showing the landscape of localization services, with “Peak Localization” marked through 2023. The graph, however, indicates an important change in the landscape to consider going forward: the blurring of lines between text, audio, and visual localization.

I am not ready to dismiss the idea of a TMS as a whole, but for me this spells a clear call for flexibility in language technology components. When we think about components that enable fully immersive product localization, we first think of detaching the features that surround a TMS suite. You might ask why, and the answer is simple: it allows us to implement the features we currently need instead of paying for a suite that includes features that might never see a day of use. In addition, the components that enable a product to become fully localized might be cross-functional. Besides text, a localization program might need to support multimedia assets. To top it all off, as the content surrounding the product grows, so do the entry points that require localization support. This shift in requirements and the growing demands of supporting content globally strongly indicate that a TMS alone is not the answer.

TMS-centric localization

Figure 1 reflects some of the largest issues in managing TMS-centric localization programs, which create bottlenecks and hinder cross-department collaboration.

Global content scaling used to be about the number of languages a product is able to support. However, as companies grow, so does their content structure. Combine that with the speed and volume of content generation that AI enables, and we’re now dealing with an equation that essentially multiplies the number of languages by the number of content entry points. This means that a TMS-centric localization program now needs to support not just the language count, a substantial consideration on its own, but also content coming in from an increasing number of sources throughout the organization.
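
To put rough, hypothetical numbers on that equation: a product shipped in, say, 12 languages with content flowing in from 8 entry points (web app, help center, blog, support macros, and so on) has 12 × 8 = 96 language-and-source combinations to keep in sync, where a languages-only view would count just 12.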

In discussions with a sizable pool of start-up and scale-up founders and localization managers, the overwhelming consensus has been that this is where a localization infrastructure centered around a TMS starts to fall short. Buyers that expand the content-based services around their product beyond what the TMS can handle once again face the manual string management and content labor that the TMS was introduced to eliminate in the first place.

Component-based localization

Instead of making the TMS a centerpiece, organizations can use it as part of a larger content ecosystem and as a component for string management, as illustrated in Figure 2. This eliminates the bottleneck of content flow and empowers product teams to handle localization at their own pace and on their own terms. In addition, in-house localization teams are no longer overloaded with manual string management; instead, they facilitate communication across the organization and fulfill a strategic role for content operations as a whole. For organizations without localization teams, an LSP can fill that role effectively.

In a component-based system, integrators are the backbone of the whole infrastructure. An integrator can be visualized as a highway that creates two-way exits and entrances for components, as well as purposeful loops that lead content through different stages before returning to the highway and arriving at the final destination: the customer. I also like to think of it as a physical backbone that connects the nerves throughout the body and creates a unified, living organism of interlocking content.
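
To make the highway picture more concrete, below is a minimal sketch of the kind of contract an integrator could enforce between components. Everything here, including the ContentItem, ContentComponent, and Integrator names and the shape of their methods, is an illustrative assumption for this article, not the API of any particular product.

```typescript
// Illustrative sketch only: the names below are hypothetical and do not
// refer to any specific vendor's API.

// A unit of content traveling along the "highway".
interface ContentItem {
  id: string;
  sourceLocale: string;
  targetLocale: string;
  body: string;
  mediaType: "text" | "audio" | "video";
}

// Any component (TMS connector, MT engine, voiceover generator, QA check)
// exposes the same two-way "exit and entrance".
interface ContentComponent {
  readonly name: string;
  process(item: ContentItem): Promise<ContentItem>;
}

// The integrator is the highway: it routes each item through a loop of
// components and returns it to the main flow when the loop is done.
class Integrator {
  private components: ContentComponent[] = [];

  register(component: ContentComponent): void {
    this.components.push(component);
  }

  unregister(name: string): void {
    this.components = this.components.filter((c) => c.name !== name);
  }

  async run(item: ContentItem): Promise<ContentItem> {
    let current = item;
    for (const component of this.components) {
      current = await component.process(current);
    }
    return current; // back on the highway, headed for the customer
  }
}
```

The key design choice in this sketch is that every component, whether it wraps a TMS connector, an MT engine, or a voiceover generator, looks the same to the integrator, so pieces can be added or removed without touching the rest of the flow.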

Beyond content fluidity, there are other benefits to applying component-based thinking to localization program management. Some are financial, while others allow for unlimited feature and component testing. The ability to add and remove the components needed at a particular point in a product’s content journey lets companies control the costs associated with each one. Let’s say you need to generate a synthetic voiceover of your blogs to increase exposure and enable your users to consume your content on the go. This is not a feature you’ll need indefinitely. An integrator easily adds this functionality to your ecosystem and, when the job is done, removes the feature that is no longer being used.
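
Continuing the hypothetical sketch above, temporarily plugging in a synthetic-voiceover step and retiring it once the campaign is over might look like this; VoiceoverComponent is a made-up stand-in for whatever text-to-speech service would actually be used.

```typescript
// Hypothetical, short-lived component: turn a localized blog post into audio.
class VoiceoverComponent implements ContentComponent {
  readonly name = "synthetic-voiceover";

  async process(item: ContentItem): Promise<ContentItem> {
    // In reality this would call a text-to-speech service and attach the
    // resulting audio asset; here we only mark the item as converted.
    return { ...item, mediaType: "audio" };
  }
}

const integrator = new Integrator();
integrator.register(new VoiceoverComponent()); // add it for the blog campaign
// ... run the campaign ...
integrator.unregister("synthetic-voiceover");  // remove it when the job is done
```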

We can also look at machine translation (MT) functionality as an example. An integrator-first structure allows companies to add, test, and evaluate MT engines to find the right fit for their language and content specifics. While this is partially enabled by a TMS, the versatility really opens up when we look beyond the offerings provided through a TMS.
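
As a hedged illustration of that versatility, the same component pattern lets you line up several MT engines behind one common interface and score them on your own content samples. The engine interface and the scoring function below are placeholders; in practice the scorer could be an automatic quality-estimation metric or a human review step.

```typescript
// A common shape for any MT engine under evaluation (placeholder providers).
interface MtEngine {
  readonly name: string;
  translate(text: string, from: string, to: string): Promise<string>;
}

// Placeholder scoring function: swap in whatever quality metric or human
// review process your program actually uses.
type QualityScorer = (source: string, translation: string) => Promise<number>;

async function evaluateEngines(
  engines: MtEngine[],
  samples: { text: string; from: string; to: string }[],
  score: QualityScorer
): Promise<Record<string, number>> {
  const results: Record<string, number> = {};
  for (const engine of engines) {
    let total = 0;
    for (const sample of samples) {
      const translation = await engine.translate(sample.text, sample.from, sample.to);
      total += await score(sample.text, translation);
    }
    results[engine.name] = total / samples.length; // average score per engine
  }
  return results;
}
```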

Effects on operators and buyers

Thinking about a localization program at the component level is a great way of regaining control over the assets and workflows we seem to have surrendered to systems that provide a one-size-fits-all solution. Let’s look at what it means from the language operator and language buyer perspectives.

The age of AI has disrupted the localization industry quite heavily and, while we see success stories from the top companies, there are a number of sad stories to be told as well. For example, an article from Slator reported that “Over a Dozen Dutch LSPs File for Bankruptcy as WCS Group Collapses.” This points to a larger issue: a portion of LSPs are struggling to redefine the value they can bring to the market of translation buyers. A lot of the usual legwork is automated and, from the outside, it looks like the translations are performed by AI or freelancers directly.

While this is partially true, the disruption of value is a great opportunity to strategize and reposition to match the growing need to make sense of these structures. LSPs are in a great position to become experts in workflow orchestration, technology evaluation, market consultancy, and specific subject matter. Our industry is thirsty for independent consultants and technology specialists who can guide buyers through the myriad software marketing campaigns and sales efforts to land on localization products that actually make sense for them.

While the freelancer market is similarly affected by AI productivity tools, the disruption of value happens to a lesser extent. Freelancers might be required to add more AI and MT tools to their toolkits, but the industry has gravitated toward working with freelancers directly, and there will always be value in that.

Localization buyers will find a lot of potential value in deploying a component-based structure. Introducing a TMS-centric localization structure immediately creates a silo in which global content operations are secluded from the rest of the company. Adding localization components, on the other hand, feels more like an extension of the existing tech stack.

Looking beyond bottlenecks, investments in any feature, including localization, have to make sense financially. This is particularly important for start-ups and scale-ups, where every penny spent is tied to its return on investment (ROI). It is a lot easier to justify the money spent on each component because each one corresponds to a specific business metric. When the financial and workflow factors are considered together, we land on a flexible localization ecosystem that feels natural and frictionless to the whole organization and is financially purposeful.

Conclusion

Language operations look a lot different than they did a few years back when TMSs were on the rise. Trends in content creation, management, and delivery dictate new rules for how we as an industry will manage global content going forward. While I doubt TMSs will be dismissed entirely in the near future, their role has certainly shifted from centerpiece to role player. Going forward, a TMS will be valued for its core functionality of string management, its robustness, and its ability to play well with others, as well as whether its price, for what it actually does, is justified.

Didzis Grauss is co-founder of Native Localization, a company that specializes in creating component-based localization programs for early-stage and series-A startups. Didzis was named a LangOps pioneer and works on concepts related to language operations.
