We envision translation as a ubiquitous service — so goes the TAUS mission statement. However, this reality will likely only be possible with systems interoperability: the ability of diverse systems to exchange and process information without human intervention. Ubiquitous translation would require wide-scale systems interoperability across the industry. We are far from that right now.
Before diving into the interoperability landscape, let’s first examine the value of having translation as a ubiquitous service. The idea is that in a knowledge-driven, interconnected and globalizing world, translation is a basic human requirement — the communications oil to smooth and fuel the workings of commerce and society. Most sales funnels aren’t the same vertical drop they used to be. McKinsey & Company has done a good job of articulating this change through its work on the consumer decision journey. We’ve adapted this exhibit to create Figure 1. From the top blue arrow we see that consumers’ decisions are affected by their search and review processes. With Web 2.0, we’re firmly situated in a user-driven rather than publisher-driven world, and most companies are still playing catch-up to this reality. The bottom blue arrow outlines the range of relationship management tools that companies need to employ if they want happy and loyal customers.
At each stage in the consumers’ journey, there are established and new opportunities for the language services industry, from multilingual search engine optimization to sentiment mining, all the way around to real-time multilingual chat. It’s clear that there are tremendous growth opportunities in the years ahead.
The usual progression for most new revenue opportunities is that they are accepted as labor-intensive tasks for a while before the pressure to automate comes to the forefront. The greater the degree of interoperability between content creation and delivery systems, the more a company is able to harness the potential of the resources (humans and language data) and technology available to it. When we recognize that each stage in the consumer decision journey represents ongoing and continuously increasing translation demand, it becomes clear that the pressure for interoperability will keep growing.
The small globes at the center of Figure 1 point to a missed opportunity for most companies that are translating content for generic markets (for “Germans” and so on) rather than tuning content for the many preferences of their potential customers. Clearly, humans rather than machines are needed here. And yet humans need to be armed with the right information to target their work for specific audiences. This means that interoperability to ensure easy access to the right terms and language data also provides a foundation to enable us to translate for the many small worlds or linguaspheres on one large planet.
Numerous other industries have flourished as a result of standardization that led to interoperability, enabling automation and fostering innovation. Manufacturing industries have long appreciated the value of interchangeable parts to lower costs. Consumer goods companies have UPC barcodes. Financial services firms use agreed-upon symbologies and specific messaging formats for transactions. The adoption of well-defined standards fueled growth in these and other industries.
Two roads to interoperability
Traditionally, industry-wide interoperability has happened through the adoption of standards created by a consortium or standards organization, or through the de facto adoption of a market leader’s schema. So which of these two roads makes sense for the translation industry?
Many companies are now in the early stages of strategic changes to align their operating models to the nature of translation demand in the twenty-first century. Through consultation with our members we have identified several attributes that need to be changed. The prioritization of these changes inevitably depends on specific organizational needs. Figure 2 lists the attributes and highlights an overarching change attribute.
While the analysis of each change attribute requires quite some effort in itself, in the context of this article it’s important to note the overall direction and position of twenty-first century translation attributes — they are situated on the right side of the grid. The overarching character of change is a move away from closed models and toward open-collaborative models.
This has a pivotal impact on which of the two roads the industry takes. The open-collaborative side of the grid implies choice and flexibility. It implies easy integration up and down the supply chain, as users and buyers of translation technology make choices on what and how to translate from a wider range of content types and approaches than historically. It implies greater flexibility in choosing technologies, as twenty-first century translation attributes require the ability to make decisions in a more dynamic and innovation-driven environment than in the past.
The scope of translation and localization extends far beyond the content types and scenarios that existing industry standards were designed for. This new environment includes the shift to dynamic content, social media and multiscreen publishing, as well as delivering content to an array of new devices, most notably mobile.
Our conclusion is that interoperability is more likely to come from the adoption of standards created by consortia than from the dominance of a market leader. A market leader is not able to define standards for such a wide array of needs. The sheer breadth of the unfolding translation and localization interoperability challenge is beyond the capability of one player. However, anyone involved in standardization work in the industry can tell you that while there are various groups investigating aspects of the challenge, there isn’t really a coherent strategy for connecting all these efforts to ensure they are complete and complementary.
That said, from our earlier analysis of translation demand in the current consumer journey, we can also conclude that the translation industry is approaching a tipping point where the value of interoperability outweighs the cost of achieving it. The latent demand for ubiquitous translation and the complexity of achieving greater interoperability are bringing people together — people who previously found solutions on their own.
Where we are now
Last year, TAUS research showed that the lack of interoperability is already costing the industry a fortune. The survey response base of 115 companies included most of the largest translation buyers and many of the large service providers. Responses to the question “How much does the lack of interoperability cost your business?” are the most telling in the context of this article. Over one-third of respondents lost 10% of their income or more due to the lack of interoperability. Over 40% of respondents did not actually know the cost to their business (Table 1).
Several buyers and translation service providers said that lack of interoperability prevents them from switching vendors or translators when they need to. One service provider noted that “translators may refuse jobs because they don’t like the computer-assisted translation (CAT) tool requirement.”
A large buyer explained, “We saw a leveraging loss of more than 20% when we switched from one CAT tool to another using TMX for data migration. In order to try to reduce the loss, various resources had to work to put in workarounds. So, total cost due to the interoperability problem is a lot higher than what’s easily quantifiable.”
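Part of the reason leverage drops in a TMX migration is mundane: two CAT tools can read the same translation units yet index them under different exact-match keys because they segment and normalize text differently. The sketch below illustrates this with a minimal TMX parse; the tool names and normalization behavior are hypothetical, chosen only to make the mismatch visible, not drawn from any particular product.

```python
# Minimal sketch of TMX translation-unit extraction, illustrating why
# leverage can drop in migration: tools disagree on whitespace handling,
# so exact-match keys stop lining up. "Tool A" and "tool B" are
# illustrative stand-ins, not real products or APIs.
import xml.etree.ElementTree as ET

TMX = """<?xml version="1.0"?>
<tmx version="1.4">
  <body>
    <tu>
      <tuv xml:lang="en"><seg>Click  Save.</seg></tuv>
      <tuv xml:lang="de"><seg>Klicken Sie auf Speichern.</seg></tuv>
    </tu>
  </body>
</tmx>"""

XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def load_tm(tmx_text, src="en", tgt="de", normalize=False):
    """Return {source_segment: target_segment} from a TMX document.
    With normalize=True, collapse whitespace the way a second CAT tool
    might, which silently changes every exact-match key."""
    tm = {}
    root = ET.fromstring(tmx_text)
    for tu in root.iter("tu"):
        segs = {}
        for tuv in tu.iter("tuv"):
            segs[tuv.get(XML_LANG)] = tuv.findtext("seg")
        if src in segs and tgt in segs:
            key = " ".join(segs[src].split()) if normalize else segs[src]
            tm[key] = segs[tgt]
    return tm

tm_a = load_tm(TMX)                  # "tool A": raw segments
tm_b = load_tm(TMX, normalize=True)  # "tool B": whitespace-normalized
# The same TU yields different keys in the two tools: "Click  Save."
# versus "Click Save." A 100% match in tool A becomes a fuzzy or
# missed match in tool B, even though the data migrated "correctly".
```

Multiply such small divergences across thousands of segments, plus differences in sentence segmentation rules and metadata handling, and a 20% leveraging loss becomes plausible.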
A technology provider added that “instead of simply supporting industry standards, as a technology provider we must also support other companies’ proprietary formats. Since there are no clearly defined or available standards for these formats, development is more time consuming.”
Translators and small translation agencies are frustrated by the lack of real choice in tools and by the time wasted on file conversion and on retaining translation memory matches. Interoperability is welcome, but these groups have little influence on standardization efforts and are dependent on trade associations, which themselves have little global program management capability, to represent their stakeholders’ interests. The 40 or so large translation service providers in our industry usually have an ambivalent attitude toward interoperability, as it would increase competitive pressure on them.
Small translation buyers are used to booking flights and making global payments. They expect translation and localization to be just as frictionless. They are all in favor of interoperability, but have little ability to influence the workings of the industry.
Small tools providers realize that success in gaining market share depends on how well their products integrate and work together with other tools. They need to invest in compliance with interchange format standards for translation, but they struggle because the standards are not mature and they need to keep updating filters for many different file formats. They might champion interoperability standards wholeheartedly, but the required investment is high. Their influence is limited, so they usually decide that others should take the lead.
Large translation buyers have a more ambivalent relationship with the complex standards agenda than small buyers do. They have invested so much in vendor relationships and technologies that in some ways they have become part of the problem. The original and most common issue is insisting on a specific tool in a multi-vendor environment rather than on the use of specific open standards. These buyers have already invested in workarounds for legacy interoperability problems. However, the lack of interoperability upstream with content management and social media management is now becoming painful. Large buyers have more power to influence things than most other stakeholders, but they need tools providers to cooperate.
The market leading tool vendor is supporting its users by constantly improving products, adding new features and moving upstream to provide content management systems (CMSs). It believes its customers’ need for interoperability is best served if they stick to its products. If on a rare occasion a customer needs a tool or a module from another provider, application programming interfaces (APIs) are provided and supported with integration. However, its offering has been built with acquisitions of workflow and CMSs, and it is proving increasingly challenging to ensure interoperability even in its own house. Its customers are pushing it to support open standards, and it is willing to do that, of course, but it’s also wondering which standard is the right standard and where to begin.
A few middleware providers have stepped in with pragmatic solutions to bridge the gap between immature and sometimes ill-defined standards. Their solutions are effective, but relatively costly for most.
Open standards are critical and must be extensible and flexible. Common concerns about adopting and implementing open standards have included the overhead of supporting unnecessary features, the perception of restricting creative freedom, the dilution of competitive advantage and the divulgence of proprietary information.
How we get to our destination
The changed demand for translation requires that we move on from the status quo. We propose a three-pronged approach. First, work with credible standards organizations to define open standards for the localization industry. There is a need to employ proper project management approaches to define requirements and document specifications thoroughly and in a reasonable timeframe. This is a particular weakness of the OASIS XLIFF technical committee. That said, the recent addition of Microsoft and Oracle to the XLIFF technical committee points to a welcome trend of major buyers re-engaging with open standards creation rather than implementing their own workarounds.
Secondly, there is a need for grassroots activists. Activities move quickly and tend to be focused, demonstrating issues and identifying solutions. Examples include a group of competitor companies working under the banner of Interoperability Now to promote open technical exchanges, and Brian McConnell’s work on defining the interfaces for a standard RESTful API that builds on the original translation web services idea put forward by OASIS in the early 2000s. The risk is that their efforts never scale and that they fail to engage the main stakeholders — large buyers and other providers.
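To make the RESTful idea concrete, the sketch below models the resource-oriented pattern such an API would follow: a buyer POSTs a translation job, the provider completes it, and the buyer polls the job resource. The endpoint paths, field names and functions here are purely illustrative assumptions, not taken from Brian McConnell’s spec or any published standard.

```python
# Illustrative sketch of a resource-oriented translation API:
# submit a job, then poll it. All paths and field names are
# hypothetical, invented for this example only.
import uuid

JOBS = {}  # in-memory stand-in for a provider's job store

def post_job(payload):
    """POST /jobs -- submit source text for translation."""
    job_id = str(uuid.uuid4())
    JOBS[job_id] = {
        "id": job_id,
        "source_lang": payload["source_lang"],
        "target_lang": payload["target_lang"],
        "source_text": payload["source_text"],
        "status": "pending",
        "target_text": None,
    }
    return 201, {"id": job_id, "status": "pending"}

def complete_job(job_id, target_text):
    """Provider-side step once a translator finishes the job."""
    JOBS[job_id].update(status="completed", target_text=target_text)

def get_job(job_id):
    """GET /jobs/{id} -- poll status and retrieve the translation."""
    job = JOBS.get(job_id)
    if job is None:
        return 404, {"error": "not found"}
    return 200, job

# Usage: buyer submits, provider completes, buyer polls.
_, created = post_job({"source_lang": "en", "target_lang": "fr",
                       "source_text": "Hello, world."})
complete_job(created["id"], "Bonjour, le monde.")
status, job = get_job(created["id"])
```

The appeal of this pattern is that any buyer and any provider who agree on the resource model can integrate without bespoke connectors — which is precisely the gap the grassroots efforts are trying to close.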
Thirdly, there needs to be a role for an organizing body or umbrella organization capable of leading the effort and monitoring compliance against interoperability initiatives. Such an organization, which has been missing from our industry, would fulfill the global program management office function needed to help ensure focus and prioritization of activity. It would also ensure complete and complementary activity among the main stakeholders. The ETSI-led interoperability test planned for this summer is a good, if partial, move in the right direction.
So where does TAUS sit in all this and what are we doing? With the support of our members, we have stepped up as an interoperability watchdog. This means that we will try to fill some of the functions of an umbrella organization, but our activity falls short of the full program management office. In the coming months we will introduce a newsletter intended to focus our attention in an overly complex landscape. We will soon publish a proposal for a common API for translation services that aims to complement other standards activity taking place in the industry. The aim is to identify and promote a web services architecture for the industry. On May 30, we will host a Translation Technology Providers Round Table meeting in Paris with around 40 CEOs and decision-makers from major tools providers. This first-of-a-kind event for our industry will provide a forum for a genuinely open exchange about interoperability and other issues important to technology providers. Looking further ahead, we are interested in advancing the state of semantic interoperability in the industry through the use of linked data. But that deserves a whole other article.