I spoke recently at UA Europe 2010. Ostensibly a conference about software user assistance, it naturally featured localization too. It was encouraging to see the Irish software industry represented by the Alchemy Software Development stand in the exhibition space, and everybody had great things to say about MadCap Software’s Lingo product.
Some of the speakers mentioned localization in their presentations too, and I have to say, a lot of what I heard was not very positive. One of the more colorful speakers described the localization process as nasty, expensive, messy, and generally one big pain (and coming from someone working in user assistance, that really is something). Another speaker said his company now preferred to hire in-house translators rather than have language service providers (LSPs) coordinate its translation efforts across EMEA and APAC.
Hearing all this, I was reminded of the state of internally driven innovation in the localization space (sorry, but I regard Google Translate and crowdsourcing as external) and the lack of pervasive thought leadership (there are some exceptions, naturally). When I spoke with conference delegates, it seemed that a major part of the problem is the continued disengagement of localization, translation, whatever you like to call it (I prefer global information delivery) from the user experience. Localization (and user assistance too, actually) is too often couched in narrow “back-end” terms: content management systems, workflows, professional linguists, and invisible project management. Because, for many, that’s probably exactly what it is.
And then there are translation memories (TMs). Given the sanctity they’ve been accorded, I am surprised some of these things haven’t been added to the list of world heritage sites by the UNESCO World Heritage Centre. Anyone who has ever been told they cannot change or update source content because it will upset translation memory matching will relate to this back-end mentality (in this regard, translation memories are, in fact, user experience liabilities).
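To make the mechanics concrete, here is a minimal sketch of fuzzy TM matching, using Python’s difflib as a stand-in for the more sophisticated leverage algorithms commercial TM tools actually use; the segments and the French translation are invented for illustration.

```python
# Minimal sketch of fuzzy translation-memory matching.
# difflib stands in for real TM similarity algorithms; the
# segment pair below is a hypothetical example.
from difflib import SequenceMatcher

tm = {
    "Click Save to store your changes.":
        "Cliquez sur Enregistrer pour conserver vos modifications.",
}

def best_match(source, memory):
    """Return the closest TM segment's translation and its match score."""
    best_seg, best_score = None, 0.0
    for seg in memory:
        score = SequenceMatcher(None, source, seg).ratio()
        if score > best_score:
            best_seg, best_score = seg, score
    return memory.get(best_seg, ""), best_score

# An untouched segment leverages at 100%; a small wording improvement
# drops it to a fuzzy match that costs money to retranslate -- hence
# the pressure never to change the source.
for source in ("Click Save to store your changes.",
               "Click Save to keep your changes."):
    _, score = best_match(source, tm)
    print(f"{score:.0%} match for: {source!r}")
```

The second sentence differs by one word, yet its match score drops well below 100%, and with it the discount the TM was supposed to deliver. That arithmetic, not the users’ needs, is what drives the “don’t touch the source” rule.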
One of those exceptions, respected industry commentator Renato Beninatto, argues that TMs will be free or irrelevant within five years. Bring it on, I say. Once you have used a TM to bootstrap a statistical machine translation engine, you might as well give it away, in my opinion, probably with a health warning given what we know about dirty data.
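What does that bootstrapping step look like? A hedged sketch, assuming the TM has been exported as a standard TMX file: parse out the parallel segments and apply crude dirty-data filters before anything goes near MT training. The file name, language pair, and thresholds here are illustrative assumptions, not anyone’s production values.

```python
# Sketch: extract parallel segments from a TMX export and screen out
# obviously dirty data before using them as MT training material.
import xml.etree.ElementTree as ET

# TMX marks language on each <tuv> with the namespaced xml:lang attribute.
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def clean_pairs(tmx_path, src_lang="en", tgt_lang="fr"):
    pairs = []
    for tu in ET.parse(tmx_path).getroot().iter("tu"):
        segs = {
            tuv.get(XML_LANG, "").split("-")[0]:
                (tuv.findtext("seg") or "").strip()
            for tuv in tu.iter("tuv")
        }
        src, tgt = segs.get(src_lang, ""), segs.get(tgt_lang, "")
        # Crude dirty-data filters: drop empty pairs, identical
        # (i.e. untranslated) pairs, and wildly length-mismatched pairs.
        if not src or not tgt or src == tgt:
            continue
        if not 0.5 <= len(src) / len(tgt) <= 2.0:
            continue
        pairs.append((src, tgt))
    return pairs

# pairs = clean_pairs("legacy_memory.tmx")  # hypothetical TM export
```

Even filters this crude will throw segments away, which is exactly the point: a TM accumulated over years of “never touch the source” is full of noise, and the health warning writes itself.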
For both user assistance and translation, I was left with the feeling that this “back-end” fundamentals approach needs to change to a “front-end” information quality perspective: one where the terminology and language of users (their “conversation”, if you like) become the real information assets and the arbiter of decision making. Source content creators become curators and facilitators of user decisions about information creation and quality. Of course, this must all line up with business requirements if you’re out to make a profit, but I don’t see how the front-end approach conflicts with delivering quality information globally, or with increasing user engagement, market share, and the bottom line.
The way forward, in my opinion, is to turn the back end into the front end and bring the consumers and creators of translated information, source and user-generated alike, to center stage.
Let’s see some more thought leadership on this subject from the GILT industry, and let’s see the localization conference scene opening up to people who bring fresh perspectives from outside the industry.