Mood gear shift

I remember a language tech product that came out in France in the late 1980s designed to evaluate ‘attitude’ in documents. It used a dictionary to look up word connotations and, by combining connotations into attitude clusters, tried to identify the emotional stance – in fact the negative/positive quotient – beating at the heart of document collections. It then used color coding to point out the chunks of text that exhibited the symptoms. I’m exaggerating a bit, frankly: the product actually started off as an automatic summarizer, but the inventor realized that the attitude of a document – its affective position on the entities it contains – was part of the ‘meaning’ and needed to be included in a good summary. I never heard of it again.

Today there are various technology initiatives underway designed to tap and exploit the emotional content of interactions, and possibly even of texts. At the most basic level, this is embodied in the sort of text mining found in Reuters’ Factiva project on ‘corporate reputation’. IBM’s WebFountain has now been dropped as the key technology, but the idea is to identify critical or laudatory remarks in, say, product reviews or analysts’ reports of M&As, and so on, to track reputations. You would need simple grammars of disdain and praise per language, and some method of weighing up competing shades of attitude in a given search. Wanna know how X’s previous appointments went down in the trade press? Just mine the blogs and the opinion columns. Not affective computing perhaps, but the technology could have a field day with some of those spluttering me-too blogs or star-spangled SMS messages.
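Those “grammars of disdain and praise” plus a weighing-up method could be sketched as a toy lexicon-based polarity scorer – a hypothetical illustration only, with made-up word lists, not how Factiva or WebFountain actually worked:

```python
# Toy lexicon-based polarity scorer: a hypothetical sketch of the
# "grammars of disdain and praise" idea, not any real product's method.

PRAISE = {"excellent", "reliable", "innovative", "laudatory", "impressive"}
DISDAIN = {"buggy", "overpriced", "disappointing", "sluggish", "erratic"}
NEGATORS = {"not", "never", "hardly", "no"}

def polarity(text: str) -> float:
    """Return a score in [-1, 1]: positive leans praise, negative disdain.

    A one-word negation window flips the polarity of the word that
    follows a negator, so "not reliable" counts as disdain.
    """
    score, hits, negate = 0, 0, False
    for token in text.lower().split():
        word = token.strip(".,!?;:")
        if word in NEGATORS:
            negate = True
            continue
        value = 1 if word in PRAISE else -1 if word in DISDAIN else 0
        if value:
            score += -value if negate else value
            hits += 1
        negate = False
    # Average over sentiment-bearing words; 0.0 if none were found.
    return score / hits if hits else 0.0

# Example: mining a "product review" for attitude.
print(polarity("An innovative but sluggish and overpriced phone"))
```

Averaging over the sentiment-bearing words is one crude way of weighing competing shades of attitude; a real system would also need per-language lexicons and far better handling of scope and sarcasm.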

Human speech is obviously a much more sensitive indicator of personal emotional states, so it’s nice to see speech recognition technology getting another break here. This report shows how a Scottish firm, Affective Media, is using speech to identify car drivers’ feelings of road rage or drowsiness so that in-car solutions can be used to reduce them.

Affective Media chief executive Christian Jones said prototypes were being fitted to trial vehicles and claimed the system could be a life-saver. “Studies show unhappy or angry drivers are more prone to accidents than drivers who are relaxed,” he said. “Our technology will work with any voice recognition software. In the future, more cars will have voice-activated controls. This technology will sample the voice to tell if a person is angry or frustrated and will then act accordingly.”

Alun Parry, spokesman for Toyota, said the company planned to test emotion-detecting technology in its experimental “Pod” cars. “We want a car to respond to the emotion of the driver and, as well as the voice technology, the Pod will monitor the driver’s pulse and could act to slow the car if it senses that the driver is being erratic or going too fast,” he said.

Andrew Joscelyne
European, a language technology industry watcher since Electric Word was first published; sometime journalist, consultant, market analyst and animateur of projects. Interested in technologies for augmenting human intellectual endeavour, multilingual messaging, the history of language machines, the future of translation, and the life of the digital mindset.


MultiLingual Media LLC