The future of technology in ASL translation

As we all know by now, technology is constantly changing the way we interact with other people and the world around us. Machine translation has been around for many years, and its reliability and accuracy are improving all the time. The same sort of technology has even been applied to spoken language, with translator apps now available on many smartphones. Of course, the hit-or-miss accuracy and reliability of these technologies means that they are no substitute for professional human translation, and are unlikely to become one, but they show how technology has begun to explore language, and they provide valuable tools to people who cannot access or afford professional services.

Recent developments in motion capture technology have opened up similarly exciting possibilities for sign language interpretation. Microsoft’s Kinect is perhaps the most widely known modern application of motion capture as a method of interaction, but the quick, precise movements involved in sign language have long been beyond the reach of motion capture devices to process. That has begun to change over the last year, however. Michael Kan reported late last year in PC World that Microsoft is exploring how to use Kinect and the Xbox 360 to “read sign language from deaf users.” AcceleGlove, a glovelike gesture capture device, has also been suggested for this purpose, although its price point so far has been considerably higher than the Kinect’s.

Leap Motion is another example of these exciting new possibilities. It is a motion-sensing device released last year that is designed to allow touch-free computer control in three dimensions using proprietary motion capture sensors. Although it was not explicitly designed for American Sign Language (ASL) use, it has been tapped as a possible interpretation aid because of its ability to accurately represent human hands in three dimensions and in real time. Leap Motion tracks users’ fingers with placement accuracy of up to one hundredth of a millimeter. Although this type of technology has been created before, Leap Motion is the first product with both the caliber of sensitivity and the real-time processing capability to show potential for sign language interpretation. A price point targeted at the individual consumer rather than business-to-business sales doesn’t hurt either, with a typical unit costing roughly $80.
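To make the tracking concrete, here is a minimal sketch of the kind of per-frame fingertip data such a device exposes. The data model below is a simplified, hypothetical stand-in, not the real Leap Motion SDK; coordinates are in millimeters, matching the sub-millimeter precision described above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Finger:
    name: str
    tip: tuple  # (x, y, z) position in millimeters

@dataclass
class Frame:
    timestamp_us: int  # frames arrive many times per second
    fingers: List[Finger]

def fingertip_positions(frame: Frame) -> dict:
    """Map each tracked finger to its 3-D tip position."""
    return {f.name: f.tip for f in frame.fingers}

# Example frame with two tracked fingertips (illustrative values)
frame = Frame(0, [Finger("index", (12.34, 150.02, -8.07)),
                  Finger("thumb", (-20.11, 140.55, -3.42))])
print(fingertip_positions(frame)["index"])  # (12.34, 150.02, -8.07)
```

An interpretation application would poll frames like this continuously and feed the positions to a sign recognizer.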

So far, despite its potential impact on the signing community, most mentions of Leap Motion and its possible ASL applications have been confined to the tech industry and its publications.


Leap Motion possibilities for the ASL community

Despite the device having only been released to the public last year, Leap Motion’s developer community has been actively working toward ASL applications. Although a number of projects are working to integrate Leap Motion technology into suitable form factors and to create software support, no products have yet reached the market, owing to the complexity of perfecting this new concept. The vast majority of these projects fall into a few categories.

Improving communication in everyday life is an obvious one. Signing has some drawbacks: from mailing a package at the post office to ordering a drink at a coffee shop, communicating with people outside one’s circle of signing acquaintances has its challenges. Currently, most people in the deaf and hard-of-hearing community express themselves by writing messages with pen and paper or typing them out on their mobile phones, but many in the community have expressed frustration with how laborious this becomes.

The possible applications of Leap Motion to address this need have been varied. One possibility is to install Leap Motion devices at help desks and cash registers, where an ASL user could sign over the device and have his or her translated message appear to the non-signing employee. Another application places a modified Leap Motion directly on the person of the signer; this device would then display messages automatically.
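The help-desk idea above can be sketched as a simple pipeline: frames of tracking data come in, a recognizer maps each completed sign to a word, and the words accumulate into a message shown to the non-signing employee. In this sketch the recognizer is a stub lookup table and the sign labels are invented placeholders; a real system would classify raw hand poses with a trained model.

```python
# Hypothetical recognizer output labels mapped to English words
SIGN_VOCAB = {"sign_hello": "hello",
              "sign_send": "send",
              "sign_package": "package"}

def recognize(frame_label):
    """Stub recognizer: returns a word for a known sign, else None."""
    return SIGN_VOCAB.get(frame_label)

def transcribe(stream):
    """Turn a stream of recognized sign labels into a display message,
    dropping frames the recognizer could not classify."""
    words = [w for w in (recognize(f) for f in stream) if w]
    return " ".join(words)

print(transcribe(["sign_hello", "noise", "sign_send", "sign_package"]))
# hello send package
```

The same loop would drive either the kiosk display or the wearable version, with only the output surface changing.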

Additionally, people looking to learn ASL can run into a number of problems. First, a user who lives in an area without a large ASL community can find it difficult to learn the language in isolation. Although there are books, diagrams, and instructional videos, there has long been a need for better study tools.

Leap Motion developers have been using the device’s tracking data to render signs in full three dimensions, which the user can view and manipulate at will. Besides providing realistic reference models, such software could let a user practice signing over the Leap Motion and have his or her mistakes corrected. This is a level of self-directed learning that was previously not feasible for users without access to classes.
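The mistake-correction idea reduces to comparing the learner’s hand pose against a stored reference pose for the sign and flagging fingers that deviate too far. The sketch below does this with per-fingertip Euclidean distance; the poses and the 15 mm tolerance are illustrative assumptions, not calibrated values.

```python
import math

def off_fingers(user_pose, reference_pose, tol_mm=15.0):
    """Return the names of fingers whose tips deviate from the
    reference pose by more than tol_mm millimeters."""
    bad = []
    for name, ref_tip in reference_pose.items():
        if math.dist(user_pose[name], ref_tip) > tol_mm:
            bad.append(name)
    return bad

# Reference pose for a sign, and a learner's attempt (illustrative)
reference = {"index": (0.0, 150.0, 0.0), "thumb": (-30.0, 140.0, 0.0)}
attempt   = {"index": (2.0, 151.0, 1.0), "thumb": (-60.0, 140.0, 0.0)}

print(off_fingers(attempt, reference))  # ['thumb']
```

A tutoring application could highlight the flagged fingers on the rendered 3-D hand so the learner sees exactly what to adjust.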

A related but separate problem arises when individuals live in an area whose sign language differs from their own. For example, an American could live in the United Kingdom for a year, where the local deaf community signs differently from the way he or she has signed for most of his or her life, before returning home. Again, this is where Leap Motion paired with ASL learning software would be able to address the needs of the individual.

There are a number of obstacles to using Leap Motion technology for ASL, however. In the current iteration of Leap Motion, a user’s hands are not shown in relation to the rest of the body, a factor that is vital to proper expression of ASL. As of now there is no support for showing the hands in relation to the body, although this has been mentioned as a consideration for future iterations and updates of the device.

Another issue is that Leap Motion cannot see through solid objects, including other fingers. This can affect the readability of signs in which one finger overlaps another, blocking the sensor’s view of the rest of the hand.
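One hedge against this occlusion problem is to treat a frame as unreliable, rather than feeding it to a recognizer, whenever the tracker reports fewer fingertips than expected or two tips nearly coincide. The sketch below illustrates such a guard; the finger count and the 5 mm separation threshold are illustrative assumptions.

```python
import math

def frame_is_reliable(tips, expected=5, min_sep_mm=5.0):
    """Reject frames where fingers dropped out of view or two
    fingertips nearly overlap (a sign of likely occlusion)."""
    if len(tips) < expected:
        return False  # some fingers were not tracked at all
    for i in range(len(tips)):
        for j in range(i + 1, len(tips)):
            if math.dist(tips[i], tips[j]) < min_sep_mm:
                return False  # tips nearly coincide: likely occlusion
    return True

# Five well-separated fingertips vs. a frame where two tips collide
clean = [(0, 0, 0), (20, 0, 0), (40, 0, 0), (60, 0, 0), (80, 0, 0)]
occluded = clean[:4] + [(60.5, 0, 0)]

print(frame_is_reliable(clean), frame_is_reliable(occluded))  # True False
```

Skipping suspect frames trades a little latency for fewer misread signs, which matters when the output is someone’s words.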

All in all, despite the device having only been released last year, the Leap Motion developer community has shown considerable promise. For example, late last year a Sydney-based team made up of Catarina Araujo and Sofia Santos began developing a modified Leap Motion device that would be worn around the neck of a deaf or nonspeaking individual (Figure 1). The currently unnamed device would translate the gestures it sees and display them on its face for easy reading by the conversational partner. The device is expected to enter production in the next few years. This team is just one of many actively incorporating Leap Motion into sign-language applications, and among the first to offer an affordable price point targeted at the general public.