Sign language and technology

What are the intellectual and technological breakthroughs that have liberated the Deaf community from isolation?

The earliest reference to the Deaf using Sign is in Plato’s Cratylus c.350 BC (thanks to David Buxton and Jeff McWhinney for pointing this out).

Benedictine monks used signs to communicate during vows of silence. Most of the signs were food-related; it was a lexicon rather than a language, as it had no grammar of its own.

It followed the word order of speech, and the largest known vocabulary was 472 signs. The monks also communicated by whistling, an option not available to the Deaf.

Divers and factory workers have also used signs to communicate, but the first proper Sign languages emerged in cities like London and Paris. Here, the Deaf could form a community unlike in villages where they were very unlikely to meet other signers.

The period from 1880 to 1960 was a dark age during which Sign was all but banned from schools. Two inventions would begin to end Deaf isolation.

First came the teletypewriter, in 1964, which meant the Deaf could hold telephone conversations by typing. Today, of course, texting and email have rendered the teletypewriter obsolete.

Alexander Graham Bell (who claimed credit for inventing the telephone, though his claim is disputed) had himself seen the telephone as a way of assisting Deaf communication.

Second came closed captions on TV, first broadcast in 1972. YouTube began adding closed captions to videos in 2010.

“The French Chef” was the first TV broadcast to use closed captioning, in 1972.

Oliver Sacks wrote Seeing Voices in 1988. He feared that the teletypewriter would damage Deaf culture and Sign:

“Before they were widely available, fifteen years ago, deaf people went to great lengths to meet each other—they would constantly visit each other’s homes, and would go regularly to their local deaf club. These were the only chances to talk with other deaf people; this constant visiting or meeting at clubs formed vital links which bound the deaf community into a close physical whole.

Now, with TTYs (in Japan, faxes are used), there is much less actual visiting among the deaf; deaf clubs are starting to be deserted and empty.”

Hopefully this is no longer the case, thanks to the rise of video calling and videoconferencing.

Most smartphones have front-facing cameras, and smartphones and tablets are now affordable for most people, so it should be possible for the Deaf to use Sign over Skype, FaceTime and similar services.

A further goal is for computers to translate signs into words, just as voice-to-text software converts speech. Sign interpreters may one day be rendered obsolete by software.


In May 2008, a group of engineering students at Carnegie Mellon University built HandTalk, a glove made of very cheap parts. Its five sensors convert hand gestures into 65,500 possible digital data points, which are transmitted to a mobile phone via Bluetooth; a Java 2 Micro Edition (J2ME) application on the phone then translates the data into text and speech.
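To see how a handful of sensors can yield tens of thousands of distinct data points, here is a minimal sketch of the general idea: quantise each finger's sensor reading into a few levels, combine the five values into a single gesture code, and look that code up in a table of known signs. All names, thresholds and table entries below are illustrative assumptions, not the students' actual design.

```python
# Sketch of a flex-sensor glove pipeline (hypothetical, for illustration).

def quantise(reading, levels=10, max_value=1023):
    """Map a raw 10-bit sensor reading (0-1023) to one of `levels` buckets."""
    return min(reading * levels // (max_value + 1), levels - 1)

def gesture_code(readings, levels=10):
    """Combine five quantised finger readings into one integer code.

    With 10 levels per finger this gives 10**5 = 100,000 possible codes;
    the real glove reportedly distinguished about 65,500.
    """
    code = 0
    for r in readings:
        code = code * levels + quantise(r, levels)
    return code

# A tiny illustrative lookup table mapping codes to letters.
SIGN_TABLE = {
    gesture_code([0, 1023, 1023, 1023, 1023]): "A",
    gesture_code([1023, 0, 0, 0, 0]): "B",
}

def translate(readings):
    """Return the letter for a gesture, or '?' if it is not recognised."""
    return SIGN_TABLE.get(gesture_code(readings), "?")
```

In a real system the code would arrive over Bluetooth and the table would be replaced by a trained classifier, but the quantise-then-look-up structure conveys why a cheap glove can cover a large sign vocabulary.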

Today, sign-to-speech translation has still not been perfected. Voice-to-text has proved far easier: speech is a one-dimensional signal, while Sign is four-dimensional, with far more variables. But given that Moore's law sees the number of transistors on a chip double roughly every two years, anything is possible.


Article updated: 13/04/2018 9:00am


Disclaimer: BDN is not responsible for views expressed by journalists/writers and welcomes comments that open up debate, meaningful dialogue, emendation and anything else from which we can all benefit.

Edmund West is an autistic freelance journalist who has been writing articles since 2007. He also works with autistic adults and has an MA in history. He has written for several magazines, including Press Gazette, Wired, Military History Monthly and History Today.