Our history books are filled with the origins of spoken language in different parts of the world: we map where each language travelled and how it changed, how languages mixed and merged, and how new scripts were created. But what about non-verbal communicators who don’t depend on spoken languages? A language of hand movements and facial expressions: what exactly is its significance, and for whom?
A Little Bit Of History and A Little Bit Of Its People
You may be surprised to learn that the origins of sign language were never really recorded, but researchers believe it emerged as a tool of communication among early tribes, especially for the purpose of hunting. What might have started as basic symbols developed into complex signs capable of expressing ideas and thoughts both within and beyond one’s own group. One of the best examples of this is Plains Sign Talk, used by Native American tribes as a link language that allowed tribes in the southern part of North America to trade with those further north. Post colonization, this region later gave rise to arguably the most well-known sign language today: American Sign Language, or ASL. It originated in the first Deaf school in the USA, born from French Sign Language (LSF) but evolving into a unique blend of LSF and local home signs.
Today there are an estimated 300 known sign languages, used by both deaf and hearing communities (mostly family members of deaf or mute relatives). Though sign language is an integral part of the Deaf community, it is not used by every one of the roughly 70 million deaf people on this planet. Membership in the Deaf community is defined by a common set of beliefs and shared experiences, a Deaf culture that not all deaf people experience.
Signing Our Way To Animals
The utility of sign language crosses into the world of animals. We know that animals communicate amongst themselves using vocalisations and specific body movements, but can we find a better way to make them understand us through the language we have created?
In 1966, the Gardners decided to teach a baby chimpanzee American Sign Language, opening up a world of possibilities for further research on animals and their psyche. Washoe, the chimpanzee, learnt around 350 signs over the course of the project and even taught another chimpanzee ASL without being directed to do so. She formed close relationships with humans and showed a surprising level of empathy, expanding our knowledge of primates and our outlook on animals in general.
This suggests that sign language can bridge the gap between humans and species of comparable intelligence and ability: not by relying on operant conditioning, which limits what they can express, but by introducing sign language as the sole form of communication, much as a hearing child raised by deaf parents acquires it.
If Washoe, a sentient creature shaped by evolution, could pass signs on without being taught to do so, can our mechanical creations move beyond specific spoken languages and learn to handle the complexities of sign language?
Machines Meet Sign Language: Efforts for the Deaf and Mute Community
Technology to assist deaf persons was the starting point for inventions serving the signing community. In 1964, three deaf innovators, Robert Weitbrecht, James Marsters and Andrew Saks, developed the Teletypewriter (TTY), which sent typed messages over phone lines to be received on the other end. Unfortunately, it was not a perfect solution for ASL users who could not type or were not fluent in the written language.
A persistent problem with most assistive technologies is that many still rely on deaf and mute people’s ability to text on a phone rather than bringing sign language into the mainstream. Sign language is vital to the identity of the Deaf and Mute community, and to acknowledging its existence in a world built for the hearing majority.
Enter the Video Relay Service (VRS), which eases communication between sign language users and those who only understand spoken language. VRS places a Communications Assistant on a video link, who watches the signer and relays the message as speech to a standard caller. But while widely used video-conferencing applications provide an easy solution for most, they are not convenient on the go; one can’t always have an interpreter on hand.
For exactly this situation, Google’s free hand-tracking technology is enabling creators to build real-time sign language translators. Following the same formula, a Kenyan software engineer developed Sign-IO, a glove for sign language users. With 93% accuracy, the sensors in his gloves detect hand movements and convert them into audio over Bluetooth, enabling communication between hearing and deaf persons. On the other side of the globe, a Netherlands-based company created GnoSys, a translator app that uses only a camera to translate signed messages as quickly and efficiently as possible. GnoSys aims to help employers reach out to the Deaf community and provide better working conditions.
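To give a feel for how such translators work under the hood: hand-tracking software (such as Google's free MediaPipe) outputs the coordinates of landmarks on the hand, and a recogniser then matches those coordinates against known signs. The sketch below is a deliberately minimal, hypothetical illustration of that second step, a nearest-neighbour match against stored templates; the sign labels and all coordinate values are invented, and a real system would use 21 landmarks per hand and a trained model rather than two toy templates.

```python
import math

# Toy reference templates: sign label -> flattened landmark coordinates.
# These numbers are made up for illustration; a real translator would
# record templates (or train a classifier) from an actual signer.
TEMPLATES = {
    "hello": [0.1, 0.9, 0.3, 0.7, 0.5, 0.5, 0.7, 0.3],
    "thanks": [0.2, 0.2, 0.4, 0.4, 0.6, 0.6, 0.8, 0.8],
}

def classify(landmarks):
    """Return the template label closest to the observed landmarks,
    using plain Euclidean distance (nearest neighbour)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda label: dist(TEMPLATES[label], landmarks))

# A noisy observation that sits close to the "hello" template:
observed = [0.12, 0.88, 0.31, 0.69, 0.52, 0.48, 0.68, 0.33]
print(classify(observed))  # prints "hello"
```

Systems like Sign-IO follow the same basic pipeline, except the input vector comes from glove sensors rather than a camera, and the matched label is then spoken aloud via text-to-speech.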
A Mechanistic Future - Accessibility or Erasure?
Could new technology, sometime in the future, completely erase the need for sign language if we find better ways to communicate? Well, it all depends on what our goals are. Our existence is marked by our drive to form connections: how we work in groups and innovate together. Physical communication through our bodies, using hands, expressions and sounds, is intrinsic to us; any commonly used language becoming obsolete is therefore unlikely.
Sign language shouldn’t face this problem either, provided our aim is to advance assistive technologies rather than to erase differences by imposing a single standard language. Admittedly, for many nations this remains a distant hope, as the prime requirement for this technology is top-quality information and communication infrastructure.
Currently, only around 41 countries recognise sign language as an official language, and among those 41, only a few provide satisfactory access to it. Kenya is one example: Kenyan Sign Language was made official with the 2010 Constitution, yet even after promising to promote its use through Article 7(3)(b), the country still suffers from a lack of qualified interpreters and limited access. Nations that have not granted sign language official recognition rarely offer any reason for their inadequate inclusivity. The Indo-Pakistani Sign Language, for instance, used by approximately 7 million people in the Indian Subcontinent, has no official recognition in Pakistan, India or Bangladesh, and thus no significant action on making it accessible. Fortunately, in India at least the discourse is alive, with an ongoing petition by the National Association of the Deaf to make Indian Sign Language official.
Inclusive infrastructure, language, policies and attitudes ought to be built into the foundation of a society. Nations that give their sign languages an equal place alongside spoken languages, and offer them as electives in schools, will raise a generation aware of their significance. After all, it should be our priority to make the world just a little bit more comfortable for people who don’t “fit” the norm, and to create a level playing field for every resident of this planet.