Hearing is one of the most essential human senses, yet not everyone possesses this gift. According to the World Health Organization, approximately 360 million people worldwide live with disabling hearing loss. Sign language is typically used to aid communication for deaf and hearing-impaired people; it conveys meaning through signs and gestures. The American Sign Language (ASL) alphabet consists of 26 finger-spelled letters, 24 of which are static and two of which, 'J' and 'Z', are dynamic. However, because few hearing people know sign language, it is difficult for deaf people to communicate with them. The lack of a written representation for ASL also makes it difficult to do something as commonplace as looking up an unknown word in a dictionary: most printed dictionaries organize ASL signs (represented in drawings or pictures) by their nearest English translation, so unless one already knows the meaning of a sign, dictionary look-up is not a simple proposition.
Gesture recognition can be defined as the identification of meaningful expressions or motions performed by a human being, involving movement of the hands, arms, face, head, and body. Recognition of the ASL alphabet and gestures would not only benefit ASL users but could also provide solutions for natural human-computer and human-robot interaction in many applications. In this paper, we propose a method for recognizing ASL gestures and interpreting them to text using machine learning.
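To make the idea concrete, the core recognition step can be sketched as a classifier that maps a feature vector extracted from a hand image to a letter. The sketch below is a minimal, purely illustrative nearest-neighbour version: the reference vectors and feature values are invented placeholders, not real measurements, and the actual system described in this paper would use a learned model rather than hand-coded references.

```python
import math

# Illustrative sketch of static-letter classification: each ASL letter is
# represented by a reference feature vector (e.g. normalized finger-extension
# values extracted from a hand image), and an unknown gesture is labeled with
# the nearest reference letter. All numbers below are made-up placeholders.
REFERENCES = {
    "A": [0.1, 0.9, 0.9, 0.9, 0.9],   # thumb out, four fingers curled
    "B": [0.9, 0.1, 0.1, 0.1, 0.1],   # fingers extended, thumb tucked
    "C": [0.5, 0.5, 0.5, 0.5, 0.5],   # curved, half-open hand
}

def classify(features):
    """Return the letter whose reference vector is nearest in Euclidean distance."""
    return min(REFERENCES, key=lambda letter: math.dist(features, REFERENCES[letter]))

print(classify([0.88, 0.12, 0.10, 0.15, 0.10]))  # nearest reference is "B"
```

In the real system, the hand-coded reference table would be replaced by a model trained on labeled gesture images, but the input-to-label structure of the problem is the same.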
The complexity of learning sign language, together with its limited use among the hearing population, makes it hard for hearing people to communicate with the deaf. Because ASL has no written representation, even a task as commonplace as looking up an unknown sign in a dictionary is difficult: printed dictionaries organize signs by their nearest English translation, so one must already know a sign's meaning to find it. There is therefore a need for an easy and understandable medium of communication between deaf, mute, and hearing people.
1. To study existing systems through data collection and analysis, and to generate requirements.
2. To design the architecture of the sign language interpretation system.
3. To implement the real-time sign language interpretation system.
4. To test the system's sign language inputs and interpreted outputs through verification and validation.
15/U/7566/PS
15/U/3044/PS
15/U/13323/PS
15/U/4965/PS