Sign Language Interpreter Using HPR-SLI
People who are deaf or hard of hearing widely use sign language for communication; however, they find it difficult to communicate with people who do not understand sign language. In this paper, a Hand Pose Recognition-based Sign Language Interpreter (HPR-SLI) system is proposed, which translates sign language using Natural Language Processing (NLP). Sign languages are expressed primarily by the hands, combined with elements such as posture, body movement, and the head, eyebrows, eyes, face, and mouth, which are used in different combinations to convey information. The HPR-SLI system translates consecutive signs into readable English. It combines a hand pose recognition method based on an Artificial Neural Network (ANN) with a language processing stage based on a Text-to-Text Transfer Transformer (T5) model assisted by a tokenizer. The proposed system can be deployed as a mobile application or embedded in a dedicated physical device. The goal of this paper is to design and develop a system that significantly narrows the communication gap between people with speech or hearing impairments and the hearing world. HPR-SLI aims to help persons with disabilities (PwDs) overcome many of these disadvantages, enabling them to communicate, build better social relationships, and participate fully in society.
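The abstract describes a two-stage pipeline: an ANN classifies hand poses into sign glosses, which a T5 model then renders as fluent English. The sketch below illustrates only the first stage under stated assumptions: it takes 21 hand landmarks (42 flattened x, y coordinates, as produced by common hand-tracking libraries) and classifies them with a one-hidden-layer feed-forward network. The gloss vocabulary, layer sizes, and randomly initialized weights are all hypothetical placeholders; the paper's actual architecture and training procedure are not specified here.

```python
import math
import random

# Hypothetical gloss vocabulary; the paper does not specify one.
GLOSSES = ["HELLO", "THANK", "YOU", "PLEASE"]

def softmax(zs):
    """Convert raw scores into a probability distribution."""
    m = max(zs)
    es = [math.exp(z - m) for z in zs]
    s = sum(es)
    return [e / s for e in es]

class HandPoseANN:
    """One-hidden-layer feed-forward network over flattened hand landmarks.

    Weights are random placeholders; a real system would train them on
    labeled hand-pose data (e.g. via backpropagation).
    """
    def __init__(self, n_in=42, n_hidden=16, seed=0):
        rng = random.Random(seed)
        n_out = len(GLOSSES)
        self.W1 = [[rng.gauss(0, 0.1) for _ in range(n_hidden)] for _ in range(n_in)]
        self.b1 = [0.0] * n_hidden
        self.W2 = [[rng.gauss(0, 0.1) for _ in range(n_out)] for _ in range(n_hidden)]
        self.b2 = [0.0] * n_out

    def predict_proba(self, x):
        # Hidden layer: tanh activation over a linear map of the landmarks.
        h = [math.tanh(sum(xi * w for xi, w in zip(x, col)) + b)
             for col, b in zip(zip(*self.W1), self.b1)]
        # Output layer: linear map followed by softmax over glosses.
        logits = [sum(hi * w for hi, w in zip(h, col)) + b
                  for col, b in zip(zip(*self.W2), self.b2)]
        return softmax(logits)

    def predict(self, x):
        p = self.predict_proba(x)
        return GLOSSES[p.index(max(p))]

# The recognized gloss sequence (e.g. ["THANK", "YOU"]) would then be passed
# to the T5 model and its tokenizer to produce fluent English ("Thank you.").
ann = HandPoseANN()
landmarks = [0.0] * 42          # a dummy flattened hand pose
probs = ann.predict_proba(landmarks)
gloss = ann.predict(landmarks)
```

In a deployed system the second stage would feed recognized glosses to a sequence-to-sequence model such as T5, whose tokenizer maps the gloss strings to subword IDs before generation; that stage is omitted here to keep the sketch self-contained.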