A PROJECT REPORT ON SIGN LANGUAGE TRANSLATOR

Authors

  • Duvvalla Rahul Raj
  • Mrs. M Anusha

Keywords:

education, healthcare, mobile computing and AI

Abstract

The Sign Language Translator mobile application aims to facilitate seamless communication 
between individuals using sign language and those unfamiliar with it. The core of the system leverages 
computer vision and machine learning algorithms to accurately recognize and translate sign language 
gestures into readable text or audible speech in real time. The application captures hand movements using 
a smartphone camera, processing them through deep learning models trained on diverse datasets of sign 
language gestures. 
The system identifies the position, orientation, and motion of the hands, analyzing them frame by 
frame to construct accurate translations of words or sentences. It supports various sign languages, offering 
a flexible and inclusive communication tool for different user needs. The application also incorporates a 
user-friendly interface designed for ease of use, providing features such as customizable language 
preferences and real-time feedback on gesture recognition accuracy. 
This project aims to address the communication barriers faced by individuals with hearing or 
speech impairments, promoting inclusivity in both personal and professional interactions. By leveraging 
advancements in mobile computing and AI, the application offers an innovative solution that can be used 
in real-world scenarios such as education, healthcare, and daily communication.
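
As a rough illustration of the frame-by-frame pipeline described above, the sketch below assumes MediaPipe Hands for landmark extraction and a hypothetical pre-trained gesture classifier ("model"); the report itself does not name the libraries or model architecture used.

# Minimal sketch (not the authors' implementation) of the per-frame recognition
# loop, assuming MediaPipe Hands for landmark extraction and a hypothetical
# trained classifier "model".
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def extract_landmarks(frame, hands):
    """Return a flat array of 21 (x, y, z) hand landmarks, or None if no hand is found."""
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    points = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in points]).flatten()

def run(model):
    # The real application uses the smartphone camera; device index 0 stands in here.
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            features = extract_landmarks(frame, hands)
            if features is not None:
                # "model" is a stand-in for the trained sign-language classifier;
                # its prediction would be shown as text or spoken via TTS.
                label = model.predict(features[np.newaxis, :])[0]
                print(label)
    cap.release()

In practice, the per-frame predictions would be smoothed and aggregated over time so that sequences of gestures form words or sentences before being displayed as text or passed to a text-to-speech engine.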


Published

02-09-2024

How to Cite

A PROJECT REPORT ON SIGN LANGUAGE TRANSLATOR. (2024). International Journal of Engineering Research and Science & Technology, 20(4), 35-41. https://ijerst.org/index.php/ijerst/article/view/444