
UTILIZING CONVOLUTIONAL NEURAL NETWORKS TO COMPREHEND SIGN LANGUAGE AND RECOGNIZE EMOTIONS

Chinu Singla, Wala Bin Subait, Hany Mahgoub, Abdulsamad Ebrahim Yahya, Muhammad S. A. Alzaidi, Muskaan Munjal, Wafa Sulaiman Almukadi and Nader Mohammad Aljawarn
Additional contact information
Chinu Singla: Department of Computer Science and Engineering, Punjabi University, Patiala, Punjab, India
Wala Bin Subait: Department of Language Preparation, Arabic Language Teaching Institute, Princess Nourah bint Abdulrahman University, P. O. Box 84428, Riyadh 11671, Saudi Arabia
Hany Mahgoub: Department of Computer Science, Applied College at Mahayil, King Khalid University, Abha, Saudi Arabia
Abdulsamad Ebrahim Yahya: Department of Information Technology, College of Computing and Information Technology, Northern Border University, Arar, Saudi Arabia
Muhammad S. A. Alzaidi: Department of English Language, College of Language Sciences, King Saud University, P. O. Box 145111, Riyadh, Saudi Arabia
Muskaan Munjal: Department of Computer Engineering, Thapar Institute of Engineering and Technology, Patiala, Punjab, India
Wafa Sulaiman Almukadi: Department of Software Engineering, College of Engineering and Computer Science, University of Jeddah, Jeddah, Saudi Arabia
Nader Mohammad Aljawarn: Department of Business Administration & HR, Faculty of Business, Jadara University, Irbid, Jordan

FRACTALS (fractals), 2024, vol. 32, issue 09n10, 1-13

Abstract: The inability to communicate verbally is widely acknowledged as a significant disability. The primary objective of this research is to create a practical system for individuals with hearing impairments, particularly those who rely on sign language as their primary means of communication. The system aims to enable deaf individuals to express themselves, communicate effectively, and be understood, which is otherwise challenging because most people are unfamiliar with sign language. By combining human gesture interpretation with motion capture, this technology can translate sign language into spoken language and vice versa. Although various methods exist for converting sign language into voice, none currently provides an entirely intuitive user interface. Our objective is to create a system that not only translates sign language but also integrates a natural user interface, thereby improving accessibility for individuals who are blind or visually impaired: by recognizing facial expressions and conveying the emotions behind words, it helps visually impaired individuals express themselves more effectively. Hence, in this project we tentatively aim to build a system that eases, to some extent, the lives of people who are blind, deaf, or unable to speak. Communication fully equivalent to that of hearing and sighted people may never be possible for differently abled individuals, but through our project we try to provide them with a tool that can help them experience a degree of normalcy when communicating with others.
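The abstract describes a convolutional neural network used to recognize sign-language gestures from image frames. As a minimal, illustrative sketch only (the paper's actual architecture, input resolution, and class set are not given here, so the 28x28 frame, single 3x3 filter, and hypothetical 26-sign classifier head are all assumptions), the core CNN pipeline — convolve, apply a nonlinearity, pool, then classify — can be written in NumPy:

```python
import numpy as np

# Illustrative only: untrained random weights and a random stand-in image,
# showing the generic CNN pipeline (convolve -> ReLU -> pool -> classify).

def conv2d(image, kernel):
    """'Valid' 2D cross-correlation, as implemented by most CNN frameworks."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that do not divide evenly."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
frame = rng.random((28, 28))            # stand-in for one preprocessed gesture frame
kernel = rng.standard_normal((3, 3))    # one (untrained) convolutional filter

features = max_pool(relu(conv2d(frame, kernel)))    # -> 13x13 feature map
weights = rng.standard_normal((26, features.size))  # hypothetical 26-sign classifier head
probs = softmax(weights @ features.ravel())         # probabilities over the sign classes
pred = int(probs.argmax())                          # index of the predicted sign
```

In a trained system, the filter and classifier weights would be learned from labeled gesture frames, and a real network would stack several such convolution/pooling layers; an analogous classifier head over face crops would serve the emotion-recognition component the abstract mentions.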

Keywords: Image-Recognition; Motion Capture; Natural Language Processing; Sign Language Converter; Voice Recognition (search for similar items in EconPapers)
Date: 2024

Downloads: (external link)
http://www.worldscientific.com/doi/abs/10.1142/S0218348X2540016X
Access to full text is restricted to subscribers

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:wsi:fracta:v:32:y:2024:i:09n10:n:s0218348x2540016x

DOI: 10.1142/S0218348X2540016X

FRACTALS (fractals) is currently edited by Tara Taylor

More articles in FRACTALS (fractals) from World Scientific Publishing Co. Pte. Ltd.
Bibliographic data for series maintained by Tai Tone Lim.

 
Page updated 2025-03-20
Handle: RePEc:wsi:fracta:v:32:y:2024:i:09n10:n:s0218348x2540016x