Ru, J.T.S. and Sebastian, P. (2023) Real-Time American Sign Language (ASL) Interpretation. In: UNSPECIFIED.
Full text not available from this repository.

Abstract
The majority of the public does not know how to communicate in sign language, which creates a communication barrier between the public and the Deaf community. A method to bridge this communication gap is therefore required. In this paper, a computer vision solution is proposed that uses the MediaPipe library and a Long Short-Term Memory (LSTM) network to identify twelve (12) motion-based American Sign Language (ASL) phrases. At present, there is no publicly available sign language dataset that contains motion-based phrases with hundreds of videos per phrase. Thus, this paper also aims to collect a motion-based ASL phrase dataset for future development of sign language interpretation. The MediaPipe-LSTM model has been trained to 97% training accuracy and 79% test accuracy. Finally, real-time deployment of the deep learning model is achieved by deploying it to a Flutter-based mobile application. © 2023 IEEE.
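The abstract describes a pipeline in which MediaPipe landmarks extracted per video frame are fed as a sequence into an LSTM classifier over the twelve phrases. The sketch below illustrates that kind of pipeline under stated assumptions; the sequence length (30 frames), layer widths, and feature layout are illustrative choices, not values reported by the authors, and the Flutter deployment step is not shown.

```python
# Minimal sketch of a MediaPipe + LSTM sign-phrase classifier, assuming
# MediaPipe Holistic landmarks and a Keras stacked-LSTM model. Hyperparameters
# are placeholders, not the paper's reported configuration.
import numpy as np
import cv2
import mediapipe as mp
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

NUM_PHRASES = 12     # number of motion-based ASL phrases (from the abstract)
SEQ_LEN = 30         # assumed number of frames sampled per phrase clip
NUM_FEATURES = 1662  # pose 33*4 + face 468*3 + two hands at 21*3 each

mp_holistic = mp.solutions.holistic


def detect_frame(frame_bgr, holistic):
    """Run MediaPipe Holistic on one BGR frame and return its landmark results."""
    return holistic.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))


def extract_keypoints(results):
    """Flatten Holistic landmarks into a single feature vector for one frame."""
    pose = np.array([[lm.x, lm.y, lm.z, lm.visibility]
                     for lm in results.pose_landmarks.landmark]).flatten() \
        if results.pose_landmarks else np.zeros(33 * 4)
    face = np.array([[lm.x, lm.y, lm.z]
                     for lm in results.face_landmarks.landmark]).flatten() \
        if results.face_landmarks else np.zeros(468 * 3)
    lh = np.array([[lm.x, lm.y, lm.z]
                   for lm in results.left_hand_landmarks.landmark]).flatten() \
        if results.left_hand_landmarks else np.zeros(21 * 3)
    rh = np.array([[lm.x, lm.y, lm.z]
                   for lm in results.right_hand_landmarks.landmark]).flatten() \
        if results.right_hand_landmarks else np.zeros(21 * 3)
    return np.concatenate([pose, face, lh, rh])


def build_model():
    """Stacked LSTM over a sequence of per-frame keypoint vectors."""
    model = Sequential([
        LSTM(64, return_sequences=True, activation='relu',
             input_shape=(SEQ_LEN, NUM_FEATURES)),
        LSTM(128, return_sequences=False, activation='relu'),
        Dense(64, activation='relu'),
        Dense(NUM_PHRASES, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['categorical_accuracy'])
    return model


# Usage sketch: collect SEQ_LEN keypoint vectors per clip, stack them into an
# array of shape (num_clips, SEQ_LEN, NUM_FEATURES), and train/predict with the
# model above. Per-frame extraction would look like:
# with mp_holistic.Holistic(min_detection_confidence=0.5) as holistic:
#     results = detect_frame(frame, holistic)
#     keypoints = extract_keypoints(results)   # shape (NUM_FEATURES,)
if __name__ == "__main__":
    build_model().summary()
```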
| Item Type: | Conference or Workshop Item (UNSPECIFIED) |
|---|---|
| Additional Information: | cited By 0; Conference of 2nd IEEE International Conference on Vision Towards Emerging Trends in Communication and Networking Technologies, ViTECoN 2023; Conference Date: 5 May 2023 through 6 May 2023; Conference Code: 190031 |
| Uncontrolled Keywords: | Mobile computing; Technology transfer; American Sign Language; Bridge communications; Communication barriers; Deep learning; MediaPipe; Memory algorithms; Mobile applications; Real-time; Sign language; Sign language interpretation; Long Short-Term Memory |
| Depositing User: | Mr Ahmad Suhairi UTP |
| Date Deposited: | 04 Jun 2024 14:11 |
| Last Modified: | 04 Jun 2024 14:11 |
| URI: | https://khub.utp.edu.my/scholars/id/eprint/19192 |