AI Could Advance Inclusivity by Interpreting Sign Language in Real Time

Researchers in the US have conducted a first-of-its-kind study focused on recognising American Sign Language alphabet gestures using computer vision.

 

The research could play a key role in breaking down communication barriers and ensuring more inclusive interactions.

The researchers developed a custom dataset of 29,820 static images of American Sign Language hand gestures.

Using MediaPipe, each image was annotated with 21 key landmarks on the hand, providing detailed spatial information about its structure and position.
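
As a rough illustration of that annotation step, the sketch below uses the open-source MediaPipe Hands solution to pull the 21 landmarks out of a single static image. It is not the authors' code; the file name and the way the output is handled are assumptions for illustration only.

```python
# Minimal sketch of per-image hand-landmark annotation with MediaPipe Hands.
# Not the study's code; paths and output handling are illustrative only.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_landmarks(image_path: str):
    """Return the 21 (x, y, z) hand landmarks for one static image, or None."""
    image = cv2.imread(image_path)
    if image is None:
        return None
    # static_image_mode=True treats every image independently,
    # as with a dataset of still photographs.
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    hand = results.multi_hand_landmarks[0]
    # Landmark coordinates are normalised to the image width and height.
    return [(lm.x, lm.y, lm.z) for lm in hand.landmark]

# Hypothetical file name standing in for one of the 29,820 dataset images.
landmarks = extract_landmarks("asl_letter_a_0001.jpg")
```

In a pipeline like this, the landmark coordinates (or bounding boxes derived from them) would then be written out as labels for the detector to train on.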

These annotations played a critical role in enhancing the precision of YOLOv8, the deep learning model the researchers trained, by allowing it to better detect subtle differences in hand gestures.

Results of the study reveal that by leveraging this detailed hand pose information, the model achieved a more refined detection process, accurately capturing the complex structure of American Sign Language gestures.

Combining MediaPipe for hand movement tracking with YOLOv8 for training resulted in a powerful system for recognising American Sign Language alphabet gestures with high accuracy.
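
For readers who want a feel for what such a pipeline can look like in practice, here is a minimal sketch using the open-source Ultralytics YOLOv8 package. The dataset configuration file, model size and hyperparameters are illustrative assumptions, not the settings reported in the study.

```python
# Minimal sketch of fine-tuning and evaluating a YOLOv8 detector on an
# ASL alphabet dataset. Not the study's code; all settings are placeholders.
from ultralytics import YOLO

# Start from a small pretrained detection checkpoint.
model = YOLO("yolov8n.pt")

# "asl_alphabet.yaml" is a hypothetical dataset config listing the train/val
# image folders and the letter classes in standard YOLO format.
model.train(data="asl_alphabet.yaml", epochs=100, imgsz=640, batch=16)

# Validation reports precision, recall, mAP@0.5 and mAP@0.5:0.95.
metrics = model.val()
print(metrics.box.map50, metrics.box.map)
```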

First author Bader Alsharif, a Ph.D. candidate in the Florida Atlantic University (FAU) Department of Electrical Engineering and Computer Science, said: “Combining MediaPipe and YOLOv8, along with fine-tuning hyperparameters for the best accuracy, represents a groundbreaking and innovative approach.

“This method hasn’t been explored in previous research, making it a new and promising direction for future advancements.”

Findings show that the model achieved 98 per cent accuracy, 98 per cent recall (the ability to correctly identify gestures) and an overall performance (F1) score of 99 per cent.

It also achieved a mean Average Precision (mAP) of 98 per cent and a more detailed mAP50-95 score of 93 per cent, highlighting its strong reliability and precision in recognising American Sign Language gestures.[…]
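
For context, the snippet below shows how these headline detection metrics are conventionally defined, with F1 as the harmonic mean of precision and recall and mAP averaging precision over intersection-over-union thresholds. It is a generic reference, not the study's evaluation code.

```python
# Generic definitions of the quoted metrics, for reference only.
def precision(tp: int, fp: int) -> float:
    """Share of predicted gestures that are correct."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Share of actual gestures that the model detects."""
    return tp / (tp + fn)

def f1_score(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)
```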

Read more: www.htworld.co.uk

The post AI Could Advance Inclusivity by Interpreting Sign Language in Real Time appeared first on SwissCognitive | AI Ventures, Advisory & Research.
