Abstract

Communication is vital for humans. People with hearing or speech disabilities need a way to communicate with the rest of society, and vice versa. This paper presents a novel methodology for classifying the English alphabet shown via hand gestures in the Indian Sign Language (ISL) using the MediaPipe Hands API released by Google. The API detects 21 landmarks on each hand along with their x, y and z coordinates in 3D space. Because proper ISL datasets are scarce on the internet, we first created a dataset of 15,000 samples per English character, each sample consisting of the coordinates of the 21 landmarks recognized by the MediaPipe Hands API. The literature shows that the MediaPipe API has been used effectively to predict the American Sign Language and other foreign sign languages; the novelty of our proposed work lies in applying the same API to the Indian Sign Language. We present a comparative analysis of different classification algorithms, including Support Vector Machine (SVM), Random Forest, k-nearest neighbours (KNN) and Decision Tree, in terms of accuracy, with the highest accuracy among them being 99%. It is relevant to mention in this connection that classifying the Indian Sign Language (ISL) using the MediaPipe API is faster than other conventional methods and outperforms them in computational capability. This model can be used in web applications, mobile applications, desktop applications and many other settings.
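The pipeline the abstract describes can be illustrated with a minimal sketch: each hand sample becomes a 63-dimensional feature vector (21 MediaPipe landmarks, each with x, y and z), which is then fed to a classifier such as KNN. The synthetic data and helper names below are hypothetical stand-ins for the authors' ISL dataset, not their actual implementation:

```python
import math
import random

# One MediaPipe Hands detection yields 21 landmarks, each with (x, y, z),
# giving a 63-dimensional feature vector per hand sample.
NUM_LANDMARKS = 21
DIM = NUM_LANDMARKS * 3

def make_sample(center, spread=0.05):
    """Generate one synthetic landmark vector clustered around `center`.

    A stand-in for a real MediaPipe detection (hypothetical data).
    """
    return [center + random.uniform(-spread, spread) for _ in range(DIM)]

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training samples."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

random.seed(0)
# Two toy classes standing in for two ISL letters.
train = [(make_sample(0.2), "A") for _ in range(20)] + \
        [(make_sample(0.8), "B") for _ in range(20)]

print(knn_predict(train, make_sample(0.2)))  # prints "A"
print(knn_predict(train, make_sample(0.8)))  # prints "B"
```

In practice the feature vectors would come from `mediapipe.solutions.hands` on camera frames, and a library classifier (e.g. scikit-learn's SVM or Random Forest) would replace this hand-rolled KNN; the sketch only shows the shape of the data flow.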
