My Computer Reads Hand Signs! A Computer Vision Application to Detect Hand Gestures in American Sign Language
Format
Oral Presentation
Faculty Mentor Name
Venkittaraman Pallipuram Krishnamani
Faculty Mentor Department
Electrical and Computer Engineering
Abstract/Artist Statement
This research presents a computer-vision application that automatically recognizes hand gestures made in American Sign Language (ASL). The primary focus of this application is detecting hand signs for the letters of the English alphabet, excluding J and Z, whose signs involve hand motion. To achieve successful hand-gesture recognition, the application is divided into three image-processing stages. The first stage takes the hand-gesture image as input and separates the hand from the background via background subtraction. The second stage runs the connected-components algorithm on the resulting foreground to isolate the region of interest. The third and final stage performs principal component analysis (PCA) on that region to generate shape-specific statistical parameters, including the covariance matrix, eigenvalues, and eigenvectors. These statistical parameters are used to train the computer to recognize specific hand gestures. Our preliminary implementation successfully detects the letters A and C with 0% error, while other letters require additional tuning for accuracy. We aim to extend the application with artificial neural networks (ANNs) to recognize a wider variety of hand signs. Future work also includes accurate, real-time analysis of hand signs, classification of signs that incorporate hand movements, and development of a complete, user-friendly interface for the application.
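The three-stage pipeline described in the abstract can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions (grayscale images, plain frame differencing with a fixed threshold, 4-connectivity); the function names and parameters are hypothetical and do not come from the authors' implementation.

```python
import numpy as np

def subtract_background(frame, background, thresh=30):
    """Stage 1: separate the hand from the background by frame differencing."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return (diff > thresh).astype(np.uint8)  # binary foreground mask

def largest_component(mask):
    """Stage 2: connected components; keep the largest 4-connected blob."""
    labels = np.zeros_like(mask, dtype=int)
    h, w = mask.shape
    sizes, current = {}, 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                labels[i, j] = current
                size = 0
                while stack:            # iterative flood fill
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            stack.append((ny, nx))
                sizes[current] = size
    if not sizes:
        return mask
    best = max(sizes, key=sizes.get)
    return (labels == best).astype(np.uint8)

def shape_statistics(mask):
    """Stage 3: PCA over foreground pixel coordinates ->
    covariance matrix, eigenvalues, eigenvectors (shape descriptors)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([ys, xs], axis=1).astype(float)
    pts -= pts.mean(axis=0)                  # center the point cloud
    cov = np.cov(pts, rowvar=False)          # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    return cov, eigvals, eigvecs
```

As a usage sketch, a synthetic "hand" can be pushed through all three stages; the PCA eigenvalues then describe the blob's elongation and orientation, which is the kind of shape-specific statistic the abstract refers to:

```python
background = np.zeros((20, 20), dtype=np.uint8)
frame = background.copy()
frame[5:15, 8:12] = 200     # elongated blob standing in for the hand
frame[2, 2] = 200           # isolated noise pixel

mask = subtract_background(frame, background)
hand = largest_component(mask)               # noise pixel is discarded
cov, eigvals, eigvecs = shape_statistics(hand)
```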
Location
DeRosa University Center, Room 211
Start Date
29-4-2017 10:00 AM
End Date
29-4-2017 10:20 AM