Real-time Hand Gesture Detection and Recognition for Human Computer Interaction

dc.contributor.author: Dardas, Nasser Hasan Abdel-Qader
dc.date.accessioned: 2012-11-08T14:51:08Z
dc.date.available: 2012-11-08T14:51:08Z
dc.date.created: 2012
dc.date.issued: 2012
dc.identifier.uri: http://hdl.handle.net/10393/23499
dc.identifier.uri: http://dx.doi.org/10.20381/ruor-6192
dc.description.abstract: This thesis focuses on bare-hand gesture recognition by proposing a new architecture for real-time vision-based hand detection, tracking, and gesture recognition for interaction with an application via hand gestures. The first stage of our system detects and tracks a bare hand against a cluttered background using face subtraction, skin detection, and contour comparison. The second stage recognizes hand gestures using bag-of-features and multi-class Support Vector Machine (SVM) algorithms. Finally, a grammar has been developed to generate gesture commands for application control. Our hand gesture recognition system consists of two steps: offline training and online testing. In the training stage, after extracting the keypoints of every training image with the Scale Invariant Feature Transform (SIFT), a vector quantization technique maps the keypoints of each training image into a fixed-dimensional histogram vector (bag-of-words) after K-means clustering. This histogram is treated as the input vector for a multi-class SVM to build the classifier. In the testing stage, the hand is detected in every frame captured from a webcam using our algorithm. The keypoints are then extracted from the small image containing the detected hand posture and fed into the cluster model, which maps them into a bag-of-words vector; this vector is fed into the multi-class SVM classifier to recognize the hand gesture. A second hand gesture recognition system was proposed using Principal Component Analysis (PCA). The most significant eigenvectors and the weights of the training images are determined. In the testing stage, the hand posture is detected in every frame using our algorithm. The small image containing the detected hand is then projected onto the most significant eigenvectors of the training images to form its test weights.
Finally, the minimum Euclidean distance between the test weights and the training weights of each training image is determined to recognize the hand gesture. Two applications of gesture-based interaction with a 3D gaming virtual environment were implemented. The first, an exertion videogame, uses a stationary bicycle as one of the main inputs for game playing; the user controls and directs left-right movement and shooting actions in the game with a set of hand gesture commands. In the second game, the user controls and directs a helicopter over the city with a set of hand gesture commands.
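The training pipeline summarized in the abstract (keypoint descriptors, a K-means visual vocabulary, then a fixed-length bag-of-words histogram) can be sketched in plain NumPy. This is a minimal illustration, not the thesis code: the random 128-D vectors stand in for real SIFT descriptors, the function names are invented for the example, and in the actual system the resulting histogram would be fed to a multi-class SVM classifier.

```python
import numpy as np

def build_vocabulary(descriptors, k=8, iters=10, seed=0):
    """Cluster descriptors with K-means to form a visual vocabulary."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # assign each descriptor to its nearest cluster center
        d = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned descriptors
        for j in range(k):
            pts = descriptors[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def bag_of_words(descriptors, centers):
    """Map one image's descriptors to a normalized fixed-length histogram."""
    d = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
    hist = np.bincount(d.argmin(axis=1), minlength=len(centers)).astype(float)
    return hist / hist.sum()

# toy stand-ins for 128-D SIFT descriptors pooled from training images
rng = np.random.default_rng(1)
train_desc = rng.normal(size=(200, 128))
vocab = build_vocabulary(train_desc, k=8)

# a "test image" with 40 descriptors -> one bag-of-words vector
vec = bag_of_words(rng.normal(size=(40, 128)), vocab)
print(vec.shape, round(vec.sum(), 6))   # (8,) 1.0
```

Every image, regardless of how many keypoints it yields, maps to the same 8-bin histogram, which is what lets a single multi-class SVM consume it as a fixed-dimensional input vector.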
dc.language.iso: en
dc.publisher: Université d'Ottawa / University of Ottawa
dc.subject: Posture recognition
dc.subject: Gesture recognition
dc.subject: Scale Invariant Feature Transform (SIFT)
dc.subject: K-means
dc.subject: Bag-of-features
dc.subject: Support Vector Machine (SVM)
dc.subject: Human-computer interaction
dc.title: Real-time Hand Gesture Detection and Recognition for Human Computer Interaction
dc.type: Thesis
dc.faculty.department: Science informatique et génie électrique / Electrical Engineering and Computer Science
dc.contributor.supervisor: Petriu, Emil
dc.contributor.supervisor: El Saddik, Abdulmotaleb
dc.embargo.terms: immediate
dc.degree.name: PhD
dc.degree.level: doctorate
dc.degree.discipline: Génie / Engineering
thesis.degree.name: PhD
thesis.degree.level: Doctoral
thesis.degree.discipline: Génie / Engineering
uottawa.department: Science informatique et génie électrique / Electrical Engineering and Computer Science
Collection: Thèses, 2011 - // Theses, 2011 -
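The PCA-based recognizer described in the abstract (project a detected hand image onto the most significant eigenvectors of the training set, then pick the training image with the minimum Euclidean distance in weight space) can be sketched as follows. This is a hedged NumPy illustration only: the tiny synthetic "posture" images, the two class labels, and the function names are invented for the example, not taken from the thesis.

```python
import numpy as np

def pca_train(images, n_components=4):
    """Compute the mean and the most significant eigenvectors of the
    flattened training images, plus each image's training weights."""
    X = images.reshape(len(images), -1).astype(float)
    mean = X.mean(axis=0)
    # rows of Vt are eigenvectors of the covariance, sorted by variance
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eigvecs = Vt[:n_components]
    weights = (X - mean) @ eigvecs.T        # one weight vector per image
    return mean, eigvecs, weights

def pca_recognize(image, mean, eigvecs, train_weights, labels):
    """Project a detected hand image and return the label of the training
    image at minimum Euclidean distance in weight space."""
    w = (image.reshape(-1).astype(float) - mean) @ eigvecs.T
    dists = np.linalg.norm(train_weights - w, axis=1)
    return labels[dists.argmin()]

# toy data: six 8x8 "hand posture" images from two well-separated classes
rng = np.random.default_rng(0)
imgs = np.concatenate([rng.normal(0, 1, (3, 8, 8)),
                       rng.normal(5, 1, (3, 8, 8))])
labels = ["fist", "fist", "fist", "palm", "palm", "palm"]

mean, eigvecs, weights = pca_train(imgs)
pred = pca_recognize(rng.normal(5, 1, (8, 8)), mean, eigvecs, weights, labels)
print(pred)
```

Because the class separation dominates the variance of the toy data, the leading eigenvectors capture it, and the test image drawn from the second distribution lands nearest the "palm" training weights.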
