
Visual Recognition of a Dynamic Arm Gesture Language for Human-Robot and Inter-Robot Communication

dc.contributor.author: Abid, Muhammad Rizwan
dc.contributor.supervisor: Petriu, Emil
dc.date.accessioned: 2015-08-28T17:39:15Z
dc.date.available: 2015-08-28T17:39:15Z
dc.date.created: 2015
dc.date.issued: 2015
dc.degree.discipline: Génie / Engineering
dc.degree.level: doctorate
dc.degree.name: PhD
dc.description.abstract: This thesis presents a novel Dynamic Gesture Language Recognition (DGLR) system for human-robot and inter-robot communication. We developed and implemented an experimental setup consisting of a humanoid robot/android able to recognize and execute in real time all the arm gestures of the Dynamic Gesture Language (DGL) in a similar way as humans do. Our DGLR system comprises two main subsystems: an image processing (IP) module and a linguistic recognition system (LRS) module. The IP module recognizes individual DGL gestures. In this module, we use a bag-of-features (BOF) and local-part-model approach for dynamic gesture recognition from images. Dynamic gesture classification is conducted using the BOF representation and a nonlinear support-vector-machine (SVM) method, while the multiscale local part model preserves the temporal context. The IP module was tested using two databases: one consisting of images of a human performing a series of dynamic arm gestures under different environmental conditions, and a second consisting of images of an android performing the same series of arm gestures. The LRS module uses a novel formal-grammar approach to accept DGL-wise valid sequences of dynamic gestures and reject invalid ones. The LRS consists of two subsystems: one using a Linear Formal Grammar (LFG) to derive the valid sequence of dynamic gestures, and another using a Stochastic Linear Formal Grammar (SLFG) to occasionally recover gestures that were unrecognized by the IP module. Experimental results have shown that the DGLR system had a slightly better overall performance when recognizing gestures made by a human subject (98.92% recognition rate) than those made by the android (97.42% recognition rate).
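The abstract describes a two-stage architecture: individual gestures are classified from a bag-of-features representation with a nonlinear SVM, and the recognized gesture sequence is then checked against a linear formal grammar. The sketch below illustrates that pipeline under simplifying assumptions; it is not the thesis implementation, and the gesture names, descriptor inputs, and grammar rules are purely illustrative.

```python
# Minimal sketch (assumed, not from the thesis) of the two-stage pipeline:
# bag-of-features + nonlinear SVM for single gestures, then a toy
# right-linear grammar check over the recognized gesture sequence.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def bof_histogram(descriptors, codebook):
    """Quantize local descriptors against a visual codebook (bag-of-features)."""
    words = codebook.predict(descriptors)
    hist, _ = np.histogram(words, bins=np.arange(codebook.n_clusters + 1))
    return hist / max(hist.sum(), 1)  # L1-normalized word histogram

def train(train_descriptors, train_labels, vocab_size=200):
    """train_descriptors: list of (n_i, d) arrays, e.g. 3D-HOG-like local features."""
    codebook = KMeans(n_clusters=vocab_size, n_init=10).fit(np.vstack(train_descriptors))
    X = np.array([bof_histogram(d, codebook) for d in train_descriptors])
    clf = SVC(kernel="rbf", C=10.0).fit(X, train_labels)  # nonlinear SVM classifier
    return codebook, clf

# Toy right-linear grammar over hypothetical gesture symbols:
#   S -> start A ;  A -> left A | right A | stop
def sequence_is_valid(symbols):
    """Accept only gesture sequences derivable from the toy grammar above."""
    if not symbols or symbols[0] != "start" or symbols[-1] != "stop":
        return False
    return all(s in ("left", "right") for s in symbols[1:-1])
```

In this sketch the grammar stage simply accepts or rejects a sequence; the stochastic variant described in the abstract would instead assign probabilities to candidate continuations, allowing a gesture missed by the image-processing stage to be recovered from context.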
dc.faculty.department: Science informatique et génie électrique / Electrical Engineering and Computer Science
dc.identifier.uri: http://hdl.handle.net/10393/32800
dc.identifier.uri: http://dx.doi.org/10.20381/ruor-4160
dc.language.iso: en
dc.publisher: Université d'Ottawa / University of Ottawa
dc.subject: dynamic arm gesture language
dc.subject: 3D HOG descriptor
dc.subject: bag-of-features (BOF)
dc.subject: local part model
dc.subject: formal language
dc.subject: linear formal grammar
dc.subject: stochastic linear formal grammar
dc.subject: human-robot communication
dc.subject: inter-robot communication
dc.title: Visual Recognition of a Dynamic Arm Gesture Language for Human-Robot and Inter-Robot Communication
dc.type: Thesis
thesis.degree.discipline: Génie / Engineering
thesis.degree.level: Doctoral
thesis.degree.name: PhD
uottawa.department: Science informatique et génie électrique / Electrical Engineering and Computer Science

Files

Original bundle

Name: Abid_Muhammad_Rizwan_2015_thesis.pdf
Size: 9.46 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 4.07 KB
Format: Item-specific license agreed upon to submission