Development of a Notation Method for Hand Gesture Vocabularies based on a 3D Free Hand Gesture Taxonomy

Authors
최은정
Date Issued
2015
Publisher
Pohang University of Science and Technology (POSTECH)
Abstract
In this study, I developed a notation method that can organize and notate hand gestures in a systematic manner. The specific objectives were as follows: 1) development of a taxonomy of hand gestures and a notation method; 2) development of a text-based notation method for hand gestures and its verification; and 3) development of learning models for pattern classification of various hand gesture vocabularies matched with one command, and verification of the models.

To organize hand gestures in a systematic manner, I defined the basic elements of hand gestures and then derived sub-elements of these basic elements by analyzing related studies. Subsequently, I developed a 3D free hand gesture taxonomy based on the elements and sub-elements, and devised a notation method based on a combination of the elements, each matched with an integer code for easy notation. In this study, a hand gesture is defined as a successive combination of posture(s) and movement(s). A posture is a particular hand shape represented by a single image and consists of seven elements: number of hands involved, hand type, spatial relationship between the hands (only for two-handed gestures), hand location, hand shape, hand orientation, and arm posture. Movement refers to the elements that make gestures dynamic and consists of two elements: path movement and wrist movement.

A text-based notation method including all of the elements in the taxonomy was devised so that hand gestures can be notated by hand and decoded from the notated codes. Through an additional gesture-elicitation experiment with users and a pilot test, further elements were added to the taxonomy and the notation method was modified to include them. Finally, the usefulness of the notation method was verified by training participants to notate hand gestures and by asking another set of participants to decode the notated gestures. Except for arm posture (shoulder angle (SA) and elbow angle (EA)), Fleiss' kappa for all of the elements was 1, which indicates almost perfect agreement among participants. For SA and EA of both hands, Fleiss' kappa ranged from 0.59 (moderate agreement) to 0.99 (almost perfect agreement). These results show that the same hand gestures can be notated and decoded consistently across users. However, this research dealt with only 11 commands (22 gestures) of a music player, so further experiments with additional commands are needed to verify the taxonomy and notation method. A future study should also consider additional elements, such as the size and speed of hand gestures, which were not addressed here.

Based on the notation method, I developed learning models to classify patterns of gesture vocabularies for smart-home appliances and then verified the learning models. First, 1,200 hand gestures for a total of 24 commands of 7 products (TV, light, air conditioner, faucet, window(s), blind(s), and door(s)) were elicited from 70 participants, and a hand gesture library was established from these user-defined gestures. Second, the notation method was modified so that the user-defined gestures could be used as input data for computer analysis, and the gestures in the library were notated using the modified method. Third, a hand-gesture learning model for each target product was developed using artificial neural networks. Finally, the developed models were validated experimentally in tests on a new group of users.
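
To make the integer-coded notation concrete, the sketch below shows how one gesture might be written as a vector of element codes. The nine element names follow the taxonomy described above, but the specific code values, the example gesture, and the helper function notate() are hypothetical illustrations, not the coding tables from the thesis.

    # Illustrative sketch only: element names follow the taxonomy above, but the
    # integer code values and the example gesture are hypothetical, not the
    # thesis's actual coding tables.
    POSTURE_ELEMENTS = [
        "num_hands",             # e.g. 1 = one-handed, 2 = two-handed
        "hand_type",             # e.g. 1 = right, 2 = left, 3 = both
        "spatial_relationship",  # relation between hands (two-handed gestures only)
        "hand_location",         # coarse location relative to the body
        "hand_shape",            # e.g. 1 = fist, 2 = open palm, 3 = index point
        "hand_orientation",      # e.g. 1 = palm up, 2 = palm down, 3 = palm forward
        "arm_posture",           # shoulder angle (SA) / elbow angle (EA) category
    ]
    MOVEMENT_ELEMENTS = [
        "path_movement",         # e.g. 0 = none, 1 = left-right, 2 = up-down, 3 = circle
        "wrist_movement",        # e.g. 0 = none, 1 = flexion/extension, 2 = rotation
    ]

    def notate(posture_codes, movement_codes):
        """Combine the integer codes of one posture and one movement into a flat vector."""
        assert len(posture_codes) == len(POSTURE_ELEMENTS)
        assert len(movement_codes) == len(MOVEMENT_ELEMENTS)
        return list(posture_codes) + list(movement_codes)

    # A hand gesture is a successive combination of posture(s) and movement(s),
    # so it can be recorded as a sequence of such code vectors.
    swipe_right = [notate([1, 1, 0, 2, 2, 3, 1], [1, 0])]
    print(swipe_right)   # [[1, 1, 0, 2, 2, 3, 1, 1, 0]]

Writing every gesture as a fixed-length vector of integer codes in this way is what later allows the notated gestures to be fed directly into pattern classifiers.
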
The total hit rates were more than 98% for all of the models, using 75% of the data as seen data and 25% as unseen data. These results show that the selected elements, together with the notation method, are effective for classifying various gestures, and that one command can be mapped to several gesture vocabularies.

Due to the inaccuracy of currently available gesture-recognition equipment, the researcher notated the user-defined gestures by observation only. The suggested models assume 100% accuracy of the equipment, whereas in reality the final hit rate of the models depends on the accuracy of the equipment. The models also did not consider the speed of the hand(s), because the gestures were notated visually. Future studies should address these issues to further verify the models.

The hand gesture taxonomy and the notation method establish a foundation for a systematic approach to organizing hand gesture vocabularies, and make the following contributions. First, this research provides a thorough process for developing the hand gesture taxonomy and the notation method, so that further research on improving them can be conducted more easily. Second, textual records help the experimenter find what he or she wants to identify at a glance, without any extra system such as a video player or a computer; in this respect, the notation method can be seen as a complement or an alternative to video recording. Third, pattern classification by computer is possible because the notation method is based on combinations of elements matched with integer codes. In addition, if all of the elements suggested in the taxonomy can be recognized by equipment, the notation method could be useful for enhancing the recognition rate of hand gestures, because it is based on a small number of elements that can be combined to form a large number of hand gesture vocabularies (scalability). In short, the results of this study may provide a good starting point for further research on organizing and designing hand gesture vocabularies.
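
As a rough illustration of the classification step, the sketch below trains one per-product classifier on integer-coded gesture vectors of the kind shown earlier, with a 75%/25% seen/unseen split. The thesis specifies artificial neural networks but not a particular toolkit; the use of scikit-learn's MLPClassifier, the network size, and the randomly generated stand-in data are assumptions, so this sketch will not reproduce the reported hit rates.

    # Minimal per-product classifier sketch, assuming 9 integer-coded elements per
    # gesture as input. scikit-learn, the network size, and the synthetic data are
    # assumptions; real user-defined gestures from the library would replace X and y.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Synthetic stand-in for one product's slice of the gesture library
    # (e.g. a few TV commands), NOT the 1,200 gestures collected in the study.
    n_samples, n_elements, n_commands = 200, 9, 4
    X = rng.integers(0, 6, size=(n_samples, n_elements)).astype(float)
    y = rng.integers(0, n_commands, size=n_samples)

    # 75% "seen" data for training, 25% "unseen" data for evaluation.
    X_seen, X_unseen, y_seen, y_unseen = train_test_split(
        X, y, test_size=0.25, random_state=0)

    model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    model.fit(X_seen, y_seen)

    print("hit rate (seen):  ", model.score(X_seen, y_seen))
    print("hit rate (unseen):", model.score(X_unseen, y_unseen))

Training one model per target product, as described above, keeps each classifier's output space limited to that product's own commands.
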
URI
http://postech.dcollection.net/jsp/common/DcLoOrgPer.jsp?sItemId=000001910325
https://oasis.postech.ac.kr/handle/2014.oak/92798
Article Type
Thesis
Files in This Item:
There are no files associated with this item.


