Doctor of Philosophy
John A. Moya
Gesture control has been in development for over three decades but, although matured, has yet to see the level of adoption and deployment in interactive technology achieved by other modalities such as voice control. Unstandardized implementations and fragmented, awkward gesture commands have been hurdles to wider adoption of gesture control. However, advances in computer vision, machine learning, and AI, along with cheaper, more powerful, and more accessible hardware, have provided an opportunity to remedy these problems. This dissertation proposes a natural gesture library containing gestures that are based on language. There is a distinct link between learning, language, and gesturing; by exploiting this link, the work in this dissertation defines gestures designed around simple word construction. The preliminary gesture design for the library was assessed using a survey with questions based on the UTAUT technology acceptance model. The survey includes questions on experience and habits with interactive technology, a handshape and connotation matching question, and short video clips of various abstract gestures accompanied by questions on the impressions the gestures gave. In total, 2210 people participated in the survey, with 1000 responses used for data analysis; statistical tests were conducted using IBM SPSS Statistics 28 and IBM AMOS 27. Participants were about evenly split between the two genders, leaning slightly female; were aged 18 to 55; hailed from the Americas, Europe, and East Asia; were evenly experienced with the most common interaction modalities; and favored gestures with simple lateral motions performed with either one or two hands. Lastly, the developed natural gesture library was validated via a simulation of a drone delivery system.
The proposed work seeks to create a natural gesture library that can serve as a gesture control framework for a variety of devices and applications by providing simple, generalized gestures for end users to learn and perform and for developers to utilize.
Received from ProQuest
Morales, Erasmo, "Do As I Say - Natural Gesture Control as an Integral Component in the Progression of Human Computer Interaction" (2023). Open Access Theses & Dissertations. 3829.