Recognition of Arm Gestures Using Multiple Orientation Sensors

Martin Urban, Peter Bajcsy and Rob Kooper

Technical Report NCSA-ALG04-0004, July 2004

We present a system for gesture recognition using multiple orientation sensors. We focus specifically on the problem of controlling Unmanned Aerial Vehicles (UAVs) in the presence of manned aircraft on an aircraft carrier deck. Our goal was to design a UAV control interface that uses the same gesture signals as those used by current flight directors for controlling manned vehicles. We have explored multiple approaches to arm gesture recognition, and investigated real-time and system design issues for a particular choice of active sensors.

We describe several theoretical and experimental issues related to the design of a real-time gesture recognition system using the IS-300 Pro Precision Motion Tracker by InterSense.

Our work consists of (1) analyzing several gesture recognition approaches leading to a selection of an active sensor, (2) scrutinizing sensor data acquisition parameters and reported arm orientation measurements, (3) choosing the optimal attachment and placement of sensors, (4) measuring repeatability of our experiments using a Dynamic Time Warping (DTW) metric, and (5) designing template-based gesture classification algorithms and robot control mechanisms, where the robot represents a UAV surrogate in a laboratory environment.
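To illustrate steps (4) and (5), the sketch below shows a classic DTW distance and a nearest-template gesture classifier. This is a hypothetical, simplified example on 1-D sequences; the report's actual metric and classifier operate on multi-channel orientation (yaw/pitch/roll) time series from the IS-300 sensors, and the function names and local cost are our assumptions, not the report's implementation.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with an absolute-difference local cost.

    Returns the minimal cumulative alignment cost between sequences a and b,
    allowing non-linear time warping between them.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def classify(sample, templates):
    """Template-based classification: return the label of the gesture
    template with the smallest DTW distance to the observed sample."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

Because DTW aligns sequences non-linearly in time, two executions of the same gesture at different speeds still yield a small distance, which is what makes it suitable both as a repeatability measure and as the core of a template matcher.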