| Name | Description | Size | Format |
|---|---|---|---|
|  |  | 581.35 KB | Adobe PDF |
Authors
Baltazar, A.
Abstract
The main problem this work addresses is the real-time recognition of gestures, particularly in the complex domain of artistic performance. By recognizing a performer's gestures, one can map them to diverse controls, from lighting control to the creation of visuals, sound control, or even music creation, allowing performers to manipulate creative events in real time. The work presented here takes on this challenge with a multidisciplinary approach, drawing on known principles of how humans recognize gestures together with computer science methods to accomplish the task. This paper follows from previous publications and presents in detail the Gesture Recognition Module of the ZatLab Framework and the results obtained by its Machine Learning (ML) algorithms. It provides a brief review of previous work in the area, followed by a description of the framework design and the results of the recognition algorithms.
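The full paper is not reproduced on this record, but the keywords (DTW, Kinect, Machine Learning) suggest template-based matching of joint trajectories. As a rough, hypothetical sketch of that general idea, not the ZatLab implementation itself, the snippet below shows a Dynamic Time Warping nearest-template recognizer; the function names, the distance threshold, and the gesture-to-control mapping are invented for illustration.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two gesture trajectories.

    a, b: arrays of shape (T, D) -- T frames of D-dimensional joint
    coordinates (e.g. captured by a Kinect sensor).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def recognize(observed, templates, threshold):
    """Return the name of the closest template gesture, or None."""
    best_name, best_dist = None, np.inf
    for name, template in templates.items():
        dist = dtw_distance(observed, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# Hypothetical mapping from recognized gestures to performance controls.
controls = {
    "circle": lambda: print("trigger visuals"),
    "swipe":  lambda: print("change lighting cue"),
}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 40)
    templates = {
        "circle": np.column_stack([np.cos(t), np.sin(t)]),
        "swipe":  np.column_stack([np.linspace(-1, 1, 30), np.zeros(30)]),
    }
    # A noisy performance of the "circle" gesture.
    observed = templates["circle"] + 0.05 * rng.standard_normal((40, 2))
    gesture = recognize(observed, templates, threshold=5.0)
    if gesture in controls:
        controls[gesture]()  # prints "trigger visuals"
```

The sketch classifies an observed trajectory by its smallest DTW distance to a stored template and dispatches the matched gesture to a control action, mirroring the gesture-to-control mapping the abstract describes; the framework's actual algorithms (per the keywords, DTW and HMM based) are detailed in the paper itself.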
Keywords
Computer Vision; DTW; Gesture Recognition; HCI; HMM; Interactive Performance; Kinect; Machine Learning
Citation
Baltazar, A. (2016). ZatLab Gesture Recognition Framework: Machine Learning Results. International Journal of Creative Interfaces and Computer Graphics, 7(2), 11-24.
