Gesture Recognition Toolkit
Welcome to the main wiki page for the Gesture Recognition Toolkit.
The Gesture Recognition Toolkit (GRT) is a cross-platform, open-source, C++ machine learning library that has been specifically designed for real-time gesture recognition.
In addition to a comprehensive C++ API, the GRT now also includes an easy-to-use graphical user interface:
[Screenshot: the GRT Graphical User Interface (GUIImage1.png)]
You can find out more about the GUI here.
The GRT has been designed to:
- be easy to use and integrate into your existing C++ projects
- be compatible with any type of sensor or data input
- be easy to rapidly train with your own gestures
- be easy to extend and adapt with your own custom processing or feature extraction algorithms (if needed)
The GRT features a large number of algorithms that can be used to:
- recognize static postures (such as if a user has their hands in a specific posture or if a device fitted with an accelerometer is being held in a distinct orientation)
- recognize dynamic temporal gestures (such as a swipe or tap gesture)
- perform regression (i.e. continually map an input signal to an output signal, such as mapping the angle of a user's hands to the angle a steering wheel should be turned in a driving game)
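To make the static-posture case concrete, the idea behind one of the simpler classifiers the GRT supports (k-nearest neighbor) can be sketched in a few lines of plain C++. This is an illustrative toy with k = 1, not the GRT API; the `Sample` struct and `classify` function are invented for this example:

```cpp
#include <cassert>
#include <cstddef>
#include <limits>
#include <vector>

// A labelled training sample: a feature vector (e.g. accelerometer x, y, z)
// plus the class label of the posture it represents.
struct Sample {
    std::vector<double> features;
    int label;
};

// Squared Euclidean distance between two feature vectors of equal length.
static double squaredDistance(const std::vector<double> &a,
                              const std::vector<double> &b) {
    double d = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        const double diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// 1-nearest-neighbour classification: return the label of the training
// sample closest to the input vector.
int classify(const std::vector<Sample> &training,
             const std::vector<double> &input) {
    int bestLabel = -1;
    double bestDist = std::numeric_limits<double>::max();
    for (const Sample &s : training) {
        const double d = squaredDistance(s.features, input);
        if (d < bestDist) {
            bestDist = d;
            bestLabel = s.label;
        }
    }
    return bestLabel;
}
```

A real recognizer would also filter the raw sensor stream and reject inputs that are far from every training sample; the GRT's pipeline modules handle exactly those concerns.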
[Poster: the GRT, presented at the New England Machine Learning Day 2013 (GRTPoster.jpg)]
The GRT currently works across several operating systems including:
- Windows (Tested on Windows XP, Windows 7)
- OS X (Tested on 10.7)
- Linux (Tested on Ubuntu 12)
The current development build of the GRT contains machine-learning algorithms such as:
- Adaptive Naive Bayes Classifier
- Decision Tree
- Dynamic Time Warping
- Gaussian Mixture Models (GMM)
- Hidden Markov Model
- K-Nearest Neighbor Classifier
- Random Forests
- Support Vector Machine
- Artificial Neural Network (Multi Layer Perceptron)
- Linear Regression
- Logistic Regression
- Multidimensional Regression
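Dynamic Time Warping, one of the algorithms above, is particularly well suited to dynamic temporal gestures because it compares two time series even when they differ in speed and length. A minimal, self-contained sketch of the classic DTW distance on 1-D series follows; the GRT's own implementation adds features such as multi-dimensional input and warping constraints, so treat this purely as an illustration:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// Classic dynamic time warping distance between two 1-D time series.
// dp[i][j] holds the minimal cumulative cost of aligning the first i
// samples of a with the first j samples of b.
double dtwDistance(const std::vector<double> &a,
                   const std::vector<double> &b) {
    const std::size_t n = a.size(), m = b.size();
    const double INF = std::numeric_limits<double>::infinity();
    std::vector<std::vector<double>> dp(n + 1,
                                        std::vector<double>(m + 1, INF));
    dp[0][0] = 0.0;
    for (std::size_t i = 1; i <= n; ++i) {
        for (std::size_t j = 1; j <= m; ++j) {
            const double cost = std::fabs(a[i - 1] - b[j - 1]);
            dp[i][j] = cost + std::min({dp[i - 1][j],        // insertion
                                        dp[i][j - 1],        // deletion
                                        dp[i - 1][j - 1]});  // match
        }
    }
    return dp[n][m];
}
```

Because the warping path can stretch or compress either series, a slow swipe and a fast swipe of the same shape produce a small DTW distance, which is exactly why DTW works well for template-based gesture recognition.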
In addition to the machine-learning algorithms, the GRT also contains a large number of pre-processing, post-processing, and feature-extraction algorithms such as:
- Low Pass Filter
- High Pass Filter
- Moving Average Filter
- Double Moving Average Filter
- Dead Zone
- Savitzky-Golay Filter
- Zero Crossing Counter
- FFT Features
- Movement Trajectory Features
- Principal Component Analysis
- Class Label Filter
- Class Label Change Filter
- Class Label Timeout Filter
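As an illustration of what the pre-processing stage does, a moving average filter (one of the modules listed above) smooths a noisy sensor stream by averaging the most recent N samples. A minimal, self-contained sketch of the technique, not the GRT class itself:

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// Smooths a stream of sensor values by averaging the most recent
// `windowSize` samples. Each new sample pushes out the oldest one
// once the window is full.
class MovingAverageFilter {
public:
    explicit MovingAverageFilter(std::size_t windowSize)
        : windowSize_(windowSize) {}

    // Feed one new sample in and get the current filtered value back.
    double filter(double x) {
        window_.push_back(x);
        if (window_.size() > windowSize_) window_.pop_front();
        const double sum =
            std::accumulate(window_.begin(), window_.end(), 0.0);
        return sum / static_cast<double>(window_.size());
    }

private:
    std::size_t windowSize_;
    std::deque<double> window_;
};
```

In a typical pipeline the filtered signal would then be passed on to a feature-extraction module and finally to one of the classifiers or regression algorithms listed earlier.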
The current development version of the GRT can be downloaded from http://code.google.com/p/gesture-recognition-toolkit/.
Check out the getting started page to find out more.
A large number of tutorials are available on the tutorials page, and code references on the reference page.
Got a question? Want to share your own code, projects, or expertise? You can do this all on the GRT Forum page.