Python app that recognises hand gestures from a webcam in real time, using the Scale-Invariant Feature Transform (SIFT) algorithm.

Rough description of the algorithm:
- Train the classifier (see below).
- Extract frames from the webcam video stream.
- Apply a dense SIFT descriptor to points uniformly distributed over each frame to extract features.
- Classify the features with the pre-trained Bayes classifier to determine the hand gesture.
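The "uniformly distributed points" step above can be sketched as a dense sampling grid. This is only an illustration of the idea (the actual descriptors come from VLFeat's dense SIFT, not from this function); the `step` parameter and grid layout here are assumptions:

```python
import numpy as np

def dense_grid(height, width, step=8):
    """Return an (N, 2) array of (x, y) sample points on a uniform grid.

    Hypothetical helper for illustration: a SIFT descriptor would be
    computed at each of these points instead of at detected keypoints.
    """
    xs = np.arange(step // 2, width, step)
    ys = np.arange(step // 2, height, step)
    return np.array([(x, y) for y in ys for x in xs])

# A 64x64 frame sampled every 16 pixels gives a 4x4 grid of points.
points = dense_grid(64, 64, step=16)
```

Sampling on a fixed grid (rather than at detected keypoints) is what makes the descriptor "dense": every frame yields the same number of features in the same layout, which keeps the feature vector length constant for the classifier.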
# Dependencies

- OpenCV (webcam access)
- VLFeat (SIFT descriptor)
- Packages from requirements.txt
# Use

To start, run

```
python GestureRecognition(camera).py
```

and place your palm inside the green square. A letter will appear, indicating the recognised hand gesture.
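Cropping the region inside the square is a plain array slice on the frame. The coordinates and square size below are made up for illustration (the real app picks its own green-square position):

```python
import numpy as np

# Stand-in for a 480x640 BGR webcam frame; in the app this would come
# from OpenCV's VideoCapture.
frame = np.zeros((480, 640, 3), dtype=np.uint8)

# Hypothetical top-left corner and side length of the green square.
x, y, size = 400, 100, 200

# The region of interest that feature extraction actually sees.
roi = frame[y:y + size, x:x + size]
```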
# Examples

The app was trained to distinguish between 8 classes: [Nothing, A, B, C, F, P, V, Admin's handsome face].
# Training the Classifier

For the app, I implemented and tried two classifiers: Bayes and k-Nearest Neighbours. Bayes was more accurate, classifying gestures correctly up to 90% of the time under good lighting conditions.

The classifier was trained on ~100 images I generated for each class (its settings are pickled in features.py). If you would like to re-train the classifier, place your images in the "test" folder in the root of the project and uncomment the lines at the beginning of GestureRecognition(camera).py.
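For readers curious what a Bayes classifier over SIFT features might look like, here is a minimal Gaussian naive Bayes sketch in plain numpy. This is a guess at the general approach, not the actual classifier pickled in features.py, and the synthetic 2-D data is purely illustrative:

```python
import numpy as np

class GaussianNaiveBayes:
    """Fit per-class feature means/variances; predict the class with the
    highest log-posterior under a Gaussian likelihood."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        # Small floor on the variance avoids division by zero.
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.priors = np.array([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # log P(c) + sum_i log N(x_i; mu_ci, var_ci), one column per class.
        log_post = np.log(self.priors) + np.stack([
            -0.5 * np.sum(np.log(2 * np.pi * v) + (X - m) ** 2 / v, axis=1)
            for m, v in zip(self.means, self.vars)
        ], axis=1)
        return self.classes[np.argmax(log_post, axis=1)]

# Tiny synthetic demo: two well-separated classes in a 2-D feature space.
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.2, 4.9]])
y = np.array([0, 0, 1, 1])
clf = GaussianNaiveBayes().fit(X, y)
pred = clf.predict(np.array([[0.1, 0.0], [5.1, 5.0]]))
```

Naive Bayes is a natural fit here because the dense SIFT pipeline produces fixed-length feature vectors, and the per-feature independence assumption keeps training fast even with only ~100 images per class.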
Happy playing!