
Gesture-Recognition in C++

This library is a collaborative effort developed as the final group project for the COMS W4995: Design Using C++ course in Fall 2023, taught by Professor Bjarne Stroustrup.

Authors: Yana Botvinnik, Maitar Asher, Noam Zaid, Elvina Wibisono, Elifia Muthia

About

This is a gesture recognition library for C++ that allows users to effortlessly create models capable of recognizing gestures in images, live video streams, or recordings. Note: as of December 2023, this library has been tested on, and is compatible with, macOS only. The library is designed to run alongside Google's MediaPipe. We provide a server that leverages MediaPipe and is configured for macOS by default. The server's role is to obtain hand landmarks, a crucial step that simplifies the classification problem: instead of dealing with thousands of pixels per image, our approach works with 21 hand landmarks per image.
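To make the landmark representation concrete, here is a minimal sketch, using hypothetical type and function names rather than the library's actual API, of how a hand reduced to 21 MediaPipe landmarks can be flattened into a small feature vector for classification:

#include <array>
#include <vector>

// Hypothetical types for illustration only; the library's real API may differ.
struct Landmark {
    float x, y, z;  // normalized coordinates as returned by MediaPipe
};

using HandLandmarks = std::array<Landmark, 21>;  // one hand = 21 landmarks

// Flatten the 21 landmarks into a 63-value feature vector, which is far
// smaller than the raw pixel data and much easier to classify.
std::vector<float> to_feature_vector(const HandLandmarks& hand) {
    std::vector<float> features;
    features.reserve(hand.size() * 3);
    for (const Landmark& lm : hand) {
        features.push_back(lm.x);
        features.push_back(lm.y);
        features.push_back(lm.z);
    }
    return features;
}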

Acknowledgment

This project takes inspiration from GRLib, a similar gesture recognition library written in Python, whose algorithm is nicely documented here.

Additional Resources and Documentation

  • Gesture Recognition Tutorial: Explore the step-by-step tutorial on setting up the environment, processing data, training models, and making predictions.

  • Gesture Recognition Manual: For more detailed information about each step, configuration options, and advanced features, refer to our comprehensive manual.

  • Design Documentation: Read our design documentation for an in-depth understanding of the architecture, system components, and implementation details.

  • ASL Alphabet Recognition Application Demo: See the tool in action here!

  • PowerPoint Gesture Control Application Demo: See the tool in action here!

Set Up the Environment

  1. Install Homebrew (a package manager to install library dependencies)

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

  2. Install OpenCV

brew install opencv

  3. Install bazelisk

brew install bazelisk

  4. Install opencv@3

brew install opencv@3

  5. Install ffmpeg

brew install ffmpeg

  6. Install NumPy

brew install numpy

  7. Install Xcode from the App Store

Set Up the MediaPipe Server

  1. Clone the modified MediaPipe repository

  2. Move into the newly cloned MediaPipe repository

cd mediapipe

  3. Build the MediaPipe sample

bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/mediapipe_samples/mediapipe_sample:mediapipe_sample

  4. Execute the following command to run the server

GLOG_logtostderr=1 bazel-bin/mediapipe/mediapipe_samples/mediapipe_sample/mediapipe_sample --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_desktop_live.pbtxt

Clone this Gesture-Recognition repository

Execute the following command to clone this repository

git clone https://github.com/maitarasher/Gesture-Recognition.git

Process Training Data

  1. Optional: Customize your augmentation pipeline

You may add or remove stages in your pipeline; the code is located at Gesture-Recognition/processing_data/processing.cpp. A sketch of what a custom stage might look like appears after these steps.

  2. Navigate to Gesture-Recognition root directory

  3. Build the processing application

cd processing_data

mkdir build

cd build

cmake ..

make

  4. Run the processing application to generate a landmark representation of your data

./processing <training_images_dir_path> <output_folder>
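As an illustration of the kind of stage you might add to the augmentation pipeline, here is a hedged sketch using plain OpenCV calls; the function names are hypothetical, and the actual stage interface in processing.cpp may differ:

#include <opencv2/opencv.hpp>

// Example stage: mirror the image horizontally so that left- and right-hand
// versions of a gesture both appear in the training data.
cv::Mat horizontal_flip(const cv::Mat& image) {
    cv::Mat flipped;
    cv::flip(image, flipped, 1);  // flip code 1 = flip around the y-axis
    return flipped;
}

// Example stage: apply a small rotation to make the model tolerant of tilted hands.
cv::Mat rotate(const cv::Mat& image, double angle_degrees) {
    cv::Point2f center(image.cols / 2.0f, image.rows / 2.0f);
    cv::Mat rotation = cv::getRotationMatrix2D(center, angle_degrees, 1.0);
    cv::Mat rotated;
    cv::warpAffine(image, rotated, rotation, image.size());
    return rotated;
}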

Running the ASL Application Example

  1. Navigate to Gesture-Recognition root directory

  2. Build the application

cd gesture_asl

mkdir build

cd build

cmake ..

make

  3. Run the application

./asl_application ../../data/asl
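For intuition about what happens after the landmarks are extracted, here is a minimal, hypothetical sketch of a nearest-neighbour classifier over landmark feature vectors, in the spirit of the approach described in the About section; the ASL application's actual classifier and data formats may differ:

#include <cstddef>
#include <limits>
#include <string>
#include <vector>

struct LabeledSample {
    std::string label;            // e.g. "A", "B", ... for the ASL alphabet
    std::vector<float> features;  // flattened 21-landmark feature vector
};

// Squared Euclidean distance between two feature vectors of equal length.
float squared_distance(const std::vector<float>& a, const std::vector<float>& b) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float d = a[i] - b[i];
        sum += d * d;
    }
    return sum;
}

// 1-nearest-neighbour: return the label of the closest training sample.
std::string classify(const std::vector<float>& query,
                     const std::vector<LabeledSample>& training_set) {
    float best = std::numeric_limits<float>::max();
    std::string best_label;
    for (const LabeledSample& sample : training_set) {
        float dist = squared_distance(query, sample.features);
        if (dist < best) {
            best = dist;
            best_label = sample.label;
        }
    }
    return best_label;
}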

Running the PowerPoint Controller Example

  1. Install the additional dependencies

brew install jsoncpp

brew install pkg-config

  2. Navigate to Gesture-Recognition root directory

  3. Run the script to prepare the data (a sketch of how the COCO annotations might be read with jsoncpp appears after these steps)

cd processing_data/processing_cocodataset

mkdir build

cd build

cmake ..

make

./coco_dataset_export <coco_folder_path> <output_folder>

  4. Navigate to Gesture-Recognition root directory

  5. Build the application

cd gesture_pptx

mkdir build

cd build

cmake ..

make

  6. Run the application

./gesture_pptx ../../data/pptx <path_to_pptx>
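Because the PowerPoint example relies on a COCO-style dataset and the jsoncpp dependency installed above, here is a hedged sketch of how such annotations could be read with jsoncpp. The field names assume the standard COCO layout ("images", "annotations", "categories"); the actual input format expected by coco_dataset_export may differ.

#include <fstream>
#include <iostream>
#include <json/json.h>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::cerr << "usage: " << argv[0] << " <annotations.json>\n";
        return 1;
    }

    std::ifstream file(argv[1]);
    Json::Value root;
    file >> root;  // jsoncpp parses the whole document into a Json::Value tree

    // Walk the annotation list and report which category each image contains.
    for (const Json::Value& annotation : root["annotations"]) {
        int image_id = annotation["image_id"].asInt();
        int category_id = annotation["category_id"].asInt();
        std::cout << "image " << image_id
                  << " -> category " << category_id << "\n";
    }
    return 0;
}

This sketch can be compiled with the flags reported by pkg-config --cflags --libs jsoncpp, which is why pkg-config is listed among the dependencies.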
