
Making Mycroft respond to Sign Language using TensorFlow.js

Watch the video

The original project has been shared extensively across social media and covered in the press by the BBC, The Verge, Mashable, Fast Company, Kottke, VentureBeat, NowThis, and others.

The goal of this project is to build upon the original by integrating directly with Mycroft AI, an open-source and privacy-focused smart assistant. Run the demo in the latest Chrome or Firefox to train the model on your own words and corresponding signs/gestures. If you have a Mycroft device or instance running, it should respond; otherwise, simply play around and have fun. You will need to grant permission to access your webcam and microphone.
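For context, the integration boils down to handing the recognized word to Mycroft. The sketch below shows one way to do that from the browser over Mycroft's WebSocket messagebus; it is not necessarily how this repo wires it up. The host name is a placeholder, while 8181 and /core are Mycroft's default messagebus port and path.

```js
// Minimal sketch (assumptions noted above): send a recognized word to a
// Mycroft instance over its WebSocket messagebus.
const MYCROFT_HOST = 'mycroft.local'; // placeholder; point this at your device/instance
const bus = new WebSocket(`ws://${MYCROFT_HOST}:8181/core`);

bus.onopen = () => console.log('Connected to Mycroft messagebus');

// Call this after the socket has opened, whenever a sign is recognized.
function sendToMycroft(word) {
  // Mycroft handles a "recognizer_loop:utterance" message like a spoken phrase.
  bus.send(JSON.stringify({
    type: 'recognizer_loop:utterance',
    data: { utterances: [word], lang: 'en-us' },
    context: {}
  }));
}
```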

Running the code

To use the code, first install the JavaScript dependencies by running

npm install

Then start the local budo web server by running

npm start

This will start a web server on localhost:9966.
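The `npm start` script wraps the budo dev server. If you prefer to launch it programmatically instead, budo also exposes a Node API; the entry file name below is an assumption about this repo's layout, so check package.json for the actual "start" script.

```js
// Hypothetical programmatic launch of the budo dev server.
const budo = require('budo');

budo('main.js', {       // assumed entry file; see package.json
  live: true,           // reload the page when files change
  port: 9966            // budo's default port, matching the URL above
}).on('connect', (ev) => {
  console.log('Dev server running at %s', ev.uri);
});
```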

  1. Grant permission to access your webcam and microphone (a short sketch of what this involves follows this list).

  2. Add some words you want to train on.
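Step 1 corresponds to the browser's standard media-permission prompt: the page requests the camera (and microphone) stream before any frames can be classified. A minimal sketch is below; the element id is an assumption for illustration.

```js
// Request webcam and microphone access and show the live feed.
const video = document.getElementById('webcam'); // assumed <video> element

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    video.srcObject = stream; // attach the camera stream to the page
    return video.play();
  })
  .catch((err) => {
    console.error('Webcam/microphone permission denied:', err);
  });
```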


Reference

To learn more about the classifier used in this repo, see KNN Image Classifier.

A newer version of this classifier has been released as part of the new TensorFlow.js and can be found here.
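The newer classifier is published as @tensorflow-models/knn-classifier and is typically paired with a MobileNet feature extractor. The sketch below shows that pairing under a few assumptions (a `<video>` element with id `webcam`, plain word strings as labels); it is an illustration of the technique, not this repo's exact code.

```js
// Sketch: MobileNet embeddings fed into the TensorFlow.js KNN classifier.
import * as tf from '@tensorflow/tfjs';            // backend for the models below
import * as mobilenet from '@tensorflow-models/mobilenet';
import * as knnClassifier from '@tensorflow-models/knn-classifier';

const classifier = knnClassifier.create();
const video = document.getElementById('webcam');   // assumed <video> element

async function main() {
  const net = await mobilenet.load();

  // Training: embed the current frame and tag it with a word.
  const addExample = (word) => {
    classifier.addExample(net.infer(video, true), word);
  };

  // Prediction: compare the current frame against the stored examples.
  const predict = async () => {
    const result = await classifier.predictClass(net.infer(video, true));
    return result.label;                            // the closest trained word
  };

  addExample('hello');                              // assumed example word
  console.log(await predict());
}

main();
```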
