Communities that use visual languages such as American Sign Language (ASL) are greatly underserved by translation and language-learning tools.
This project develops an automated Sign Language Recognition application to serve as a translation and teaching tool for American Sign Language.
Our approach uses a simple, low-cost setup consisting of a LeapMotion controller and a laptop, and it does not require the user to wear uncomfortable clothing or to sign against an isolated, clean background. Our algorithm feeds measurements from the LeapMotion controller into a decision tree, so it does not require the large amounts of training data that other machine-learning-based approaches need.
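To illustrate the decision-tree idea, the sketch below classifies a few ASL letter handshapes from simple hand measurements. It is a minimal, hypothetical example: the feature names (`extended_fingers`, `thumb_extended`, `palm_facing_out`), the thresholds, and the handshape rules are illustrative assumptions, not the project's actual measurements or rules, and in the real system the features would come from LeapMotion frames rather than being passed in directly.

```python
def classify_sign(extended_fingers: int,
                  thumb_extended: bool,
                  palm_facing_out: bool) -> str:
    """Classify a small set of ASL letter handshapes with hand-written rules.

    Hypothetical sketch: features stand in for quantities a LeapMotion
    controller can report (finger extension, palm normal, etc.).
    """
    if extended_fingers == 0:
        # Fist-like handshapes: 'A' when the thumb rests beside the fist,
        # 'S' when it crosses the fingers (approximated here by the flag).
        return "A" if thumb_extended else "S"
    if extended_fingers == 4:
        # Open flat hand: 'B' when the palm faces away from the signer.
        return "B" if palm_facing_out else "unknown"
    if extended_fingers == 2:
        # Index and middle finger extended: 'V'.
        return "V"
    return "unknown"
```

Because the tree encodes rules directly, no labeled training corpus is needed; adding a sign means adding a branch rather than collecting more data.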