Hands Free

With smartphones becoming ubiquitous in modern society, being able to access your devices at all times is increasingly important. Modern touch screen interfaces work well in many contexts, but are limited when it is infeasible to accurately touch the screen directly, such as when running, wearing gloves, or when the user's hands are unclean. To deal with these issues we've developed an in-air gesture recognition system, dubbed "Hands Free". The system uses a single RGB camera on an unmodified smartphone as the only sensor. A convolutional neural network classifier, combined with a novel pre-processing technique for color-invariant recognition, allows the system to classify hand shapes.
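The exact pre-processing technique is not detailed here, but the general idea of normalizing away appearance differences before the CNN can be sketched as follows. This is a hypothetical illustration, not the project's actual method: it reduces the RGB frame to luminance and standardizes it, which cancels global brightness and contrast changes so the classifier sees hand shape rather than absolute color.

```python
import numpy as np

def color_invariant(image: np.ndarray) -> np.ndarray:
    """Map an H x W x 3 RGB frame to a representation that is
    invariant to global brightness and contrast changes.

    Hypothetical sketch (the system's real pre-processing step
    is not described in the text):
      1. Collapse RGB to luminance, discarding absolute hue.
      2. Standardize to zero mean and unit variance, so any
         affine change in illumination (a * pixel + b) cancels.
    The result would be fed to the CNN classifier.
    """
    # ITU-R BT.601 luminance weights (sum to 1.0)
    lum = image.astype(np.float64) @ np.array([0.299, 0.587, 0.114])
    # Standardize; epsilon guards against flat (zero-variance) frames
    return (lum - lum.mean()) / (lum.std() + 1e-8)
```

Because the output is standardized, a frame and a globally re-lit copy of it (e.g. `0.5 * frame + 10`) map to the same representation, which is one simple route to illumination-robust classification.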

Using this method we've created several demo applications:

  1. A message reader, which allows the user to trigger text-to-speech reading of incoming messages
  2. A running timer, which allows the user to operate a stopwatch using gestures only
  3. A music player, which allows the user to use gestures to control music playback

In each of these use cases we found that the system performed equally well indoors and outdoors, and was robust to the noise introduced by motion of both the camera and the hand.

Matthew Chang, Misha Sra and Chris Schmandt