Sep 28, 2011

Gesture detection in Android, part 2 of 2 (Character Detector)

Android 1.6 and later include a new package, android.gesture, for complex gesture recognition. This package includes APIs to store, load, draw and recognize gestures. We can define our own patterns in an application, store these gestures in a file and later use that file to recognize them.

Gestures Builder application

There is a handy sample application, Gestures Builder, which comes with Android 1.6 and higher and is pre-installed on 1.6 and higher emulators. Here is a screenshot of the application:

Gesture Builder Sample application
Using this application we can create our gesture library and save it to the SD card. Once the file is created, we can include it in our application in the /res/raw folder.

Loading a gesture library

To load the gesture file, we use the GestureLibraries class. It has functions to load from a raw resource, an SD card file or a private file. The GestureLibraries class has the following methods:

static GestureLibrary fromFile(String path)
static GestureLibrary fromFile(File path)
static GestureLibrary fromPrivateFile(Context context, String name)
static GestureLibrary fromRawResource(Context context, int resourceId)
All these methods return a GestureLibrary instance. This class is used to read gesture entries from the file, save gesture entries to the file, recognize gestures, and so on. Once GestureLibraries returns the GestureLibrary corresponding to the specified file, we read all the gesture entries using the GestureLibrary.load method, which returns true if loading succeeded.
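Loading and checking the library can be sketched as follows (a sketch meant to run inside an Activity; the resource name R.raw.gestures is an assumption for this example):

```java
// Load the gesture file bundled at /res/raw/gestures
// (the resource name "gestures" is an assumption for this sketch).
GestureLibrary gestureLib = GestureLibraries.fromRawResource(this, R.raw.gestures);
if (!gestureLib.load()) {
    // Loading failed; recognition cannot work, so bail out.
    finish();
}
```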

Drawing and recognizing a gesture

To draw and recognize gestures, we use the GestureOverlayView class. This view extends FrameLayout, so we can place it inside any other layout or use it as a parent layout for other child views. It acts as an overlay on which the user can draw gestures. The view uses three callback interfaces to report the actions performed:

interface GestureOverlayView.OnGestureListener
interface GestureOverlayView.OnGesturePerformedListener
interface GestureOverlayView.OnGesturingListener
The GestureOverlayView.OnGestureListener callback interface handles gesture operations at a low level. This interface has the following methods:

void onGestureStarted(GestureOverlayView overlay, MotionEvent event)
void onGesture(GestureOverlayView overlay, MotionEvent event)
void onGestureEnded(GestureOverlayView overlay, MotionEvent event)
void onGestureCancelled(GestureOverlayView overlay, MotionEvent event)

All these methods take two parameters, the GestureOverlayView overlay and the MotionEvent that occurred.
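A minimal listener implementation might look like this (a sketch; the gestureOverlayView variable and the log tag are assumptions):

```java
// Sketch: attach a low-level OnGestureListener to an existing overlay.
gestureOverlayView.addOnGestureListener(new GestureOverlayView.OnGestureListener() {
    @Override
    public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
        Log.d("gesture", "started at " + event.getX() + "," + event.getY());
    }

    @Override
    public void onGesture(GestureOverlayView overlay, MotionEvent event) {
        // Called repeatedly while the finger moves.
    }

    @Override
    public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
        Log.d("gesture", "ended");
    }

    @Override
    public void onGestureCancelled(GestureOverlayView overlay, MotionEvent event) {
        Log.d("gesture", "cancelled");
    }
});
```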

The GestureOverlayView.OnGesturingListener callback interface reports when gesturing starts and ends. The interface has the following methods:

void onGesturingStarted(GestureOverlayView overlay)
void onGesturingEnded(GestureOverlayView overlay)
onGesturingStarted is called when the gesture action starts and onGesturingEnded when it ends. Both methods receive the GestureOverlayView in use.

The most important one is the GestureOverlayView.OnGesturePerformedListener interface. This interface has only one method:

void onGesturePerformed(GestureOverlayView overlay, Gesture gesture)
This method is called after the user has performed a gesture and the GestureOverlayView has processed it. The first parameter is the overlay view in use and the second is a Gesture object representing the gesture the user performed. The Gesture class represents a hand-drawn shape consisting of one or more strokes, each a series of points. The GestureLibrary class uses this class to recognize gestures.

To recognize a gesture, we use the GestureLibrary.recognize method. It accepts a Gesture, runs the internal recognizers and returns a list of predictions. Each prediction is represented by the Prediction class, which has two member variables, name and score. The name is the name of the matched gesture and the score is the confidence assigned by the recognizer; the score is used to choose the best matching prediction from the list. One common method is to choose the first prediction whose score is greater than one. Another is to choose a value that falls between minimum and maximum threshold limits. Choosing these limits depends entirely on the implementation, ranging from simple limits found by trial and error to more complex methods that learn from user input and improve recognition over time.
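The score-based selection described above can be sketched independently of the Android classes; here a simple name/score pair stands in for android.gesture.Prediction (the class, names and threshold are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class PredictionFilter {
    // Minimal stand-in for android.gesture.Prediction: a name plus a score.
    static class Prediction {
        final String name;
        final double score;
        Prediction(String name, double score) { this.name = name; this.score = score; }
    }

    // Return the name of the first prediction whose score exceeds the
    // threshold, or null when nothing matches well enough. recognize()
    // returns predictions sorted by score, so the first hit is the best.
    static String bestMatch(List<Prediction> predictions, double threshold) {
        for (Prediction p : predictions) {
            if (p.score > threshold) {
                return p.name;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Prediction> predictions = new ArrayList<>();
        predictions.add(new Prediction("A", 0.7));   // too weak a match
        predictions.add(new Prediction("B", 3.2));   // strong match
        System.out.println(bestMatch(predictions, 1.0)); // prints "B"
    }
}
```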

Sample application

The sample application that accompanies this article includes five pre-defined gestures: A, B, C, D and E. When the user draws one of these patterns, the application lists the names and scores of all the predictions.

Example: loading a pre-defined gesture library from the raw folder:

GestureLibrary gestureLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures_name);

Setting the listener on the GestureOverlayView (the view id is illustrative, and the Activity is assumed to implement OnGesturePerformedListener):

GestureOverlayView gestureView = (GestureOverlayView) findViewById(R.id.gestureOverlayView);
gestureView.addOnGesturePerformedListener(this);

Checking the gesture:

if (gestureView.getGesture() != null) {
    // Gesture success: pass it on for recognition.
    String result = recGesture(gestureView.getGesture());
} else {
    // Gesture not found.
    Log.d("gesture", "gesture not found");
}

Recognizing the gesture using the Prediction class:

private String recGesture(Gesture gesture) {
    ArrayList<Prediction> alPredictions = mGesLib.recognize(gesture);
    if (alPredictions.size() > 0) {
        // Predictions are sorted by score, so the first entry is the best match.
        Prediction pRecGes = alPredictions.get(0);
        Log.d("recGesture", "found: " + pRecGes.name + ", score: " + pRecGes.score);
        return pRecGes.name;
    } else {
        Log.d("recGesture", "did not find");
        return "-1";
    }
}
For the source, check here:

Code from: krvarma

Hope this article helps you understand complex gesture recognition in Android.
