Add Interactive Touch Objects with AS3 and GML, Method 1, Part 2

Tutorial Series: Add Interactive Touch Objects with AS3 & GML: Method 1

Tutorial 2: Add Multiple GML-Defined Gesture Listeners with ActionScript 3 and GML

Introduction

In this tutorial we are going to use ActionScript to add a series of (GML-defined) multitouch gestures to a touch object and manage the resulting gesture events using the traditional event listener/handler model. This tutorial requires Adobe Flash CS5+ and Open Exhibits 2 (download the SDK).

Agama sinaita image by Ester Inbar, available from http://commons.wikimedia.org/wiki/User:ST.

How to Add Multiple GML-Defined Gesture Listeners with ActionScript 3

As shown in previous examples, a TouchSprite can be constructed and its display properties set in much the same way as a Sprite or MovieClip. For example:


// Create the touch object and load a bitmap into it.
var ts1:TouchSprite = new TouchSprite();
var Loader1:Loader = new Loader();
Loader1.load(new URLRequest("library/assets/lizzard1.jpg"));
Loader1.contentLoaderInfo.addEventListener(Event.COMPLETE, onloadComplete);
ts1.addChild(Loader1);

// Position, rotate and scale the touch object on stage.
ts1.x = 100;
ts1.y = 200;
ts1.rotation = -20;
ts1.scaleX = 0.8;
ts1.scaleY = 0.8;

This creates a new TouchSprite called “ts1”, then positions, rotates, and scales it on stage. Once the bitmap is loaded, it is added to the touchSprite and re-positioned using its width and height so that the registration point now sits in the middle of the touchSprite.


// Center the loaded bitmap on the touch object's registration point.
private function onloadComplete(event:Event):void
{
    event.target.loader.x = -event.target.width / 2;
    event.target.loader.y = -event.target.height / 2;
}
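Because the handlers in this tutorial are written as class members, the snippets assume a document class with the standard Flash imports plus the Open Exhibits SDK classes. The following is a minimal sketch; note that the SDK package paths shown here are assumptions and may differ between releases, so check the SDK documentation:


// Standard Flash imports used by the snippets in this tutorial.
import flash.display.Loader;
import flash.events.Event;
import flash.net.URLRequest;

// Open Exhibits SDK classes. These package paths are assumptions and
// may vary between SDK releases; check the Open Exhibits documentation.
import id.core.TouchSprite;
import id.events.GWGestureEvent;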

The first step in adding touch interaction to a touch object is the local activation of all required gestures. To do this, we add each gesture (as identified by its “gesture_id” in the GML root document) to the gestureList property of the touch object. For example (using the in-line method for adding items to an object list):


ts1.gestureList = { "n-drag":true, "n-scale":true, "n-rotate":true };

This adds the “n-drag”, “n-rotate”, and “n-scale” gestures to the touch object, effectively activating matching, analysis, and processing for all three. These gestures are uniquely defined in the root GML document “my_gestures.gml” located in the bin folder of the application. An example GML description of the gestures used in this tutorial can be found at http://www.gestureml.org/example/gml_example.xml
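The gestureList can also be changed after initialization, for example to switch individual gestures on or off at runtime. The following is a minimal sketch, assuming the same ts1 touch object created above:


// Build the gesture list incrementally; the keys are the gesture_id
// values defined in my_gestures.gml.
var gestures:Object = new Object();
gestures["n-drag"] = true;
gestures["n-scale"] = true;
gestures["n-rotate"] = false; // rotation disabled for now
ts1.gestureList = gestures;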

This enables gesture events to be continually* dispatched when a gesture is recognized on the touch object. At this stage, if listeners and handlers are not added to the touch object, nothing is done with these events. To actively monitor gesture events on the touch object, we need to add event listeners.


ts1.addEventListener(GWGestureEvent.DRAG, gestureDragHandler);
ts1.addEventListener(GWGestureEvent.ROTATE, gestureRotateHandler);
ts1.addEventListener(GWGestureEvent.SCALE, gestureScaleHandler);

In this example we will animate the touch object so that it appears to be “physically” manipulated when touched. When groups of touch points are placed on a touch object, they are analysed as a group or “cluster” for geometric properties. These cluster properties are then processed, and changes in the values are returned as “deltas”. For example, when two or more touch points rotate around a common center, the change in orientation “dtheta” is calculated, processed, and returned in the form of a “ROTATE” gesture event.

When a GWGestureEvent.ROTATE event is detected, the function “gestureRotateHandler” is called. This handler directly controls what is done with the event. For example:


private function gestureRotateHandler(event:GWGestureEvent):void
{
    // Add the change in cluster orientation to the object's rotation.
    event.target.rotation += event.value.dtheta;
}

Each time the gesture rotate handler is called, it adds the event value “dtheta” to the current rotation of the touch object. This has the effect of dynamically rotating the touch object as the touch point cluster rotates.
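If an application needs to restrict how far an object can be turned, the handler can also clamp the accumulated angle before applying it. A hypothetical variant of the handler above (the ±45 degree limits are purely illustrative):


private function gestureRotateHandler(event:GWGestureEvent):void
{
    // Clamp the accumulated rotation to the range -45..45 degrees.
    var r:Number = event.target.rotation + event.value.dtheta;
    event.target.rotation = Math.max(-45, Math.min(45, r));
}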

A handler is also added to manage what happens when a “SCALE” gesture is detected:


private function gestureScaleHandler(event:GWGestureEvent):void
{
    // Multiply the current scale by the change in cluster separation.
    event.target.scaleX *= event.value.dsx;
    event.target.scaleY *= event.value.dsy;
}

This multiplies the current scale of the touch object by the deltas “dsx” and “dsy”, which represent the change in the separation of the touch point cluster in the x and y directions.
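Because “dsx” and “dsy” are applied independently, a variant worth sketching is to scale uniformly from a single delta and clamp the result to a sensible range. This is a hypothetical alternative, with the 0.5 and 2.0 limits chosen purely for illustration:


private function gestureScaleHandler(event:GWGestureEvent):void
{
    // Scale uniformly using dsx and keep the object between 50%
    // and 200% of its original size.
    var s:Number = event.target.scaleX * event.value.dsx;
    s = Math.max(0.5, Math.min(2.0, s));
    event.target.scaleX = s;
    event.target.scaleY = s;
}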

Additionally, a handler is added to manage the “DRAG” gesture event:


private function gestureDragHandler(event:GWGestureEvent):void
{
    // Add the change in cluster position to the object's position.
    event.target.x += event.value.dx;
    event.target.y += event.value.dy;
}

This handler adds the change in the position of the touch point cluster “dx” and “dy” to the current position of the touch object.
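If objects should stay on screen while being dragged, the same deltas can be clamped against the stage dimensions. A hypothetical variant, assuming the handler lives in a display class with access to the stage:


private function gestureDragHandler(event:GWGestureEvent):void
{
    // Apply the drag deltas, but keep the registration point of the
    // touch object inside the stage bounds.
    var nx:Number = event.target.x + event.value.dx;
    var ny:Number = event.target.y + event.value.dy;
    event.target.x = Math.max(0, Math.min(stage.stageWidth, nx));
    event.target.y = Math.max(0, Math.min(stage.stageHeight, ny));
}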

In this example three gestures are attached to the touch object “ts1”, and each one can dispatch independent events in the same frame. When all gesture events are detected at the same time, the result is a blended interaction that allows the object to be dragged, rotated, and scaled simultaneously. Since the registration point of the TouchSprite is in the center of the image, all transformations occur around the center of the image.
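For reference, the setup described above can be condensed into a single initialization routine. This is just a sketch tying the earlier steps together; the initTouchObject name is illustrative, and the bitmap loading shown earlier is omitted for brevity:


// Condensed recap: create the touch object, activate the GML-defined
// gestures, and wire up the three gesture handlers.
private function initTouchObject():void
{
    var ts1:TouchSprite = new TouchSprite();
    ts1.gestureList = { "n-drag":true, "n-scale":true, "n-rotate":true };
    ts1.addEventListener(GWGestureEvent.DRAG, gestureDragHandler);
    ts1.addEventListener(GWGestureEvent.ROTATE, gestureRotateHandler);
    ts1.addEventListener(GWGestureEvent.SCALE, gestureScaleHandler);
    addChild(ts1);
}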

One of the powerful new features in Open Exhibits 2 is the ability of the “gesture analysis engine” to actively fold multiple property calculations into a single compressed operation. This means that when multiple gestures are added to a touch object, multiple cluster properties can be acquired in parallel while avoiding redundant calculations. The result is that the CPU resources used for gesture analysis and display object transformations are used efficiently within a flexible architecture.

* Gesture events are continually dispatched as long as the match conditions are met. If no touch points are detected on a touch object, no gesture events are fired. This ensures that dormant touch objects do not require unnecessary processing.

Note: In addition to the ActionScript methods used in this tutorial, there are new “native” (CML) methods, which allow the construction of touch objects using a secondary external XML document in addition to the GML document.