Open Exhibits Tutorials

Multiple Affine Listeners

  

Add Interactive Touch Objects with AS3 and GML, Method 1, Part 3

Tutorial Series: Add Interactive Touch Objects with AS3 & GML: Method 1

Tutorial 3: Add Multiple gestureEvent Listeners and Affine Transform Handlers with ActionScript 3 and GML

Introduction

In this example tutorial we are going to use ActionScript to add a series of (GML-defined) multitouch gestures to a touch object. The gesture events will be managed using traditional listeners and handlers, which will animate the touch objects using the new Open Exhibits 2 affine transform methods. This tutorial requires Adobe Flash CS5+ and Open Exhibits 2 (download the SDK).

Getting Started

As in previous examples, touchSprites can be constructed and properties set in much the same way as a Sprite or MovieClip. For example:


var ts1:TouchSprite = new TouchSprite();

A bitmap image can then be dynamically loaded and placed into the touch object “ts1“.


var Loader1:Loader = new Loader();
Loader1.load(new URLRequest("library/assets/building1.jpg"));
ts1.addChild(Loader1);

The display properties are set and the TouchSprite is placed on stage.


ts1.x = 100;
ts1.y = 200;
ts1.rotation = -20;
ts1.scaleX = 0.8;
ts1.scaleY = 0.8;
addChild(ts1);

The first step in adding touch interaction to a touch object is the local activation of all required gestures. To do this we add the gesture (as identified by the “gesture_id” in the GML root document) to the gestureList property associated with the touch object. For example (using the in-line method for adding items to an object list):


ts1.gestureList = { "n-drag":true, "n-scale":true, "n-rotate":true };

This adds the “n-drag“, “n-scale” and “n-rotate” gestures to the touch object, effectively activating matching, analysis and processing for the three gestures. These gestures are uniquely defined in the root GML document, “my_gestures.gml,” located in the bin folder of the application. An example GML description of the gestures in this tutorial can be found at: http://www.gestureml.org/example/gml_example.xml
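The canonical gesture definitions live in the GML document linked above. As a rough orientation only, the following abridged sketch (not the exact markup, which may differ in attributes and element names) shows the general shape of a GML gesture entry such as “n-drag”: a match block describing the qualifying touch cluster, an analysis block naming the algorithm module and the returned deltas, and a mapping block that dispatches the gesture event.

```xml
<Gesture id="n-drag" type="*">
    <match>
        <action>
            <initial>
                <!-- any cluster of 1 to 10 touch points qualifies -->
                <cluster point_number_min="1" point_number_max="10"/>
            </initial>
        </action>
    </match>
    <analysis>
        <algorithm class="kinemetric" type="continuous">
            <library module="drag"/>
            <returns>
                <property id="drag_dx" result="dx"/>
                <property id="drag_dy" result="dy"/>
            </returns>
        </algorithm>
    </analysis>
    <mapping>
        <update dispatch_type="continuous">
            <gesture_event type="drag">
                <property ref="drag_dx" target="x"/>
                <property ref="drag_dy" target="y"/>
            </gesture_event>
        </update>
    </mapping>
</Gesture>
```

Consult the gml_example.xml document above for the authoritative definitions used in this tutorial.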

Activating gestures in this way enables gestureEvents to be continually* dispatched when a gesture is recognized on the touch object. At this stage, if listeners and handlers are not added to the touch object, nothing is done with the gestureEvents. To actively monitor gestureEvents on the touch object we need to add event listeners:


ts1.addEventListener(GWGestureEvent.DRAG, gestureAffineDragHandler);
ts1.addEventListener(GWGestureEvent.ROTATE, gestureAffineRotateHandler);
ts1.addEventListener(GWGestureEvent.SCALE, gestureAffineScaleHandler);

In this example we will animate the touch object so that it appears to be “physically” manipulated when touched. When groups of touch points are placed on a touch object they are analysed as a group or “cluster” for geometric properties. These cluster properties are then processed, and changes in the values are returned as “deltas“. For example, when two or more touch points rotate around a common center, the change in orientation (“dtheta”) is calculated, processed and returned in the form of a “ROTATE” gesture event.
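The cluster analysis itself is performed internally by the gesture engine, but the idea behind a delta like “dtheta” can be sketched in a few lines. The helper below is purely illustrative (it is not part of the Open Exhibits API): for a two-point cluster it compares the orientation of the line joining the points in the previous and current frames.

```actionscript
// Illustrative sketch only: how a rotation delta ("dtheta") could be
// derived for a two-point cluster. The real engine computes this internally.
private function clusterDTheta(x1:Number, y1:Number, x2:Number, y2:Number,
                               px1:Number, py1:Number, px2:Number, py2:Number):Number
{
    // Orientation of the line between the two points, previous vs current frame
    var prevAngle:Number = Math.atan2(py2 - py1, px2 - px1);
    var currAngle:Number = Math.atan2(y2 - y1, x2 - x1);
    // Change in orientation, converted from radians to degrees
    return (currAngle - prevAngle) * 180 / Math.PI;
}
```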

When a GWGestureEvent.ROTATE event is detected the function “gestureAffineRotateHandler” is called. This directly controls what is done with the event. For example:


private function gestureAffineRotateHandler(event:GWGestureEvent):void
{
    event.target.$rotation += event.value.dtheta;
}

Each time the gesture rotate handler is called it adds the event value “dtheta” to the current value of the touch object rotation. This has the effect of dynamically rotating the touch object as the touch point cluster rotates.

A handler is also added to manage what happens when a “SCALE” gesture is detected:


private function gestureAffineScaleHandler(event:GWGestureEvent):void
{
    event.target.$scaleX *= event.value.dsx;
    event.target.$scaleY *= event.value.dsy;
}

This multiplies the current scale of the touch object by the deltas “dsx” and “dsy” which represent the change in the separation of the touch point cluster in the x and y direction.

Additionally a handler is added to manage the “DRAG” gesture event:


private function gestureAffineDragHandler(event:GWGestureEvent):void
{
    event.target.$x += event.value.dx;
    event.target.$y += event.value.dy;
}

This handler adds the change in the position of the touch point cluster “dx” and “dy” to the current position of the touch object.

In this example three gestures are attached to the touch object “ts1“, and each one can dispatch independent events in the same frame. If all gesture events are detected at the same time, the result is a blended interaction that allows the object to be dragged, rotated and scaled simultaneously.

Using the “$” syntax in front of the display properties ($x, $y, $scaleX, $scaleY and $rotation) gives access to the affine transform methods available in the TouchSprite and TouchMovieClip (provided in OE2). Affine transform methods allow transformations such as translations, rotations and scaling to occur around a dynamic center of motion (automatically defined by the touch point cluster). Working with affine transform methods enables a more “realistic” physical motion, as it allows the manipulation of a touch object with multiple points of contact. This prevents the touch object from “moving under your fingertips” and allows it to be dynamically pivoted around any touch point in the cluster.
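To see why this matters, note that a plain `rotation` assignment pivots a display object around its own registration point, whereas the affine ($) properties pivot it around the touch cluster's center. A comparable pivot-about-a-point effect can be sketched with a standard flash.geom.Matrix (this is an illustration of the underlying idea, not the Open Exhibits implementation):

```actionscript
// Illustrative sketch: rotate a display object about an arbitrary pivot
// point using a matrix, rather than its registration point.
// Requires: import flash.display.DisplayObject; import flash.geom.Matrix;
private function rotateAboutPoint(obj:DisplayObject, pivotX:Number, pivotY:Number, degrees:Number):void
{
    var m:Matrix = obj.transform.matrix;
    m.translate(-pivotX, -pivotY);      // move the pivot to the origin
    m.rotate(degrees * Math.PI / 180);  // rotate around the origin (radians)
    m.translate(pivotX, pivotY);        // move the pivot back
    obj.transform.matrix = m;
}
```

The affine transform methods do the equivalent bookkeeping automatically, recomputing the pivot each frame from the live touch point cluster.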

One of the powerful new features in Open Exhibits 2 is the ability of the “gesture analysis engine” to actively fold multiple property calculations into a single compressed operation. This means that when multiple gestures are added to a touch object, multiple cluster properties can be acquired in parallel while avoiding redundant calculations. The result is that all CPU resources used for managing gesture analysis and display object transformations are fully optimized within a flexible architecture.

* GestureEvents are continually dispatched if the match conditions are continually met. If no touch points are detected on a touch object no analysis is performed and no gesture events are dispatched. This ensures dormant touch objects do not perform unnecessary processing.

Note: In addition to the ActionScript methods used in this tutorial, there are new “native” (CML) methods which allow the construction of new touch objects using a secondary external XML document in addition to the GML document.