Add Interactive Touch Objects with AS3 and GML, Method 2, Part 2

Tutorial Series: Add Interactive Touch Objects with AS3 & GML: Method 2

Tutorial 2: Add Multiple GML-Defined Manipulations

Introduction

In this tutorial we are going to create a simple touch object using ActionScript and attach multiple (GML-defined) multitouch gestures. This method uses the built-in automatic transform methods to directly control how touch objects are manipulated by gestures. This tutorial requires Adobe Flash CS5+ and Open Exhibits 2 (download the SDK).

Getting Started

The first step in creating an interactive touch object is to construct a new instance of the TouchSprite class. This is done in much the same way as creating a Sprite. For example:


var ts0:TouchSprite = new TouchSprite();

As with Sprites and MovieClips; images can be dynamically loaded directly into the display object.


var Loader0:Loader = new Loader();
Loader0.load(new URLRequest("library/assets/crystal2.jpg"));
ts0.addChild(Loader0);
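
The image loads asynchronously, so the bitmap is not available on the line immediately following the load() call. If you need to react once loading has finished (for example, to read the image dimensions), you can listen for the Loader's COMPLETE event. A minimal sketch, using the Loader0 and ts0 objects created above:


import flash.events.Event;

Loader0.contentLoaderInfo.addEventListener(Event.COMPLETE, onImageLoaded);

function onImageLoaded(e:Event):void {
    // The bitmap has finished loading; its dimensions can now be read.
    trace("image loaded: " + Loader0.content.width + " x " + Loader0.content.height);
}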

TouchSprites and TouchMovieClips inherit the complete set of public properties available to Sprites and MovieClips. This allows the display object properties to be treated in the same way. For example, the following code positions, rotates, and scales the touch sprite “ts0”, then places it on the stage:


ts0.x = 200;
ts0.y = 100;
ts0.rotation = 45;
ts0.scaleX = 0.5;
ts0.scaleY = 0.5;
addChild(ts0);

Unlike Sprites and MovieClips, all TouchSprites and TouchMovieClips have the ability to detect and process multitouch gestures as defined in the root GML document. All touch objects in the application have access to the gestures defined in the root GML document, “my_gestures.gml”, located in the bin folder. However, a touch object will only respond to a specific gesture when that gesture is explicitly attached. To attach and activate a gesture on a touch object, add it to the gestureList property and set it to true. This can be done in-line, for example:


ts0.gestureList = {"n-drag":true,"n-scale":true,"n-rotate":true};

This can also be done by creating a new object and then directly assigning it to the gestureList property.


var gList:Object = new Object();
    gList["n-drag"] = true;
    gList["n-rotate"] = true;
    gList["n-scale"] = true;
ts0.gestureList = gList;

This adds the gestures “n-drag”, “n-rotate”, and “n-scale” to the touch object*, effectively activating gesture analysis on “ts0”. Any touch point placed on the touch object is added to the local cluster. The touch object inspects the touch point cluster for a matching gesture “action” and then calculates cluster motion and geometry. The result is then processed and prepared for mapping.
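
For reference, those gesture ids must correspond to Gesture definitions in the root GML document. The skeleton below is an assumption about how a “my_gestures.gml” file is laid out (the wrapper element names may differ in your SDK build); the full definitions of the three gestures used in this tutorial appear later in this page:


<GestureMarkupLanguage>
    <Gesture_set>
        <Gesture id="n-drag" type="drag"> <!-- definition shown below --> </Gesture>
        <Gesture id="n-scale" type="scale"> <!-- definition shown below --> </Gesture>
        <Gesture id="n-rotate" type="rotate"> <!-- definition shown below --> </Gesture>
    </Gesture_set>
</GestureMarkupLanguage>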

The traditional event model in Flash employs the explicit use of event listeners and handlers to manage gesture events on a touch object. However, Gesture Markup Language can be used to directly control how gesture events map to touch object properties and therefore how touch objects are transformed. These tools are integrated into the gesture analysis engine inside each touch object and allow custom gesture manipulations and property updates to occur on each touch object. For example, the “n-drag” gesture used above is defined in the root GML document as follows:


<Gesture id="n-drag" type="drag">
    <match>
        <action>
            <initial>
                <cluster point_number="0" point_number_min="1" point_number_max="5" translation_threshold="0"/>
            </initial>
        </action>
    </match>
    <analysis>
        <algorithm>
            <library module="drag"/>
            <returns>
                <property id="drag_dx"/>
                <property id="drag_dy"/>
            </returns>
        </algorithm>
    </analysis>
    <processing>
        <inertial_filter>
            <property ref="drag_dx" release_inertia="false" friction="0.996"/>
            <property ref="drag_dy" release_inertia="false" friction="0.996"/>
        </inertial_filter>
    </processing>
    <mapping>
        <update>
            <gesture_event>
                <property ref="drag_dx" target="x" delta_threshold="true" delta_min="0.01" delta_max="100"/>
                <property ref="drag_dy" target="y" delta_threshold="true" delta_min="0.01" delta_max="100"/>
            </gesture_event>
        </update>
    </mapping>
</Gesture>

In this example the gesture “n-drag”, as defined in the root GML document (“my_gestures.gml”), directly maps the values returned from gesture processing, “drag_dx” and “drag_dy”, to the “target” properties “x” and “y”. Internally the delta values are added to the “$x” and “$y” properties of the touch object. This translates the object on stage, following the center of the touch point cluster.
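
Conceptually, this mapping amounts to the touch object updating its own position on every frame in which the drag action matches. A rough ActionScript sketch of the idea (illustrative only; dragDX and dragDY stand in for the “drag_dx” and “drag_dy” values returned by gesture analysis, and the real update happens inside the gesture engine):


// Per-frame update equivalent to the <mapping> block above:
ts0.x += dragDX; // "drag_dx" mapped to target "x"
ts0.y += dragDY; // "drag_dy" mapped to target "y"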


<Gesture id="n-scale" type="scale">
    <match>
        <action>
            <initial>
                <cluster point_number="0" point_number_min="2" point_number_max="5" separation_threshold="0"/>
            </initial>
        </action>
    </match>
    <analysis>
        <algorithm>
            <library module="scale"/>
            <returns>
                <property id="scale_dsx"/>
                <property id="scale_dsy"/>
            </returns>
        </algorithm>
    </analysis>
    <processing>
        <inertial_filter>
            <property ref="scale_dsx" release_inertia="false" friction="0.996"/>
            <property ref="scale_dsy" release_inertia="false" friction="0.996"/>
        </inertial_filter>
    </processing>
    <mapping>
        <update>
            <gesture_event>
                <property ref="scale_dsx" target="scaleX" func="linear" factor="0.0033" delta_threshold="true" delta_min="0.0001" delta_max="1"/>
                <property ref="scale_dsy" target="scaleY" func="linear" factor="0.0033" delta_threshold="true" delta_min="0.0001" delta_max="1"/>
            </gesture_event>
        </update>
    </mapping>
</Gesture>

The gesture “n-scale”, as defined in the root GML document “my_gestures.gml”, directly maps the values returned from gesture processing, “scale_dsx” and “scale_dsy”, to the “target” properties “scaleX” and “scaleY”. Internally the delta values are multiplied with the “$scaleX” and “$scaleY” properties of the touch object, which scales the object on stage about the center of the touch point cluster.
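
The same kind of sketch applies here, with the key difference that the deltas combine multiplicatively rather than additively (illustrative only; scaleDSX and scaleDSY stand in for the processed “scale_dsx” and “scale_dsy” values, and exactly how the factor attribute is folded in is handled internally by the engine):


// Per-frame update equivalent to the <mapping> block above:
ts0.scaleX *= scaleDSX; // "scale_dsx" mapped to target "scaleX"
ts0.scaleY *= scaleDSY; // "scale_dsy" mapped to target "scaleY"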


<Gesture id="n-rotate" type="rotate">
    <match>
        <action>
            <initial>
                <cluster point_number="0" point_number_min="2" point_number_max="5" rotation_threshold="0"/>
            </initial>
        </action>
    </match>
    <analysis>
        <algorithm>
            <library module="rotate"/>
            <returns>
                <property id="rotate_dtheta"/>
            </returns>
        </algorithm>
    </analysis>
    <processing>
        <noise_filter>
            <property ref="rotate_dtheta"  noise_filter="false" percent="30"/>
        </noise_filter>
        <inertial_filter>
            <property ref="rotate_dtheta" release_inertia="false" friction="0.996"/>
        </inertial_filter>
    </processing>
    <mapping>
        <update>
            <gesture_event>
                <property ref="rotate_dtheta" target="rotation" delta_threshold="true" delta_min="0.1" delta_max="10"/>
            </gesture_event>
        </update>
    </mapping>
</Gesture>

The gesture “n-rotate”, as defined in the root GML document “my_gestures.gml”, directly maps the value returned from gesture processing, “rotate_dtheta”, to the “target” property “rotation”. Internally the delta value is added to the “$rotation” property of the touch object, and the object is rotated about the center of the touch point cluster on stage.
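
One attribute worth noting in all three mappings is delta_threshold with its delta_min and delta_max bounds. A plausible reading of that gate, sketched in ActionScript (illustrative only; rotateDTheta stands in for the “rotate_dtheta” value, and the engine's actual filtering logic is internal):


// Apply the rotation delta only when it falls inside the configured window,
// ignoring deltas below 0.1 degrees (jitter) and above 10 degrees (spikes).
if (Math.abs(rotateDTheta) >= 0.1 && Math.abs(rotateDTheta) <= 10) {
    ts0.rotation += rotateDTheta;
}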

When touch points are placed on the touch object they are collected into a local “cluster”. The motion of this cluster is inspected for matching gesture actions, and the cluster motion and geometry are then analysed, processed, and returned for mapping. Each gesture activated on the touch object can be triggered independently. When multiple gestures are detected in the same frame, the touch object undergoes a blended transformation, which results in animated motion on stage.

Any gesture process can be interrupted or stopped at runtime by simply setting the gesture to false in the gestureList. For example:


ts0.gestureList = {"n-drag":false,"n-scale":true,"n-rotate":true};

This will halt the continuous* “n-drag” gesture analysis and processing on the touch object without removing it, enabling it to be re-engaged later if required.
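
For example, drag could be suspended while some other piece of application logic runs and then re-engaged afterwards, simply by re-assigning the gestureList (pauseDrag and resumeDrag are illustrative helper names, not part of the SDK):


function pauseDrag():void {
    // Re-assign the list with "n-drag" disabled; scale and rotate stay active.
    ts0.gestureList = {"n-drag":false, "n-scale":true, "n-rotate":true};
}

function resumeDrag():void {
    // Re-engage drag analysis and processing.
    ts0.gestureList = {"n-drag":true, "n-scale":true, "n-rotate":true};
}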

Adding the three gestures “n-drag”, “n-scale”, and “n-rotate” in combination allows the touch object to be interactively moved, scaled, and rotated around a dynamic center of motion. This gives a “natural” feel to the touch object, as it can pivot and scale around any touch point in the cluster. These “affine” transformations are managed internally by the TouchSprite.

One of the powerful new features in Open Exhibits 2 is the ability of the “gesture analysis engine” to actively fold multiple property calculations into a single compressed operation. This means that when multiple gestures are added to a touch object, multiple cluster properties can be acquired in parallel while avoiding redundant calculations. The result is that the CPU resources used for gesture analysis and display object transformations are optimized by a micro-engine custom-built around the requirements of each touch object.

The benefit of using this method is that complex gesture-based manipulations can be added to a touch object in a few simple lines of code. The manipulations are internally managed by the TransformManager class integrated into TouchSprite and TouchMovieClip. Any transformations that occur on a touch object can be inspected (if required) by listening for GestureEvents or TransformEvents, but can otherwise be managed entirely “automatically”.
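
For example, a listener can be attached purely to observe what the automatic transforms are doing. A rough sketch, assuming the GWGestureEvent class and its DRAG event type as exposed by the Open Exhibits/GestureWorks SDK (check your SDK version for the exact class name, import path, and payload fields):


ts0.addEventListener(GWGestureEvent.DRAG, onDragInspect);

function onDragInspect(e:GWGestureEvent):void {
    // Inspection only: the internal transform manager has already applied the deltas.
    trace("drag_dx: " + e.value.drag_dx + ", drag_dy: " + e.value.drag_dy + ", x: " + ts0.x + ", y: " + ts0.y);
}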

This method mimics the best practices used when dynamically assigning object-based media assets and formatting. In effect, it provides a framework that fully externalizes object gesture descriptions and interactions. As a result, developers can efficiently refine UI/UX interactions without the need to recompile applications.

* GestureEvents are continually dispatched if the match conditions are continually met. If no touch points are detected on a touch object no gesture events are fired. This ensures dormant touch objects do not require unnecessary processing.