Open Exhibits - Blog



New Tutorials for Open Exhibits 3.0+

Open Exhibits has undergone major changes for its 3.0 release. The biggest changes are in how the system handles 3D gestures with the Leap Motion Controller and renders with Away3D. To get up to speed with the new changes, we have created a number of new tutorials.

To learn more about 3D gesture construction and configuration with GML, follow these tutorials for detailed walkthroughs of our existing gesture engine features and the Gesture Markup Language schema. Introductory tutorials are also in place to demonstrate the framework's expanded multimodal capabilities, the recently integrated 3D motion-point processing, 3D gesture construction, and 3D spatial interactions.

For tutorials on rendering with Away3D, please start with this one and try out the examples included with your download or browse the source on GitHub. More tutorials are also listed below in the rendering section.

New tutorial list

Input Modes

Advanced Gestures

Editing Gestures

Gesture Filters

Gesture Targeting

Motion Gestures


More Info

by Ken Willes on Jan 8, 2014

25,000 Downloads of Open Exhibits!

We just surpassed 25,000 downloads of Open Exhibits software! The release of Open Exhibits 3 has really accelerated the download count and we're thrilled to see how many developers have already tried it out. If you want to know more about what is included in Open Exhibits 3, please see the announcement from last week.

Over the last three years, we've worked hard to develop a powerful and flexible HCI software framework. Of course, we can't take all of the credit; we've had a lot of input and help from all of you in the Open Exhibits community. Your help in building modules, your suggestions, and your support have made this major milestone possible.

Open Exhibits has also received major support from the National Science Foundation (DRL#1010028). NSF backing and the support from the Informal Science Education (ISE) community at-large helped launch Open Exhibits with the idea that it could help transform how computer-based exhibits are developed in museums, planetariums, and other ISE venues. There is plenty of work still to be done, but 25,000 downloads is a milestone worth celebrating.

Thanks everyone and congratulations!

More Info

by Jim Spadaccini on Nov 21, 2013

Major Release: Open Exhibits 3

We are excited to announce the Open Exhibits 3 release which includes a number of significant and exciting improvements to the free and open HCI software framework.

The most fundamental change to the framework is in how Open Exhibits 3 works in 3D space. This improvement encompasses not only how objects are rendered with Away3D, but also how motion gestures are analyzed behind the scenes for devices like the Leap Motion Controller.

Away3D Support
With the new Away3D support, three-dimensional models may be authored in external programs like Maya, 3DS Max, Cinema4D, and Blender and imported into an Open Exhibits project through simple CML (Creative Markup Language) declarations. One of the exhibits we provide in this release includes a 3D molecule viewer. Please check out our 3D tutorials for interactive display objects to see all of the amazing things you can build quickly.
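As a rough sketch of what such a declaration might look like, the following CML fragment loads an external model and attaches gestures to it. The `ModelDisplay` element name and `src` attribute here are illustrative placeholders, not the exact Open Exhibits schema; the `GestureList`/`Gesture` pattern follows standard CML usage.

```xml
<!-- Sketch only: ModelDisplay and src are hypothetical names;
     consult the Away3D/CML tutorials for the shipped element set. -->
<TouchContainer>
    <ModelDisplay src="models/molecule.obj">
        <GestureList>
            <Gesture ref="n-drag" gestureOn="true"/>
            <Gesture ref="n-rotate" gestureOn="true"/>
        </GestureList>
    </ModelDisplay>
</TouchContainer>
```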

3D interactions are handled in the same manner as 2D interactions, so previous methods of working with interactive display objects (CML) and gesture definition assignments (GML, Gesture Markup Language) have been preserved. Another feature of 3D support is that any new 2D applications you create will automatically take advantage of the 3D rendering pipeline by representing your 2D objects as textures on 3D objects. This means that your exhibit will benefit from a performance boost, since rendering instructions are completely offloaded to the GPU.
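For reference, a gesture definition assignment in GML typically looks like the sketch below: a multi-finger drag that matches a touch cluster, analyzes it with a kinemetric algorithm, and maps the results onto the target object's x and y properties. This follows the public GML schema; attribute details may vary between versions.

```xml
<Gesture id="n-drag" type="drag">
    <match>
        <action>
            <initial>
                <!-- match any cluster of 1 to 5 touch points -->
                <cluster point_number="0" point_number_min="1" point_number_max="5"/>
            </initial>
        </action>
    </match>
    <analysis>
        <algorithm class="kinemetric" type="continuous">
            <library module="drag"/>
            <returns>
                <property id="drag_dx" result="dx"/>
                <property id="drag_dy" result="dy"/>
            </returns>
        </algorithm>
    </analysis>
    <mapping>
        <update dispatch_type="continuous">
            <gesture_event type="drag">
                <!-- apply the per-frame deltas to the object's position -->
                <property ref="drag_dx" target="x"/>
                <property ref="drag_dy" target="y"/>
            </gesture_event>
        </update>
    </mapping>
</Gesture>
```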

Leap Motion Support
Open Exhibits 3's true 3D gesture recognition support opens up a whole new set of possibilities. With support for an inexpensive device like the Leap Motion Controller, Open Exhibits 3 expands the ways of interacting with exhibits. Along with support for 3D gestures, the framework includes visual feedback indicators that let visitors know where their hands are relative to the interactive display and what types of interactions are possible. Users can see how their hands can grab an object to manipulate it or gesture in mid-air for more information.

Combining the 3D interaction space with the 2D touch surface also opens up new possibilities for multimodal exhibits, where touch and motion can be used together. The Open Exhibits 3 framework has significantly improved the accuracy and gesture recognition capabilities of the Leap Motion Controller, making it easier for developers to author exhibits than by using Leap Motion's API alone.

New Modules and UI Components
Open Exhibits 3 also contains a set of very useful new software modules. Highlights include an HTML browser element that allows you to explore the web, text-to-speech and speech-to-text through Microsoft SAPI (for accessibility), updated Starling support, a touch input recorder, the Gesture Visualizer, and much more. Ultimately, we want to provide the Open Exhibits community with the best tools possible to create compelling experiences for visitors. Download Open Exhibits for free today and try it for yourself.

Open Exhibits 3 Features

  • Away3D support
  • CML support for Away3D scene construction
  • Leap Motion integration
  • Touch 3D gesture integration
  • Dozens of 3D motion gestures
  • CML primitive elements are now gesture-ready
  • Gesture Visualizer
  • Touch input recorder utility
  • Text-to-speech and speech-to-text through Microsoft SAPI
  • HTML element and viewer (load URL or write inline HTML within CML)
  • Paint element and viewer (export as SVG or PNG)
  • SVG element
  • Open sourced CML
  • Radial Slider
  • More examples
  • New CML state manager and improved internal State Machine
  • 100+ CML Tags
  • Improved CML selector support including DOM and jQuery-like $ selectors
  • State support for RenderKit
  • Enhanced input mode support
    • allow input modes on a per-object basis
    • automatic mouse filtering when mouse and touch events are doubled by OS
  • Load multiple GML files
  • New relative layout system for containers
  • Updated Starling support
  • LoaderMax integration
  • SlideMenu element

Open Exhibits Major Bug Fixes and Feature Requests from the Last Release

  • Object transformation boundary limits in GML
  • Individual object transformation boundary limits
  • Finger count settings for hold and tap in GML
  • Improved TUIO support includes FLOSC, TUIO via TCP, and TUIO via UDP
  • Improved virtual touch object workflow
  • Magnifier performance improved
  • Modest Maps improved

More Info

by Ken Willes on Nov 13, 2013

Open Exhibits & CMME: Accessibility in Interactive Exhibits

Open Exhibits takes another step toward accessibility in the latest release with the addition of text-to-speech and speech-to-text capabilities. The new Open Exhibits 3.0 release includes an example project containing an interface to the Microsoft Speech Application Programming Interface (SAPI). This technology allows you to create applications that are accessible to visually impaired users:

  • Command your exhibit using voice commands by setting up a vocabulary of recognizable words and phrases.
  • Describe the contents of your applications using text-to-speech synthesis.
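The two capabilities above could be declared along these lines in CML. This is a purely hypothetical sketch: the `SpeechElement`, `Vocabulary`, `Phrase`, and `TextToSpeech` names are illustrative inventions, not the shipped speech API; see the SAPI example project in the 3.0 release for the actual interface.

```xml
<!-- Hypothetical sketch: element and attribute names are
     illustrative, not the actual Open Exhibits speech schema. -->
<SpeechElement>
    <!-- speech-to-text: a fixed vocabulary of recognizable commands -->
    <Vocabulary>
        <Phrase text="next slide" event="nextSlide"/>
        <Phrase text="describe image" event="describeImage"/>
    </Vocabulary>
    <!-- text-to-speech: spoken description of on-screen content -->
    <TextToSpeech text="Welcome to the exhibit. Say 'describe image' to hear a description."/>
</SpeechElement>
```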

This initiative was made possible in part by the Creating Museum Media for Everyone (CMME) project, a collaborative effort of the Museum of Science, the WGBH National Center for Accessible Media, Ideum, and Audience Viewpoints. This project furthers the science museum field's understanding of ways to research, develop, and evaluate digital interactives that are inclusive of all people.

Future developments of the Open Exhibits + CMME initiative will make audio accessibility more streamlined within the SDK including:

  • Integration within the UX library.
  • Automatic voice-accessibility mode to more easily incorporate the technology.
  • Built-in navigation system for visually impaired users targeting multiuser, multitouch displays and environments.

View the CMME webpage for more information.

More Info

by Ken Willes on Nov 13, 2013

HCI+ISE Conference Proceedings Available Online

The proceedings of the HCI+ISE conference, held in June, are now available online. You can read the complete document in the Papers section of the Open Exhibits site or on ISSUU. An accessible PDF version is in the works. Special thanks to Catherine McEver of The Bureau of Common Sense for documenting the conference so thoroughly.

More Info

by Nora Galler on Oct 8, 2013