


When Multi-touch Meets Streaming


With the advent of mobile devices with large displays, it is intuitive and natural for users to interact with an application on a mobile device using multi-touch gestures. In this paper, we propose that these multi-touch gestures can be streamed on-the-fly among multiple participating users, making it possible to engage users in a collaborative or competitive experience. Such multi-touch streams, featuring very low streaming bit rates, can be rendered on receivers to precisely reconstruct the states of an application. We present the challenges, system framework, embedded algorithm design, and real-world evaluation of TouchTime, a new system that has been designed from scratch to facilitate the streaming of multi-touch gestures among multiple users. By seamlessly combining local computation on mobile devices and services from the “cloud,” we explore the design space of suitable mechanisms to represent and packetize multi-touch gestures, and of practical protocols to transport concurrent live multi-touch streams over the Internet. Specifically, we propose an auction-based reflector selection algorithm to minimize the end-to-end delay in a live multi-touch streaming session. To demonstrate TouchTime, we have developed a new real-world music composition application — called MusicScore — using the Apple iPad Programming SDK, and used it as our running example and experimental testbed to evaluate our design choices and implementation of TouchTime.
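To make the reflector-selection idea concrete, the sketch below is a simplified stand-in for the auction-based algorithm the abstract proposes: it simply enumerates candidate reflectors and picks the one that minimizes the worst-case end-to-end delay between any pair of participants. The function name, the delay matrix, and all reflector/user labels are illustrative assumptions, not the paper's actual protocol.

```python
def select_reflector(delays):
    """Pick the reflector minimizing worst-case pairwise end-to-end delay.

    delays[r][u] is an assumed measured one-way delay (ms) from candidate
    reflector r to user u. The end-to-end delay between users i and j relayed
    through r is delays[r][i] + delays[r][j]; for nonnegative delays, the
    maximum over all user pairs is the sum of the two largest entries.
    """
    best_r, best_cost = None, float("inf")
    for r, d in delays.items():
        worst_two = sorted(d.values(), reverse=True)[:2]
        cost = sum(worst_two)
        if cost < best_cost:
            best_r, best_cost = r, cost
    return best_r, best_cost

# Example: three hypothetical candidate reflectors serving three users.
delays = {
    "tokyo":     {"u1": 40, "u2": 120, "u3": 90},
    "frankfurt": {"u1": 80, "u2": 30,  "u3": 60},
    "virginia":  {"u1": 50, "u2": 70,  "u3": 45},
}
print(select_reflector(delays))  # -> ('virginia', 120)
```

An auction-based scheme would let reflectors bid based on such delay measurements rather than rely on a central enumeration, but the selection objective, minimal end-to-end delay across the live session, is the same.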