interdisciplinary research studio-lab


Low Cost, Open Source Wireless Sensor Infrastructure for Live Performance and Interactive, Real-Time Environments


Project Description:

SENSE/STAGE will develop new wireless sensor and software tools that can be used to create innovative and compelling forms of interactive, real-time performance. This term encompasses live performances ranging from traditional theatre to new gaming environments that use sensors and computers to create dynamic and interactive scenography. Scenography is the creation of an immersive and total theatrical stage environment through scenery, lighting and sound.

Traditionally, the control of lighting, sound and mechanical scenery in live performance has been achieved with pre-programmed, cue-based software: all choices have to be planned out beforehand. Our main objective is to develop innovative tools to create a new kind of scenographic ecology in which performer and stage environment mutually influence each other. By understanding what is taking place in the stage environment by way of sensors, our software will be used to create interesting and surprising behaviours for light, sound and the mechanical movement of scenery, rather than controlling these in a predetermined way. The human performer can directly influence the system’s behaviour through his/her actions and movements. The performer’s movements or gestures can also affect the system indirectly over a longer time period, causing the behaviour of the scenography to be surprising and unpredictable to performer and audience alike.

The project includes interrelated technical and creative objectives. The technical objectives include the development of new wireless sensor hardware and software tools to enable the creation of an interactive, scenographic ecology where performer and lighting, sound and mechanical scenery can mutually influence each other in unpredictable and surprising ways. The technical work in the project will be guided by the team’s extensive experience understanding the potentials and limitations of new interactive technologies used in live performance. Specifically, the objectives include three aspects:

1) Develop Wireless Sensing Devices:

Following an initial and extensive analysis of existing sensor technologies and their potential for live performance, which will include collaborative experimentation sessions with team members and professional artists, we will develop an innovative wireless sensor infrastructure specifically designed for live performance conditions.

A series of small electronic circuit boards, each the size of a large postage stamp, will be developed which can be easily mounted on actors or dancers, scenery and props (see Figure 1). Each sensor processing board will be “plug and play:” directors, choreographers and scenographers will be able to easily hook up multiple sensors to the boards. These sensor boards will sense the movement, position and acceleration of an actor or dancer and relay this data wirelessly among the boards to “understand” and monitor what is happening around them. The custom electronics will be energy efficient: low power, able to run for extended periods (6-9 hours) and use rechargeable batteries.
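To make the idea of boards relaying sensor data concrete, the sketch below shows one hypothetical way a node might package a reading for wireless broadcast. The field layout, names and units are illustrative assumptions, not the project's actual protocol.

```python
import struct
import time

def pack_reading(node_id, accel_xyz, timestamp=None):
    """Pack a node id, a 3-axis acceleration reading and a timestamp
    into a compact byte string suitable for a small radio packet."""
    if timestamp is None:
        timestamp = time.time()
    # "<Bfffd": little-endian; B = node id (0-255), fff = x/y/z
    # acceleration as 32-bit floats, d = timestamp as a 64-bit double
    return struct.pack("<Bfffd", node_id, *accel_xyz, timestamp)

def unpack_reading(packet):
    """Inverse of pack_reading: recover id, acceleration and timestamp."""
    node_id, x, y, z, timestamp = struct.unpack("<Bfffd", packet)
    return node_id, (x, y, z), timestamp

# A reading from hypothetical board 7: roughly 1 g downward, slight tilt
packet = pack_reading(7, (0.01, -0.98, 0.12), timestamp=1234.5)
node_id, accel, ts = unpack_reading(packet)
```

Keeping each reading to a fixed 21 bytes is the kind of constraint that matters for low-power radios, where shorter packets mean less airtime and longer battery life.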

In order to choose appropriate sensing devices for the wireless platform, we will work with professional performance artists in extensive experimentation rehearsals to explore a wide range of sensory information that can be used as input to our system. From this experimentation we will also specify a range of low cost sensing devices for the wireless boards as well as accompanying documentation with “scenarios for use.”

2) Develop Interactive System Software:

The second tool is open-source, cross-platform software that can intelligently control the behaviour of lighting, audio and physical scenery in live performance. To create such interaction, the software will implement current scientific research in dynamical systems, i.e. systems that change continuously over time depending on the current input, past input and the internal state of the system. The sensors will be used to understand what is happening on stage. This sensor information will then be processed by the software and “fed” to lighting, sound and other devices to create potentially surprising behaviours. Interaction will take place at the level of performer and scenographic environment, with the software facilitating both direct and indirect interaction. Direct interaction will allow performers to play the stage environment like a musical instrument, while indirect interaction will allow the scenographic environment to evolve and behave on its own.
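The direct/indirect distinction can be sketched as a minimal dynamical system: the output depends on the current sensor input (direct) and on a slowly decaying internal state that accumulates past input (indirect). The class, names and constants below are illustrative assumptions, not the project's code.

```python
class ScenographyChannel:
    """Toy dynamical system mapping a sensor stream to a 0..1 intensity."""

    def __init__(self, decay=0.95, direct_gain=0.5, indirect_gain=0.5):
        self.state = 0.0              # internal state: "memory" of past input
        self.decay = decay            # how slowly the memory fades
        self.direct_gain = direct_gain
        self.indirect_gain = indirect_gain

    def step(self, sensor_value):
        # indirect path: the state integrates past input and fades over time
        self.state = self.decay * self.state + (1 - self.decay) * sensor_value
        # the output mixes the immediate input with the accumulated state
        out = self.direct_gain * sensor_value + self.indirect_gain * self.state
        return max(0.0, min(1.0, out))  # clamp to a valid intensity

light = ScenographyChannel()
# a burst of movement followed by stillness: because of the internal
# state, the light keeps responding after the performer stops moving
levels = [light.step(v) for v in [1.0, 1.0, 0.0, 0.0, 0.0]]
```

Even this tiny example shows the key property the text describes: once the input stops, the output does not drop to zero but decays on its own schedule, so the environment's behaviour is not a one-to-one echo of the performer.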

Artistic Results of Research:

Chronotopia – dance-theater work in collaboration with Attakkalari Centre for Movement Arts

Bangalore, India

Interactive Scenography

Chronotopia is a collaboration with the Bangalore, India-based movement arts centre Attakkalari. A specially designed, wirelessly controlled lighting system utilizing technologies developed in the Sense/Stage project is influenced by the rhythms and motion of the dancers, creating an abstract, pulsing visual stage environment for Attakkalari’s choreographic re-imagining of the ancient Tamil epic poem Silappatikaram (The Tale of the Anklet).

A series of 9-meter-tall columns each holds six Cold Cathode Fluorescent Lights (CCFLs) encased in acrylic cylinders, together forming a matrix of 36 individually controllable lighting elements. Additionally, three wirelessly controlled CCFLs are mounted into pedestals, which are carried around the stage by the performers and function as sacred objects in the work’s dramaturgical structure.

The stage’s scenographic environment consists of the lighting system, in front of which is suspended a large-scale, semi-translucent projection screen onto which a series of five iconic yet visually abstract textures is projected. Top and side cameras analyze the flow of motion from the dancers and provide simultaneous input to both the lighting and projection systems. The lighting responds to the performers’ motion by triggering set patterns, coupling directly with the dancers, or creating abstract traces based on different rhythmic and temporal patterns.
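As a rough illustration of how a camera analysis stage can reduce successive frames to a single motion value that drives the lighting, the sketch below uses simple frame differencing over grayscale pixels. This is an assumed, simplified stand-in, not the system's actual computer-vision pipeline.

```python
def motion_amount(prev_frame, frame):
    """Mean absolute pixel difference between two same-size grayscale
    frames (pixel values 0-255), normalized to a 0..1 motion value."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, frame))
    return total / (255.0 * len(frame))

# Tiny 4-pixel "frames": half the frame changes completely between them
prev = [0, 0, 0, 0]
curr = [255, 255, 0, 0]
m = motion_amount(prev, curr)   # 0.5: half of the maximum possible change
```

A scalar like this can then be thresholded to trigger set patterns, mapped continuously to couple light levels to the dancers, or filtered over time to produce the slower rhythmic traces the text describes.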

The lighting scenography is inspired both by the abstract images of bodily gestures prevalent in traditional Indian dance forms such as Bharatanatyam and by the ubiquitous presence of fluorescent lighting fixtures in everyday life in India.




Attakkalari India Biennial 2009, Ranga Shankara Theater, Bangalore, India, February 2009

Music Academy, Chennai, India, February 2009

Niraswaram, Heggodu, India, February 2009

European venues in 2010 in Göteborg and Frankfurt

Just Noticeable Difference (JND) – interactive sensory environment

JND is a portable sensory-reduction environment, 1.2 m wide x 2.5 m long x 3 m high, combining near darkness with extraordinarily low levels of tactile vibration, flicker-based, color-changing light, and sound at the thresholds of human sense perception. The installation admits one person at a time, who lies down inside the space for 15 minutes on a custom-developed floor. Micro-movements, ranging from muscle twitches to larger body motions, are measured in real time and affect the patterns and intensity of a composition of touch, light and sound. The installation explores the fluctuation between noise and order at the level of bodily perception and the resulting shifts in the experience of self that might occur.

EMPAC (Experimental Media and Performing Arts Center), Rensselaer Polytechnic Institute (RPI), Troy, New York, March 2010
Other European venues TBA

Other projects utilizing Sense Stage technologies:



Arctic Perspective




Sense/Stage workshop





“Sharing Data in Collective Interactive Performances: The SenseWorldDataNetwork.” Submitted to the 9th International Conference on New Interfaces for Musical Expression (NIME 2009), Pittsburgh, PA, USA

Other publications in development

Funder: SSHRC (Social Sciences and Humanities Research Council)

Program: Standard Research Grants: Research and Creation

Dates: May 2007-May 2010

Collaborators: labXmodal (Concordia University) and IDMIL (Input Devices and Music Interaction Laboratory, McGill University)


  • Chris Salter (PI)
  • Dr. Marije A.J. Baalman
  • Harry C. Smoak
  • Brett Bergmann
  • Nicholas Munoz
  • Elio Bidinost
  • Vincent de Belleval


  • Dr. Marcelo M. Wanderley (co-PI)
  • Joseph Malloch
  • Mark Marshall
  • Joseph Thibodeau

Status: In Progress