research & development

Gesture-Based Show Control:

Handing Over Control to the Performers


Sherman Oaks, California

Real-time show control has always been one of the key forces driving R&D efforts at b/i. Traditionally, show control lies with the stage manager or the board operator. But what happens if this power to control the show is given to the performer? Our quest to answer this question dates back to 2001 and led us to the development of Diva and many of the sensored props and technology discussed elsewhere.


If we want to create an interactive, performer-driven system, we must quantify movement, and that means not only sensors but also the math to recognize what the performer is doing and the show control tools to trigger various design elements and in-theater effects. Sensors and microcontrollers have evolved considerably in the years since we started working with embedded electronics. Robust sensors, faster processors, wireless radios, and dime-sized microcontrollers can give us a very good idea of how a performer (or a guest with a sensored prop) is moving at any given moment and over time.


Today we have many embedded options to sense every joint, limb, and movement. But given our roots in live performance, we know that weighing down a performer with a bevy of sensors is not only cumbersome but also changes how they move and physically interact with their audience. Motions become awkward and calculated, and performances become rigid. This presents us with an interesting challenge: how much information can we derive from a minimal number of sensors, and can we create a system that recognizes natural gestures and movements in order to trigger show and story cues? Our answer evolved into our gesture control ecosystem.

[Photo: b/i Gesture Control Ecosystem]

It is amazing how much movement, gesture, and behavior you can derive from the motion of a single hand. So our answer to the first question above was a gesture glove. The glove combines bend/pressure finger sensors with a sophisticated motion and orientation sensor. With this simple combination of sensors on a single hand, we track over 40 gestures. Variations of the glove sensors have been placed on wrists and hand-held props as well.
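To make that concrete, here is a minimal sketch of how a static pose could fall out of that sensor combination. This is purely our illustration, not b/i's firmware: the GloveFrame fields, the thresholds, and the gesture names are all hypothetical.

#include <cmath>
#include <cstdio>

// Hypothetical snapshot of the glove's sensors: five finger bend/pressure
// channels plus orientation from the motion & orientation sensor.
struct GloveFrame {
    float finger[5];   // normalized bend: 0.0 = straight, 1.0 = fully curled
    float pitchDeg;    // hand pitch from the orientation sensor
    float rollDeg;     // hand roll
};

enum class Gesture { None, Fist, OpenPalm, PalmUp, Point };

// Minimal threshold-based classifier. A production system would also track
// motion over time; this only shows how a static pose falls out of a few
// bend values and one orientation reading.
Gesture classify(const GloveFrame& f) {
    int curled = 0, straight = 0;
    for (float b : f.finger) {
        if (b > 0.7f) ++curled;
        if (b < 0.2f) ++straight;
    }
    if (curled == 5)                                    return Gesture::Fist;
    if (straight == 5 && std::fabs(f.rollDeg) > 150.0f) return Gesture::PalmUp;
    if (straight == 5)                                  return Gesture::OpenPalm;
    if (f.finger[1] < 0.2f && curled >= 3)              return Gesture::Point;  // index extended
    return Gesture::None;
}

int main() {
    GloveFrame fist{{0.9f, 0.85f, 0.9f, 0.8f, 0.9f}, 0.0f, 0.0f};
    std::printf("classified: %d\n", static_cast<int>(classify(fist)));  // prints 1 (Fist)
}

Static poses like these are only a starting point; the dynamic half of a 40-gesture library would presumably come from comparing frames like these over time.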


The glove and its sensors are, of course, only part of the picture. The other elements are the gesture microcontroller, which processes the sensor data and derives the gestures; a gesture visualizer; and a show designer that helps designers correlate gestures with cues.
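One way to picture how those pieces talk to one another is a small gesture-event record that the microcontroller broadcasts and that both the visualizer and the show control layer consume. The sketch below is a guess at that shape; b/i has not published a wire format, and every field here is an assumption.

#include <cstdint>
#include <cstdio>

// Hypothetical event emitted once a gesture is recognized. Fixed-size and
// small so it fits comfortably in a single radio packet.
struct GestureEvent {
    uint8_t  performerId;   // which glove/costume sent this
    uint8_t  gestureId;     // index into the gesture library
    uint8_t  confidence;    // 0-255: how certain the recognizer is
    uint32_t timestampMs;   // milliseconds since show start, for logs/replay
};

// The visualizer and the show controller can share one consumer interface:
// the visualizer draws events, the controller maps them to cues.
void onGestureEvent(const GestureEvent& e) {
    std::printf("performer %u gesture %u (conf %u) at %lums\n",
                unsigned(e.performerId), unsigned(e.gestureId),
                unsigned(e.confidence),
                static_cast<unsigned long>(e.timestampMs));
}

int main() {
    onGestureEvent({1, 17, 230, 45120});
}

Keeping the event small and timestamped would also make it trivial to log every gesture verbatim for after-the-fact review.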

Gesture Micro, Viewer & Controller

The b/i gesture microcontroller is the heart of the b/i gesture ecosystem. It is a third to half the size of a typical wireless mic transmitter and is designed to be stealthily embedded in costumes.


The Gesture Micro Viewer & Controller app serves multiple functions. In the wild, it lets us quickly verify and calibrate sensors just before performers go on stage. During shows, it lets us verify gestures and show flow and remotely trigger show beats or scene changes. In rehearsals, it helps train performers on gesture patterns and show structure. The software logs gestures, records data for new gestures, and even takes performers to "gesture school" to help them learn the nuances of each gesture.
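The pre-show calibration step lends itself to a short illustration. The routine below is a minimal sketch under our own assumptions (the BendCalibration name and the raw sensor range are invented): the performer flexes and relaxes each finger while the app records the extremes, and raw readings are then normalized to that performer's range for the rest of the show.

#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Hypothetical per-channel calibration: track the raw extremes seen while
// the performer flexes, then map raw readings into a 0..1 bend value.
struct BendCalibration {
    int rawMin = 4095;   // start inverted so the first sample sets both ends
    int rawMax = 0;

    void observe(int raw) {           // called during the calibration wave
        rawMin = std::min(rawMin, raw);
        rawMax = std::max(rawMax, raw);
    }

    float normalize(int raw) const {  // used for the rest of the show
        if (rawMax <= rawMin) return 0.0f;
        raw = std::clamp(raw, rawMin, rawMax);
        return float(raw - rawMin) / float(rawMax - rawMin);
    }
};

int main() {
    BendCalibration index;
    for (int raw : {812, 905, 2950, 3100, 840})   // one flex/relax cycle
        index.observe(raw);
    std::printf("bend at raw 2000 = %.2f\n", index.normalize(2000));  // ~0.52
}

A quick per-performer pass like this is what would let calibration happen moments before stepping on stage, even as the glove's fit changes from night to night.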

Gesture Show Builder

The Gesture Show Builder is an app that lets you match gesture, beat, and story to show elements (e.g., cues) and have it all reside on the gesture microcontroller. And we know our audience: the app completely eliminates the need for any programming knowledge. All you need is an idea for a show and some light and sound cues, and you can be up and running in less than an hour.
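Although the Show Builder's internals are not described here, the reason no programming is needed is presumably that the app compiles the designer's choices into plain data. The sketch below shows one plausible shape for that data, a lookup table of scene/gesture/cue rows living on the microcontroller; every name and value in it is invented for illustration.

#include <cstdint>
#include <cstdio>

// Hypothetical compiled show: each row says "in this scene, this gesture
// fires this cue". The builder writes a table like this to the micro; the
// designer only ever sees the gesture/cue pairing UI.
struct CueMapping {
    uint8_t scene;      // show beat / scene number
    uint8_t gestureId;  // from the gesture library
    uint8_t cueId;      // light/sound cue to trigger
};

constexpr CueMapping kShow[] = {
    {1, 3, 10},   // scene 1: gesture 3 -> lights up
    {1, 7, 11},   // scene 1: gesture 7 -> thunder sound
    {2, 3, 24},   // scene 2: same gesture, different cue
};

// Runtime lookup on the microcontroller: find the cue for the current
// scene and the gesture just recognized. Returns -1 if nothing matches.
int cueFor(uint8_t scene, uint8_t gestureId) {
    for (const auto& m : kShow)
        if (m.scene == scene && m.gestureId == gestureId) return m.cueId;
    return -1;
}

int main() {
    std::printf("scene 2, gesture 3 -> cue %d\n", cueFor(2, 3));  // prints 24
}

With the show reduced to rows like these, "building" it becomes a matter of editing data, exactly the kind of task a visual app can expose without code.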
