This initial stage trialled the proposed technology. Working with a contemporary dancer and a martial artist, I wanted to test the responsiveness and range of the hardware and software by creating an environment similar to the proposed concept, Mashup Realities.

The hardware included Microsoft’s Kinect sensor, a laptop computer and a projector. The software, Synapse for Kinect, developed by programmer Ryan Challinor, captured motion and controlled the audio via Ableton Live, a music production and performance application. Put simply, this set-up created an environment where specific gestures trigger notes, sounds and beats.
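For anyone curious about the plumbing between Synapse and the rest of the chain, the sketch below shows roughly how the joint data travels. It is not code from the session, just a minimal illustration assuming Synapse's documented defaults: joint positions arrive over OSC on localhost port 12345, control messages are accepted on 12346, and each joint only streams while it keeps receiving a periodic `/<joint>_trackjointpos` keep-alive.

```python
# Minimal sketch of the Synapse -> OSC leg of the chain, assuming Synapse's
# default behaviour (ports and address names are from Synapse's docs, not
# from this project): it emits joint data on localhost UDP port 12345,
# listens for commands on 12346, and drops a joint after a few seconds
# unless the "/<joint>_trackjointpos 1" keep-alive is re-sent.
import threading

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

SYNAPSE_IN = SimpleUDPClient("127.0.0.1", 12346)   # commands to Synapse
JOINTS = ["lefthand", "righthand", "head"]

def keep_alive() -> None:
    """Ask Synapse to keep streaming body-relative positions for each joint."""
    for joint in JOINTS:
        SYNAPSE_IN.send_message(f"/{joint}_trackjointpos", 1)
    threading.Timer(2.0, keep_alive).start()       # re-arm before the timeout

def on_joint(address: str, x: float, y: float, z: float) -> None:
    # Each frame arrives as three floats, e.g. /lefthand_pos_body -0.2 0.4 1.1
    print(address, x, y, z)

dispatcher = Dispatcher()
for joint in JOINTS:
    dispatcher.map(f"/{joint}_pos_body", on_joint)

keep_alive()
BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher).serve_forever()
```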

We were working in a studio approximately 5 x 5 metres. The sensor was placed at the front, to one side, and the computer screen was projected onto the main white wall. The software captured the performer’s movement via the sensor and visualised the figure as a red skeleton, or wireframe. Each of the wireframe’s ten tracked points (Head, Torso, Elbow (L), Elbow (R), Hand (L), Hand (R), Knee (L), Knee (R), Foot (L), Foot (R)) was measured and mapped as a constant stream of X, Y, Z positions. This data was fed into Ableton Live, which then manipulated the audio. For example, a left hand held high above the head plays a high note. Move that hand down below the hip and a low note plays. A sharp move towards the sensor triggers a beat.
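The actual mapping lived inside Ableton Live, but purely as an illustration of the logic just described, here is a minimal Python sketch building on the listener above: hand height (Y) is scaled to a pitch, and a sharp frame-to-frame drop in depth (Z) fires a beat. The virtual MIDI port name "SynapseBridge", the note range and the 0.15 m threshold are all hypothetical placeholders, not values from the session.

```python
# Illustrative only: the high-note / low-note / beat mapping described above,
# written as a handler for the left-hand frames from the earlier listener.
# "SynapseBridge" is a hypothetical virtual MIDI port routed into Ableton Live.
import mido

PORT = mido.open_output("SynapseBridge", virtual=True)  # needs python-rtmidi
LOW_Y, HIGH_Y = -0.6, 0.6    # assumed body-relative hand range, in metres
KICK_NOTE = 36               # C1, a common drum-rack kick slot
last_note, last_z = None, None

def height_to_note(y: float) -> int:
    """Scale hand height to two octaves: hip-low -> MIDI 48, overhead -> 72."""
    clamped = min(max(y, LOW_Y), HIGH_Y)
    return 48 + round((clamped - LOW_Y) / (HIGH_Y - LOW_Y) * 24)

def on_left_hand(x: float, y: float, z: float) -> None:
    global last_note, last_z
    note = height_to_note(y)
    if note != last_note:    # retrigger only when the pitch actually changes
        PORT.send(mido.Message("note_on", note=note, velocity=90))
        last_note = note     # (note-offs omitted for brevity)
    # A sharp frame-to-frame drop in depth reads as a push towards the sensor.
    if last_z is not None and last_z - z > 0.15:
        PORT.send(mido.Message("note_on", note=KICK_NOTE, velocity=127))
    last_z = z
```

In practice, thresholds like the 0.15 m depth jump would need tuning to the performer and the frame rate; a guess is given here only to make the sketch runnable.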

The visualisation was smooth and instant, while the audio was hit and miss. There were moments when we lost the wireframe and needed to recalibrate; it appeared the sensor could be confused when one performer crossed in front of another. The sensor’s range wasn’t a problem within the space we were working in.

Thanks to Seeta Indrani, Gene Thai-Low and The Post Factory London.