Connecting motion to data
So you want an object or prop to move through space, and you want that movement to affect sound… how many different ways could you do it?
These are the types of questions we ask ourselves when playing with technology to tell a story. BAGEL and I are constantly asking ourselves: what can a certain piece of technology do that makes it unique or different from the capabilities of a human theatre technician? We consider both roles quite important and performative, and feel that despite all the hullabaloo about things like VR, AI and full automation, culturally this is where we are right now: human + machine = potential new experience. But I’m straying off the point (this is the kind of thing that makes us super excited – there is so much to talk about!)
We had the privilege of working with motion capture at ICAT in the Moss Arts Center last summer. This was a great opportunity to track an object in space and translate its movements into sound. As an artist learning about motion capture, the way I describe it is this: you mount a series of video cameras around a room that are programmed to look for a specific thing. That specific thing is a marker, a little plastic ball covered in a reflective grey material that the cameras can pick up. The cameras feed that information to a computer, and then we use Max/MSP, a visual coding language, to tell the computer what to do with that movement information. For an audience member, this resulted in a pile of floating balloons with little grey balls hanging on their ribbons: when you pulled the balloons down to you, the sound of rain fell from the ceiling, getting louder the closer the balloons came to you and fading away up into the sky as you let them go.
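Our actual mapping lived in a Max/MSP patch rather than in text code, but the underlying idea fits in a few lines. Here is a minimal Python sketch of the balloon-to-rain mapping; the function name, the heights and the linear fade are illustrative assumptions, not a transcription of our patch.

```python
# Hypothetical sketch of the balloon-to-rain mapping: the closer the
# marker (on the balloon's ribbon) comes to the listener, the louder
# the rain. All names and ranges here are assumptions for illustration.

def rain_gain(marker_y, ceiling_y=3.0, listener_y=1.2):
    """Map the marker's height (metres) to a 0.0-1.0 gain.

    At listener height the rain is at full volume; at the ceiling it
    has faded to silence.
    """
    span = ceiling_y - listener_y
    # Normalise: 0.0 when the balloon is pulled right down, 1.0 at the ceiling.
    t = (marker_y - listener_y) / span
    t = max(0.0, min(1.0, t))   # clamp, in case the marker drifts out of range
    return 1.0 - t              # invert: closer to the listener = louder

print(rain_gain(1.2))  # pulled all the way down -> 1.0 (full volume)
print(rain_gain(3.0))  # released to the ceiling -> 0.0 (silence)
```

In a patch you would feed the marker's height into this kind of curve every frame and use the result to scale the rain sample's volume.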
This creative process was both hilarious and frustrating. Sometimes the cameras had a hard time reading the markers, or the data wasn’t being received correctly. Sometimes we struggled to find the right instruction to make the computer do what we hoped it would. Just as the physical feel of a balloon affects an audience member in a show, so does it affect the technology. Things like the material the balloon was made of, its colour, the weight of the marker balls and how they were attached to the balloon: all these little physical details add up to make the communication from audience to object to computer to sound successful or not. For all the time we spent fixing or calibrating, there were just as many moments of magic. Sometimes you end up with happy surprises too, like when we tried to get a sound to follow us around the room while we were holding a balloon, but instead it stayed at the opposite end of the room from us. New games and new images pop out of surprises like this.
A CHEAPER, SCALABLE OPTION?
Because not everyone has a motion capture system in their theatre or arts center, we decided to investigate other forms of object tracking technology, which led us to the Pozyx system, a local positioning tech that can theoretically measure an object’s position to within 10 cm. The cameras are replaced with ultra-wideband radio receivers, or anchors, which we can place around the edges of a room. These read a ‘tag’ and can send position, acceleration and orientation data (among others!) to our computer running Max/MSP and Ableton. We’ve had some great initial tests with this and have invested in our own kit. (Big shout out to fanSHEN, a great interactive company and good colleagues who supported us with our tests.) The beauty of this system is that IT’S PORTABLE, which means we can bring our projects to more people and places 😀
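To give a feel for what you can do with that position data once it reaches the computer, here is a hypothetical Python sketch that turns a tag’s coordinates into pan and gain values a Max/MSP or Ableton patch could consume. The room size, listener position and distance roll-off curve are all assumptions for illustration; this is not the Pozyx library’s API, just the kind of mapping you might build on top of it.

```python
import math

# Hypothetical mapping from a tracked tag position to sound controls.
# Pozyx-style systems report coordinates in millimetres; everything
# else here (room width, listener spot, roll-off) is an assumption.

ROOM_W_MM = 6000       # a 6 m wide room (assumed)
LISTENER = (3000, 0)   # listener at the centre of one wall (assumed)

def tag_to_controls(x_mm, y_mm):
    """Return (pan, gain) for a tag at (x_mm, y_mm).

    pan:  -1.0 (hard left) to 1.0 (hard right), across the room's width
    gain: 1.0 right next to the listener, falling off with distance
    """
    # Left/right position across the room, clamped to the stereo field.
    pan = (x_mm / (ROOM_W_MM / 2)) - 1.0
    pan = max(-1.0, min(1.0, pan))
    # Distance from the listener in metres, fed into a simple roll-off.
    dist_m = math.dist((x_mm, y_mm), LISTENER) / 1000.0
    gain = 1.0 / (1.0 + dist_m)
    return pan, gain

print(tag_to_controls(3000, 0))   # tag right at the listener: centred, full gain
print(tag_to_controls(0, 4000))   # far corner: hard left, much quieter
```

In practice these values would be sent on to the patch (e.g. as OSC or MIDI messages) many times a second as the tag moves.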
Whether you’re a die-hard tech geek, a hesitant artist who scratches their head at coding, or anyone in between – stay tuned for updates on our adventures using these tools in (hopefully) new ways! 😉