Over the course of 2014, I was part of a team that developed a multi-touch musical instrument for a four-story interactive cube. The instrument was part of a Red Bull at Night project, led by Heather Shaw and Vita Motus, that transformed a downtown Los Angeles rooftop into a larger-than-life multimedia experience inspired by the exponential growth of technology and the creative explorations of futurist theories.

Using the musical programming language ChucK, we created a custom software application that received 80 to 100 individual touch points as OSC data. The data was filtered by location and fed to four separate custom Reaktor instruments. Each instrument was designed to give people a meaningful, expressive musical interaction without requiring any previous musical training. At the same time, the instruments needed enough complexity to keep users engaged, while keeping the connection between a user's action and the resulting sound clear.
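The filtering stage described above can be sketched in Python (the original application was written in ChucK; the function names, zone boundaries, and four-zone split here are illustrative assumptions, not the production code):

```python
# Illustrative sketch of the location-based routing step: incoming touch
# points are bucketed into one of four zones, one per Reaktor instrument.
# The 4.0-unit face width and zone layout are hypothetical.

def route_touch(x, width=4.0):
    """Map a touch point's x position on the cube face to one of four
    instrument zones, returning the zone index (0-3)."""
    clamped = min(max(x, 0.0), width)       # clamp into the face's range
    return min(int(clamped / (width / 4)), 3)  # split into quarters

def handle_touch_points(points):
    """Group incoming (id, x, y) touch points by instrument zone, as the
    filter stage would before forwarding each group over OSC."""
    zones = {0: [], 1: [], 2: [], 3: []}
    for touch_id, x, y in points:
        zones[route_touch(x)].append((touch_id, x, y))
    return zones
```

In practice each zone's grouped points would then be re-sent as OSC messages to the corresponding Reaktor instrument.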

To make all of this possible, we worked with Meyer Sound and John Baffa to build a system of more than 20 speakers around the cube. We used the location of each user to send the synthesized sound back to their exact position. This allowed us to create a variety of instruments, including a sequencer that would "spin" the sequence around the cube, activating a sound whenever it encountered a user's outstretched hand.
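The spinning sequencer's core idea, a playhead that rotates around the cube and fires when it passes a hand, can be sketched as follows (a minimal Python illustration; the step count, revolution period, and function names are assumptions, not the actual implementation):

```python
import math

def spin_step(t, period=8.0, steps=16):
    """Which of `steps` positions around the cube the rotating playhead
    occupies at time t (seconds), completing one revolution per `period`."""
    phase = (t % period) / period            # 0..1 fraction of a revolution
    return int(phase * steps) % steps

def triggered_hands(t, hand_angles, period=8.0, steps=16):
    """Return the hands (angles in radians around the cube) whose step
    position matches the playhead's at time t, i.e. the touches whose
    sound should fire as the sequence spins past them."""
    playhead = spin_step(t, period, steps)
    two_pi = 2 * math.pi
    def step_of(angle):
        return int((angle % two_pi) / two_pi * steps) % steps
    return [a for a in hand_angles if step_of(a) == playhead]
```

With per-user locations already known from the touch data, the triggered sound could then be routed to whichever of the speakers sits nearest that angle.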