The basis for this song was a music score generated from a realtime video feed. I wrote a Processing sketch that translated the video images into MIDI notes. A second Processing sketch used the final song as input for the particle-driven visualisation.
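The original sketch is not shown here, so the following is only a minimal sketch of the video-to-MIDI idea: sample a pixel's brightness and quantize it onto a musical scale to pick a MIDI note number. The scale choice, note range, and the `brightnessToNote` helper are assumptions for illustration, not the actual logic of the downloadable code.

```java
public class VideoToMidi {
    // Minor pentatonic intervals relative to a root note (an assumption;
    // any scale could be substituted).
    static final int[] SCALE = {0, 3, 5, 7, 10};

    // Map a brightness value (0..255) to a MIDI note number, quantized
    // to the scale over a two-octave range above the given root note.
    static int brightnessToNote(int brightness, int root) {
        int steps = SCALE.length * 2;        // two octaves of scale steps
        int step = brightness * steps / 256; // 0..steps-1
        int octave = step / SCALE.length;
        return root + octave * 12 + SCALE[step % SCALE.length];
    }

    public static void main(String[] args) {
        // Darkest pixel maps to the root note; brighter pixels climb the scale.
        System.out.println(brightnessToNote(0, 48));
        System.out.println(brightnessToNote(128, 48));
        System.out.println(brightnessToNote(255, 48));
    }
}
```

In a live Processing sketch, a function like this would be called per frame on sampled pixels, with the resulting note numbers sent out as MIDI events to Ableton Live.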
The live-video-feed-to-MIDI Processing code (v1.5.1) and the Ableton Live (v9.0) setup can be downloaded here:
Audio-reactive sphere. The audio is split into frequency bands, which control the lengths of the tubes.
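The band-to-length mapping could look something like the sketch below: each band's amplitude sets a target tube length, with simple smoothing so the tubes don't jitter. The length range, smoothing scheme, and the `update` helper are assumptions for illustration; the real sketch would read the band amplitudes from an FFT of the live audio.

```java
public class AudioSphere {
    static double[] lengths; // one tube length per frequency band

    // Map each band amplitude (0..1) to a tube length between minLen and
    // maxLen, easing the current length toward the target each frame.
    static double[] update(double[] bands, double minLen, double maxLen,
                           double smoothing) {
        if (lengths == null) lengths = new double[bands.length];
        for (int i = 0; i < bands.length; i++) {
            double target = minLen + bands[i] * (maxLen - minLen);
            lengths[i] += (target - lengths[i]) * smoothing;
        }
        return lengths;
    }

    public static void main(String[] args) {
        // With smoothing = 1.0 the tubes jump straight to their targets.
        double[] out = update(new double[]{0.0, 0.5, 1.0}, 10, 100, 1.0);
        for (double len : out) System.out.println(len);
    }
}
```

A lower smoothing value (e.g. 0.2) trades responsiveness for a calmer, more organic motion.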
My physical mushroom object, approx. 1 m tall, recorded 40 times and translated into the virtual realtime environment.
The ring moves based on realtime audio input. The colors and the size of the shape are also controlled in realtime via TouchOSC on an iPad.
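One way the two realtime inputs could combine is sketched below: the audio level deforms the ring's vertices, while a normalized TouchOSC fader value (0..1) scales the base radius. The lobe count, scaling factors, and the `vertexRadius` helper are all assumptions for illustration, not the piece's actual parameter mapping.

```java
public class ReactiveRing {
    // Radius of ring vertex i of n: the OSC fader scales the base size,
    // and the audio level pushes a lobed deformation in and out.
    static double vertexRadius(int i, int n, double baseRadius,
                               double oscSize, double audioLevel) {
        double scaled = baseRadius * (0.5 + oscSize);      // fader scales size
        double lobes = Math.sin(2 * Math.PI * i / n * 3);  // 3-lobed deformation
        return scaled + audioLevel * lobes * scaled * 0.3; // audio drives motion
    }

    public static void main(String[] args) {
        // Silent input: the ring is a plain circle at the fader-scaled radius.
        System.out.println(vertexRadius(0, 8, 100, 0.5, 0.0));
        System.out.println(vertexRadius(2, 8, 100, 0.5, 0.0));
    }
}
```

In Processing, each frame would recompute the vertex radii from the latest audio level and the most recent OSC messages (e.g. via the oscP5 library), then draw the ring as a closed vertex loop.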