Thorne Brandt

Video Drums

Animation Driven by Acoustic Drum Triggers

I created a toolset for live animations to be triggered by any MIDI source.
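
As a rough sketch of the triggering idea (in Python with the mido library, purely for illustration; the toolset itself isn't written this way, and trigger_animation is a hypothetical stand-in for the animation layer):

import mido

# Hypothetical callback standing in for the real animation layer.
def trigger_animation(note, velocity):
    print("note %d hit at velocity %d" % (note, velocity))

# Open the default MIDI input (mido needs a backend such as python-rtmidi)
# and fire an animation for every note-on message, whether it comes from
# a keyboard, a sequencer, or an acoustic drum trigger.
with mido.open_input() as port:
    for msg in port:
        if msg.type == 'note_on' and msg.velocity > 0:
            trigger_animation(msg.note, msg.velocity)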

I found this to be particularly engaging when used with acoustic drum triggers. I used an invitation to perform at Brain Frame as an opportunity to showcase this project.


More experiments were presented at Roots and Culture. Something I learned from performing this work as a drummer is that the animation leaned too heavily on every hit of the percussion. I'm a sloppy drummer, so the sublime synesthesia I was aiming for never rose above a party trick. I believe the solution is to let the drums trigger accents while the main content responds to the melodies and general tempo of the song.


I also learned that I needed to build an interface for creating these projects. Currently, composing the "songs" takes place within a JSON file, which is a start, but a "level editor" is in production.

{
    "nextPartCC" : "51",
    "nextPartChannel" : "1",
    "defaultMidiChannel" : "2",
    "parts":[
        {
            "id" : "0",
            "name" : "sample",
            "objs" : [
                {
                    "keyStroke": "1",
                    "midiNote": "36",
                    "name": "pizza/flying_pizza.png",
                    "direction" : "right",
                    "scale" : ".8",
                    "y" : ".2",
                    "scrollX" : "-40",
                }
            ],
            "particles" : [
                {
                    "name" : "drumsticks/exploding_drumstick.png",
                    "keyStroke" : "2",
                    "type" : "1",
                    "midiNote" : "35",
                    "numParticles" : "50",
                    "numInstances" : "12",
                    "z" : "10"
                }
            ]
        }
    ]
}

I have set up the project so that assets can easily be swapped in. The parts of a song and the types of animation are instantiated by parsing the JSON. Each entry in the JSON creates a pool of off-screen instantiated prefabs that wait their turn to be triggered by their respective MIDI or keyboard input.
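
To illustrate the pooling pattern, here is a hedged sketch in Python (not the project's engine code; AnimationPool and song.json are names I've made up for this example):

import json
from collections import deque

class AnimationPool:
    """Pre-instantiates a fixed number of animation objects for one JSON
    entry, so a drum hit never pays an instantiation cost mid-performance."""
    def __init__(self, entry, size=8):
        self.midi_note = int(entry["midiNote"])
        # In the real project each idle instance would be an off-screen
        # prefab; here it is just a copy of the entry's parameters.
        self.idle = deque(dict(entry) for _ in range(size))

    def trigger(self):
        # Take the next idle instance, "play" it, then recycle it.
        instance = self.idle.popleft()
        print("playing %s for note %d" % (instance["name"], self.midi_note))
        self.idle.append(instance)

with open("song.json") as f:
    song = json.load(f)

# One pool per entry, keyed by MIDI note for a constant-time lookup per hit.
pools = {}
for part in song["parts"]:
    for entry in part["objs"] + part["particles"]:
        pool = AnimationPool(entry)
        pools[pool.midi_note] = pool

pools[36].trigger()   # e.g. a kick (note 36) fires the flying pizza sprite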

If you're interested in more involved customization of MIDI triggers for your own audiovisual performances and artwork, I am available for consultation. Please contact me.