Interactive Music – Rethinking the score
For my score realization I originally chose to focus on making generative music driven by ‘unconscious’, or rather ‘subconscious’, interaction. I started researching how I could use browsing history to generate a score based on the user’s choice of online content. But after reviewing the ideas in class, getting paired up, and explaining my idea to a partner, I realized it was still vague to me, and given the nature of the assignment I decided to take a different route with an idea that was clearer to me.
I have created a couple of music visualizers over the past few years, and one thing that always attracted me was the concept of driving visual effects, animations, and other visual occurrences with data generated from audio analysis and/or MIDI.
One thing that always intrigued me was the possibility of ‘reverse engineering’ the audio-reactive approach used to create visualizers into something best described as ‘sonification of graphics’: essentially using graphics to generate audio.
For my realization I chose to focus on a graphical simulation system that generates the score. More specifically, I used a ‘metaball’ simulation to define the behavior of 3D spheres in space, and that same simulation is responsible for generating the score. Since I was aiming for a certain style, the score uses samples played through a granular sampler built with Tone.js. That way, the generation of the score is to some extent pre-determined, but the user, by controlling the simulation, still controls the texture and the turning on and off of the samples.
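As a rough illustration of how a simulation event could toggle granular samples, here is a minimal sketch. The `SampleSlot` class, the sample names, and the round-robin assignment are my assumptions, not the project's actual code; in the browser each slot would wrap a `Tone.GrainPlayer`, which is shown in comments so the control logic stays self-contained and runnable:

```javascript
// Hypothetical sketch: each spawned metaball claims a sample slot and
// toggles its granular player on or off. The Tone.js calls appear only
// as comments; slot names and structure are illustrative assumptions.

class SampleSlot {
  constructor(name) {
    this.name = name;      // e.g. 'padA.wav' – hypothetical sample file
    this.active = false;
    // In the browser this slot would own a granular player, e.g.:
    // this.player = new Tone.GrainPlayer({ url: name, grainSize: 0.2 }).toDestination();
  }
  toggle() {
    this.active = !this.active;
    // this.active ? this.player.start() : this.player.stop();
    return this.active;
  }
}

// Round-robin: the n-th spawned metaball maps to slot n mod slots.length.
function slotForSpawn(slots, spawnIndex) {
  return slots[spawnIndex % slots.length];
}

const slots = ['padA.wav', 'padB.wav', 'texture.wav'].map(n => new SampleSlot(n));
slotForSpawn(slots, 0).toggle(); // first spawn turns the first sample on
slotForSpawn(slots, 3).toggle(); // fourth spawn wraps around and turns it off
```

The round-robin mapping is just one possible policy; the real project could equally tie a sample to each individual metaball for its lifetime.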
The result is a granular-sampler-based composition where the user gets to ‘spawn’ metaballs into the simulation, which in turn triggers different samples in the composition. The position of each metaball in space changes the panning of its sound in the mix.
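One straightforward way to realize the position-to-pan mapping is a linear mapping from the metaball's x coordinate into the [-1, 1] range that Tone.js panners expect. The coordinate bounds below are assumptions about the simulation's space, not values from the project:

```javascript
// Hypothetical mapping from a metaball's x position to a stereo pan value.
// Assumes the simulation's x axis spans [xMin, xMax]; the output is in
// [-1, 1] (hard left to hard right), clamped at the edges.

function panForX(x, xMin = -10, xMax = 10) {
  const clamped = Math.min(Math.max(x, xMin), xMax);  // stay inside bounds
  return ((clamped - xMin) / (xMax - xMin)) * 2 - 1;  // normalize to [-1, 1]
}

// In the browser this value would drive a Tone.Panner, e.g.:
// panner.pan.value = panForX(metaball.position.x);

panForX(0);   // center of the space → center pan (0)
panForX(-10); // left edge → hard left (-1)
```

A linear map keeps the spatial metaphor simple; a nonlinear curve could exaggerate movement near the edges if that suited the composition better.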