Forever – ICM Final Project
My ICM final project started the same way many projects do: brainstorming, erasing, writing again, rethinking, and feeling confused. Given my past and current interest in sound, and the fact that I took the ‘Software Synthesis’ class this semester in the Music Technology department at NYU Steinhardt, I was eager to challenge myself in the realms of synthesis, sound, and composition. And so ‘Forever’ was born (only conceptually, of course).
The first step was nailing down the idea: what is it? How does it work (i.e., how does it sound)? And what does it require that I might have to pick up as I build it?
I decided to build a multiplayer web app that uses each user’s GPS location, together with the locations of all other connected users, to generate a collective musical composition. Essentially, I wanted this project to ask questions such as:
- What feels like a good balance between the users’ impact and the server’s impact in generating the composition? How does this balance translate to sound?
- What are components that signify progression in the piece?
- Which data is meaningful with respect to this composition?
And that’s the point where I think about Michel Foucault for a while and start sketching…
Some references that are worth mentioning are:
- Iannis Xenakis – and his ‘stochastic’ compositions
- David Cope – and his writings and compositions which include algorithmic composition and artificial intelligence
- Musique concrète compositions
- Musimathics book
The first thing I realized is that I knew nearly nothing about server-side programming, web sockets, and server-to-client communication, so I started learning socket.io and Express and programming my own server. I started with a very basic Express Node server.
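A basic Express + socket.io server along these lines might look like the sketch below. The port, file layout, and event names (`position`, `users`) are my own illustrative choices, not necessarily the project’s; the `require` calls are wrapped so the snippet also runs standalone when the packages aren’t installed.

```javascript
// Minimal Express + socket.io server sketch (hypothetical port/event names).
// The requires are guarded so this file runs even without the packages;
// in the real project you would require them directly.
let express, http, socketio;
try {
  express = require('express');
  http = require('http');
  socketio = require('socket.io');
} catch (e) { /* packages not installed; skip server startup */ }

// Last known GPS fix for every connected user, keyed by socket id.
const users = {};

// Pure helper: record a user's position.
function updateUser(id, lat, lon) {
  users[id] = { lat, lon };
  return users;
}

if (express) {
  const app = express();
  const server = http.createServer(app);
  const io = socketio(server);

  app.use(express.static('public')); // serve the p5.js client files

  io.on('connection', (socket) => {
    socket.on('position', (pos) => {
      updateUser(socket.id, pos.lat, pos.lon);
      io.emit('users', users); // broadcast all positions to every client
    });
    socket.on('disconnect', () => {
      delete users[socket.id];
      io.emit('users', users);
    });
  });

  server.listen(3000);
}
```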
After building my first Express server, I moved on to sketching some client code that gets the client’s current position and logs it on the page (this position was later transmitted back to the server via web sockets).
After getting these two to work, and with the help of Dror Ayalon, I started sending each client’s GPS position to the server.
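The client side of that exchange can be sketched roughly as follows. The message shape and event name are assumptions for illustration; `navigator` and `io` only exist in the browser, so they are feature-checked to keep the sketch self-contained.

```javascript
// Client-side sketch: read the browser's GPS fix and forward it to the
// server over socket.io. Event name 'position' is an illustrative choice.

// Pure helper: shape the payload sent to the server.
function makePositionMessage(coords) {
  return { lat: coords.latitude, lon: coords.longitude };
}

if (typeof navigator !== 'undefined' && navigator.geolocation) {
  const socket = io(); // assumes the socket.io client script is loaded

  // watchPosition keeps firing as the user moves, unlike getCurrentPosition.
  navigator.geolocation.watchPosition((position) => {
    const msg = makePositionMessage(position.coords);
    console.log(msg.lat, msg.lon); // log the fix on the page/console
    socket.emit('position', msg);  // transmit it back to the server
  });
}
```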
This led to a full week of map projections and Mapbox API integration: getting the map, mapping latitude and longitude to x and y on the canvas, and finally drawing icons for the users themselves on a separate p5.js canvas overlaid on the map.
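The latitude/longitude-to-pixel mapping can be sketched with the standard Web Mercator projection (the projection Mapbox uses). The `zoom`, canvas size, and map-center parameters here are assumptions for illustration; in the real app they would come from the Mapbox map object.

```javascript
// Project a lon/lat pair into Web Mercator "world" coordinates,
// in the range [0, 256 * 2^zoom] on both axes.
function project(lon, lat, zoom) {
  const scale = 256 * Math.pow(2, zoom);
  const x = ((lon + 180) / 360) * scale;
  const latRad = (lat * Math.PI) / 180;
  const y =
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) *
    scale;
  return { x, y };
}

// Convert a user's position to canvas pixels relative to the map center.
function toCanvas(lon, lat, center, zoom, width, height) {
  const p = project(lon, lat, zoom);
  const c = project(center.lon, center.lat, zoom);
  return { x: width / 2 + (p.x - c.x), y: height / 2 + (p.y - c.y) };
}
```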
Musical interpretation of the data:
Once I had the user locations, I divided the screen’s height into 5 zones. My musical scales are stored in 15-slot arrays, so each zone on the map has a range of 3 notes from which it can choose.
This was built in a way that allows scales to be changed simply by changing the input to the changeNote() function, which sets the notes for each user based on their ‘zone’.
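The zone-to-note logic above can be sketched like this. The function name `changeNote` follows the post; the scale values (roughly two octaves of C major as MIDI notes) and the random choice within a zone are illustrative assumptions.

```javascript
// A 15-slot scale array: ~2 octaves of C major in MIDI notes (illustrative).
const scale = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83, 84];

// Map a y position on the canvas to one of 5 zones (0..4).
function zoneFor(y, height) {
  return Math.min(4, Math.floor((y / height) * 5));
}

// Pick one of the 3 notes belonging to the user's zone;
// swapping the scale array in changes the whole piece's harmony.
function changeNote(y, height, scaleArr) {
  const zone = zoneFor(y, height);
  const base = zone * 3; // each zone owns 3 consecutive slots
  const offset = Math.floor(Math.random() * 3);
  return scaleArr[base + offset];
}
```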
From this point on, it was all about building the functionality in a flexible way that let me change things instantly and test them again rapidly. For instance, the styling of the map went through a couple of revisions.
I decided to use p5.js for the drawing due to its flexibility and ease of use. With p5, I draw a second canvas that is used for the cursor and user graphics. This canvas also handles interactions such as cursor control with the mouse (or touch) and the looping feature.
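A minimal sketch of that overlay idea: a transparent p5.js canvas positioned on top of the Mapbox map, redrawing user icons each frame. The `users` object shape (canvas-pixel `x`/`y` per user, as produced by the projection step) and the styling details are assumptions.

```javascript
// p5.js overlay sketch (separate file from the server code).
let users = {}; // filled by socket.io 'users' broadcasts in the real app

function setup() {
  const cnv = createCanvas(windowWidth, windowHeight);
  cnv.position(0, 0);        // sit exactly on top of the map div
  cnv.style('z-index', '1'); // above the map, so it catches mouse/touch
  clear();                   // transparent background lets the map show
}

function draw() {
  clear(); // redraw only the icons each frame
  for (const id in users) {
    // users[id].x / .y are canvas pixels from the projection step
    ellipse(users[id].x, users[id].y, 16, 16);
  }
  ellipse(mouseX, mouseY, 8, 8); // this user's own cursor
}
```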
<<video coming soon>>
The repository for the project containing all code written can be found here
The live link for the website can be found here