
ICM – Lesson#2

For the second lesson of ICM, I decided to tackle a music visualiser. After much thought I ended up creating a 3D music visualiser that analyses sound using the Web Audio API and controls a noise function applied to a 3D sphere using Three.js and a GLSL shader. Let’s dive into it!

I would like to thank Ian McEwan, who wrote the shader used in the attached example, which really helped me find my way into this visualiser. And a huge thanks to Marco Guarino (who’s also an ITP student and a great friend).

  • First step – Get the audio analysed:

I start by creating a Web Audio context, which enables me to create an audio element and hook it up to an analyser. The analyser outputs its frequency data into an unsigned integer array (a Uint8Array), which I will cover later on, as it happens in the update function for the whole scene.

// Web Audio stuff
var audio = new Audio();
audio.src = 'marco.mp3';
audio.controls = true;
document.body.appendChild(audio);

var context = new AudioContext();
var analyser = context.createAnalyser();

// One byte (0–255) per frequency bin, refilled on every draw cycle.
var freqByteData = new Uint8Array(analyser.frequencyBinCount);

window.addEventListener('load', function(e) {
  // Our <audio> element will be the audio source.
  var source = context.createMediaElementSource(audio);
  source.connect(analyser);
  analyser.connect(context.destination);
  audio.play();
}, false);
  • Second step – Make the 3D scene:

The 3D scene is actually a lot simpler than it might look at first. I started by rendering a 360° equirectangular image of the terrain in Vue (an environment-modelling suite), at two resolutions: 3840×2160 for the background, and 1280×720 for the projected reflections on the actual sphere.

[Image: plate_1_projection]

This image gets wrapped onto a sphere as a texture using Three.js’s THREE.ImageUtils.loadTexture and, as part of the sphere element, is fed into the shader for displacement based on a Perlin noise function. The last element in the scene is a very simple perspective camera that is orbited around the object by mouse-down events, as in the sketch below.
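To give a sense of how this fits together, here is a minimal sketch of such a scene. This is not the project’s actual code: the filenames ('background_4k.jpg', 'reflection_720.jpg'), the uniform names (tReflection, uAmplitude) and the orbit handler are hypothetical stand-ins, and it assumes the GLSL sources sit in script tags on the page.

// Hypothetical scene sketch: filenames and uniform names are stand-ins.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 2000);
camera.position.z = 100;

var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Background: the high-res plate on a large sphere, viewed from the inside.
var bgTexture = THREE.ImageUtils.loadTexture('background_4k.jpg');
var background = new THREE.Mesh(
  new THREE.SphereGeometry(1000, 60, 40),
  new THREE.MeshBasicMaterial({ map: bgTexture, side: THREE.BackSide })
);
scene.add(background);

// The displaced sphere: the low-res reflection plate and an audio-driven
// amplitude are passed to the noise shader as uniforms.
var uniforms = {
  tReflection: { type: 't', value: THREE.ImageUtils.loadTexture('reflection_720.jpg') },
  uAmplitude: { type: 'f', value: 0.0 } // updated from the audio analysis each frame
};
var sphere = new THREE.Mesh(
  new THREE.SphereGeometry(30, 128, 128),
  new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent
  })
);
scene.add(sphere);

// Simple mouse-down orbit: drag horizontally to circle the sphere.
var isDown = false, theta = 0;
document.addEventListener('mousedown', function() { isDown = true; });
document.addEventListener('mouseup', function() { isDown = false; });
document.addEventListener('mousemove', function(e) {
  if (!isDown) return;
  theta += e.movementX * 0.005;
  camera.position.set(100 * Math.sin(theta), 0, 100 * Math.cos(theta));
  camera.lookAt(scene.position);
});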

  • UPDATE, UPDATE & UPDATE:

The update function plays a major role in every WebGL scene, but here it also serves as the containing function for the audio analysis on each draw cycle.

var aveLevel = 0; // average volume for the current frame

function update() {
  requestAnimationFrame(update);

  // Fill the typed array with the current frequency data.
  analyser.getByteFrequencyData(freqByteData);
  var length = freqByteData.length;

  // Sum every bin's amplitude, then average.
  var sum = 0;
  for (var j = 0; j < length; ++j) {
    sum += freqByteData[j];
  }
  aveLevel = sum / length;
}

By using a for loop to iterate over the frequency data and dividing the sum of each frequency’s amplitude by the array’s length, I determine the average volume, but I can also extract individual values for specific frequency ranges.
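As an example of the latter, here is a hedged sketch of averaging a single band. The averageBand helper is hypothetical, but the bin maths follows the Web Audio API: each FFT bin spans sampleRate / fftSize Hz.

// Hypothetical helper: average only the bins inside a frequency range.
// Each FFT bin covers (context.sampleRate / analyser.fftSize) Hz.
function averageBand(lowHz, highHz) {
  var binHz = context.sampleRate / analyser.fftSize;
  var lowBin = Math.floor(lowHz / binHz);
  var highBin = Math.min(Math.ceil(highHz / binHz), freqByteData.length - 1);
  var sum = 0;
  for (var j = lowBin; j <= highBin; ++j) {
    sum += freqByteData[j];
  }
  return sum / (highBin - lowBin + 1);
}

// e.g. bass energy between 20 Hz and 250 Hz
var bassLevel = averageBand(20, 250);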

[Screenshot of the running visualiser, 21 September 2016]

Lastly, I included a dat.gui interface, both as a means of learning to use it and as a set of controls for the shader’s response to sound, and even the camera’s reaction to sound.
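A minimal sketch of what such a panel could look like; the parameter names here are hypothetical stand-ins for the actual controls.

// Hypothetical control parameters exposed through dat.gui.
var params = {
  noiseStrength: 1.0, // how strongly the shader displaces the sphere
  cameraShake: 0.5    // how much the camera reacts to the audio level
};

var gui = new dat.GUI();
gui.add(params, 'noiseStrength', 0.0, 5.0);
gui.add(params, 'cameraShake', 0.0, 1.0);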

The link to the visualiser’s repo & demo is https://juniorxsound.github.io/music_visualiser/

In the future, I would like to get a better grasp of GLSL and shader development, to gain more confidence in perhaps writing my own shaders for these types of experiences.

Hi, I am Or, a director, developer and artist. My current research interests are sound interaction, computer vision and immersive media development.
