
Interactive Music – Midterm

For the midterm project in Interactive Music I decided to continue developing a project I started working on this semester with Scott Reitherman. The project grew out of a set of meetings in which we discussed stochastic approaches to music creation and our shared interest in rethinking how a composition could be created with intuitive methods and tools, enabling essentially anybody, regardless of prior musical knowledge, to compose.

As we started laying the foundations for the project with a virtual reality HMD in mind, I decided to focus on building a ‘demo’ scene that could emphasize, or rather outline, our main objectives for the experience while it is still a work in progress.

Realizing the gesture

One of the main aspects of the assignment was the use of gesture, and since I was working with a virtual reality headset, I needed a ‘3D agent’ to bridge the virtual and the physical. I chose the Oculus Touch controllers, which convey a hands-like feeling during the experience.
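
As a rough illustration of how the Touch controller can drive the interaction, here is a minimal sketch assuming a Unity scene with the Oculus Integration package; the OVRInput call, the rightHandAnchor transform, and the spherePrefab field are my assumptions, not details from the project.

```csharp
using UnityEngine;

// Hypothetical sketch: spawn a sphere from the right Touch controller's
// position when its index trigger is pressed (assumes the Oculus
// Integration package's OVRInput API and an OVRCameraRig hand anchor).
public class TouchSpawner : MonoBehaviour
{
    public GameObject spherePrefab;      // illustrative prefab reference
    public Transform rightHandAnchor;    // e.g. the RightHandAnchor of an OVRCameraRig

    void Update()
    {
        // Index trigger press on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
        {
            Instantiate(spherePrefab, rightHandAnchor.position, rightHandAnchor.rotation);
        }
    }
}
```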

The experience

The experience consists of a virtual world in which you spawn spheres that trigger sounds when they collide with the floor. The main focus, from an interaction standpoint, is your interaction with the physics engine that controls the spheres’ movement in space after they are created. To demonstrate that point, I created a physical model in which the bouncing spheres maintain their energy, or simply put, they bounce forever.
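
The post doesn’t include code, but as a minimal sketch of this ‘bounce forever’ behaviour, assuming the scene is built in Unity, a small component like the one below can restore each sphere’s speed after every bounce (the component and field names are mine). A physic material with bounciness set to 1 and no friction is another way to approximate the same behaviour.

```csharp
using UnityEngine;

// Minimal sketch (assuming a Unity scene): keeps a bouncing sphere at a
// constant energy level by restoring its speed after every bounce.
[RequireComponent(typeof(Rigidbody))]
public class PerpetualBounce : MonoBehaviour
{
    public float targetSpeed = 5f;   // speed the sphere should keep after each bounce

    Rigidbody body;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
    }

    void OnCollisionExit(Collision collision)
    {
        // Re-normalize the velocity so solver losses don't slowly damp the bounce.
        if (body.velocity.sqrMagnitude > 0f)
            body.velocity = body.velocity.normalized * targetSpeed;
    }
}
```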

In the examples above, spheres are spawned at the mouse’s X and Y coordinates on the screen, at a fixed distance from the player. After getting that part working, I moved on to the VR integration.
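
A minimal sketch of that spawning logic, again assuming Unity and with illustrative names (spherePrefab, spawnDistance), might look like this:

```csharp
using UnityEngine;

// Minimal sketch (assuming Unity): spawn a sphere prefab at the mouse's
// screen X/Y, projected a fixed distance in front of the player's camera.
public class MouseSpawner : MonoBehaviour
{
    public GameObject spherePrefab;     // illustrative prefab reference
    public float spawnDistance = 5f;    // fixed distance from the player

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Vector3 screenPos = Input.mousePosition;
            screenPos.z = spawnDistance;                          // depth in front of the camera
            Vector3 worldPos = Camera.main.ScreenToWorldPoint(screenPos);
            Instantiate(spherePrefab, worldPos, Quaternion.identity);
        }
    }
}
```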

The demo above shows ball spawning as well as the ability to ‘pause’ the balls in mid-air (which in turn pauses their sound).
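
One way to implement that pause, assuming Unity’s Rigidbody and AudioSource components (the stored-velocity approach here is my assumption, not necessarily how the demo does it):

```csharp
using UnityEngine;

// Minimal sketch (assuming Unity): freezes a ball in mid-air and pauses its
// audio; calling Resume() restores the saved velocity and the sound.
[RequireComponent(typeof(Rigidbody), typeof(AudioSource))]
public class PausableBall : MonoBehaviour
{
    Rigidbody body;
    AudioSource audioSource;
    Vector3 savedVelocity;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        audioSource = GetComponent<AudioSource>();
    }

    public void Pause()
    {
        savedVelocity = body.velocity;   // remember the motion so it can resume
        body.isKinematic = true;         // freeze the ball in place
        audioSource.Pause();             // silence its sound while frozen
    }

    public void Resume()
    {
        body.isKinematic = false;
        body.velocity = savedVelocity;
        audioSource.UnPause();
    }
}
```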

Desktop versions of the demo can be found here for both Win/Mac.

What next?

  1. Adding interactions with the physical model that let the user further understand the connection between the physical characteristics of the world and the composition they create (e.g. lower gravitational force will make repetitions less frequent, resulting in a slower overall composition); a minimal sketch of such a gravity control follows this list.
  2. A sound-selection GUI that lets the user both change and audition different musical components.
  3. Figuring out a way to deal with non-rhythmic sounds (e.g. drones, pads, ambient components).
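
As referenced in item 1, a minimal sketch of such a gravity control, again assuming Unity, could simply scale the global gravity vector (the range and field name are illustrative):

```csharp
using UnityEngine;

// Minimal sketch (assuming Unity): expose world gravity as a simple control.
// Weaker gravity means more time between bounces, i.e. a slower composition.
public class GravityControl : MonoBehaviour
{
    [Range(0.1f, 2f)]
    public float gravityScale = 1f;   // 1 = normal gravity, <1 = slower bounces

    void Update()
    {
        // Scale Earth-like gravity by the chosen factor.
        Physics.gravity = new Vector3(0f, -9.81f * gravityScale, 0f);
    }
}
```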
