ITP Blog

My journey in making things

Sound Objects – Final

Towards the end of this semester I had the opportunity to mix two of my favorite classes into a single final: Code Lab 1, which was about programming for the Unity game engine, and Interactive Music, which, as the name suggests, was about interactive music. This was also a great opportunity to collaborate with Scott Reitherman, as we had been talking for a while about a continuation piece for his Ambient Machine project and had been looking for a time to create its 3D virtual reality ‘big brother’. This was the birth of Sound Objects VR.

At its core, Sound Objects is a VR composition app that lets you compose music by augmenting the physics of objects in space. The music is generated by object collisions, while the physics determine repetition and speed.

Prototyping the idea

We started by creating a really simple 3D world that lets the user spawn new sound spheres and bounce them continuously to create musical patterns. This became our first working demo, so I tested it in both Interactive Music and Code Lab, and was surprised to find that people found the interaction very playful and mainly commented on additional experience elements, such as scenery, effects, and composition changes. One of the main things that hit me while demoing at the midterm was the power of spatial audio as a mixing tool: instead of having to mix things for users, if they get to walk around the sound-emitting objects, they intuitively mix the piece themselves.

Beyond the midterm

After the midterm, the main goals were:

  1. Work on the world (scene, graphics, effects)
  2. Work on the audio elements and the compositional aspect of the experience
  3. Implement the scene using a VR headset
  4. Figure out and refine the interaction

We started by changing the world: our midterm environment was essentially a gray sandbox, so we had to create the environment from scratch. After some brainstorming and user feedback, we decided to go with a desert scene, in which you are surrounded by sand and mountains. This works well because the environment is familiar (i.e. it’s not beyond conceived reality), yet it is very peaceful and minimal, letting the composition act as the main thing. We designed the terrain with Unity’s terrain tools and worked with E-on Vue to create specific mountain geometries. We also used Keijiro’s amazing HexBokeh shader to add some depth of field to the scene.

Alongside getting the environment to work well, we continued to develop the sounds for the experience, and even developed a day-to-night transition, which we will implement in the future as part of a story arc in the experience. The sounds all get loaded into a main static dictionary, which is shared between all the sound objects in order to play clips. This approach also reduces adding new sounds to a single call to the buildSoundList method.

Another realization we had along the way was that we wanted to be able to control the properties of objects that share the same sound. For this, we added a SoundProps class, which uses a structure similar to SoundLists and essentially stores properties that are then used by the objects at a later stage.
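
Since the actual Unity C# scripts aren’t embedded in this post, here is a minimal sketch of the structure in plain C++; buildSoundList and SoundProps are the names from our project, while everything else (including the fields) is an assumption:

```cpp
#include <map>
#include <string>
#include <vector>

// Properties shared by every object that plays the same sound
// (the fields in our actual SoundProps class may differ).
struct SoundProps {
    float volume;
    float pitch;
};

// One static dictionary shared by all sound objects, keyed by sound name,
// plus a parallel map for the per-sound properties.
static std::map<std::string, std::vector<std::string>> soundLists;
static std::map<std::string, SoundProps> soundProps;

// Adding a new sound to the experience reduces to a single call.
void buildSoundList(const std::string& name,
                    const std::vector<std::string>& clips,
                    const SoundProps& props) {
    soundLists[name] = clips;
    soundProps[name] = props;
}

int main() {
    buildSoundList("bounce_low", {"bounce_low_a.wav", "bounce_low_b.wav"},
                   {1.0f, 1.0f});
    // On collision, a sound object looks up its clips and properties by name.
}
```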

With the sounds in place, we were also working on bringing this world to a VR headset. Initially we wanted to go with the Vive, but since we had access to more Rifts at ITP, we used a Rift alongside the Oculus Touch controllers as hands. After some time learning the API, one thing we had to tackle right away was being able to walk in VR. We decided to use the joystick on the left Touch controller, but the Oculus code only provided a method that requires you to calibrate the forward vector every time you run your game, so it was time to hack. To fix that, I added a public declaration in the OVRPlayerController script for the right-eye camera, and used that camera to create the forward vector for the joystick; that way, when you rotate your head, you also change the joystick controls. The full script can be found here.
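
The actual fix is a small change inside OVRPlayerController (Unity C#), but the underlying idea is simple vector math, sketched here in plain C++ with a hypothetical function name: flatten the eye camera’s forward vector onto the ground plane and use the result as the joystick’s forward direction.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Project the camera's forward vector onto the ground plane (y = 0)
// and normalize it; this becomes the joystick's "forward", so turning
// your head re-orients the walking controls.
Vec3 joystickForward(Vec3 camForward) {
    Vec3 flat{camForward.x, 0.0f, camForward.z};
    float len = std::sqrt(flat.x * flat.x + flat.z * flat.z);
    if (len > 1e-6f) { flat.x /= len; flat.z /= len; }
    return flat;
}

int main() {
    // A camera pitched slightly down still yields a level walking direction.
    Vec3 f = joystickForward({0.5f, -0.4f, 0.7f});
    std::printf("forward: %.2f %.2f %.2f\n", f.x, f.y, f.z);
}
```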

After testing many different interaction approaches, we decided the sound objects won’t be spawned, but placed in trays that have similar compositional qualities (i.e. they work well together), and the user navigates the space, creating one big composition made of three spatial areas of smaller compositions.

Demo, demo, demo, demo

Here is a video showing a demo we made in the Interactive Music final class.

Here is a short making-of video showing some of the aforementioned stages.

Next up

We would like to keep working on Sound Objects and tackle the following:

  • More compositional elements
  • Audio-reactive scenery
  • ‘Arc’ story for the experience that changes over time
  • Effects for raycasting balls and for ball groups that share the same sound

I would also like to thank Matt Parker and Yotam Mann for their guidance, help, and the knowledge each course provided, and in turn the way it shaped the project – THANK YOU

Homemade Hardware – Final


For my Homemade Hardware final, I decided to continue pursuing the keyboard design. The keyboard itself is highly influenced by the Roli Seaboard and its approach to multidimensional MIDI controllers, but rather than an expensive, software-specific solution, I wanted to make a ‘cheap-as-possible’ multidimensional MIDI keyboard controller.

The Roli Seaboard (small version)

Prototyping

I started prototyping how the keyboard would actually function, and ended up using a dual-sensor setup for each key: pressure is determined by an FSR (force-sensing resistor) sitting at the bottom of each key, and your finger’s position is determined by a soft-pot at the top.
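
A minimal Arduino sketch for reading one key on the breadboard looks something like this (the pin choices are assumptions):

```cpp
// One key, two sensors: FSR for pressure, soft-pot for finger position.
const int FSR_PIN = A0;  // force-sensing resistor, bottom of the key
const int POT_PIN = A1;  // soft-pot, top of the key

void setup() {
  Serial.begin(9600);
}

void loop() {
  int pressure = analogRead(FSR_PIN);  // 0-1023, harder press = higher
  int position = analogRead(POT_PIN);  // 0-1023 along the key's length
  Serial.print(pressure);
  Serial.print(',');
  Serial.println(position);
  delay(10);
}
```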

Once everything was working on the breadboard, I moved to Eagle to start designing the board that would read the sensors and send them over serial to the Raspberry Pi, which would function as the synth, turning the data into sound and/or MIDI commands sent to the computer.

Schematics

Bill of materials:

  1. ATmega328 microcontroller
  2. 2x 4051 multiplexers
  3. 16 MHz resonator
  4. Resistors, capacitors, and header pins

I created two boards: a through-hole prototype and an SMD final. I started by laying down the parts I would need and positioning them. The two boards are nearly identical; only the parts (symbols) differ to match the different PCB techniques.

Board schematics

After making the through-hole version, I wasn’t able to find a 16-channel multiplexer in a surface-mount package, so I decided to use two 8-channel multiplexers chained to the same select pins coming from the microcontroller.
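
Reading the chained multiplexers then looks roughly like this (a sketch, with assumed pin numbers): both 4051s get the same three select bits, and each one feeds its own analog input.

```cpp
// Two 8-channel 4051 muxes share the select lines S0-S2; one carries
// the FSRs, the other the soft-pots. Pin numbers are assumptions.
const int SEL_PINS[3] = {2, 3, 4};
const int MUX_FSR = A0;
const int MUX_POT = A1;

void setup() {
  for (int i = 0; i < 3; i++) pinMode(SEL_PINS[i], OUTPUT);
  Serial.begin(9600);
}

void loop() {
  for (int ch = 0; ch < 8; ch++) {
    // Both muxes switch to the same channel at once.
    for (int b = 0; b < 3; b++) digitalWrite(SEL_PINS[b], (ch >> b) & 1);
    delayMicroseconds(50);  // let the mux output settle
    int pressure = analogRead(MUX_FSR);
    int position = analogRead(MUX_POT);
    Serial.print(ch);
    Serial.print(':');
    Serial.print(pressure);
    Serial.print(',');
    Serial.println(position);
  }
}
```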


Left: through hole board design | Right: surface mount board design

Fabrication

I started with toner transfer, which went surprisingly well. My times were:

  1. Run the board with the vinyl print through the laminator 5 times
  2. Iron each side of the board for 4 minutes, keeping constant movement (I listened to dub music, which really helped set the down-tempo, loopy mood for ironing)

After the toner transfer was done, I acid-etched the boards, which took about 25 minutes for 3 boards.

Assembly

After cleaning the remaining toner off the boards, I started placing the parts I would need to solder and created a solder stencil. One thing that really helped smooth the solder stencil process was getting rid of the unused pins on the ATmega328, which was quite straightforward in Illustrator.

Solder stencil after some Illustrator work

The settings I used for the laser cutter:

  1. Raster mode
  2. Speed: 10
  3. Power: 15

After some more dub music and quality time with the pick-and-place machine, I was able to get 4 boards soldered and reflowed. Out of the 4 boards, I got 2 working ones, which is A LOT compared to my previous ratio when making boards with the Othermill.

Programming was a breeze thanks to this nifty little thing, and I was able to burn the bootloader and upload my sketch in a matter of minutes. Here is the test app I uploaded:
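
(The original sketch isn’t embedded in this post; it was essentially a minimal “is the board alive” test along these lines:)

```cpp
// Minimal test: if this prints, the ATmega328, the resonator, and the
// serial connection on the homemade board all work.
void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println("keyboard board says hi");
  delay(500);
}
```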

After beep-testing the boards, I drilled and hand-soldered all the header pins in place and tested my board with a second Arduino that reads the serial out from the board, and IT WORKS!

Prototype board

Final board

Special thanks to Shir David for help with shooting.

Up ahead

I am currently working on the fabrication aspects of the keyboard, such as the enclosure, soft key molding, and general design, and would like to continue developing this into a functional ‘multidimensional‘ keyboard.

Homemade Hardware – Week 08 Board

For this week’s assignment we had to make our very own acid-etched SMD board. Since we’re getting closer and closer to the final, I decided it would be a good idea to start realizing the final project.

The project

I decided to try to build a two-dimensional MIDI keyboard, yes, much like the ROLI Seaboard, but different. To start, I realized this project would depend on my ability to plan one key correctly and then scale up to the full keyboard, so I started sketching how just one key module would look, work, and function.

I started prototyping the key and decided to use two sensors per key:

  1. A linear variable resistor (for your finger’s Y position on the key)
  2. An FSR, to sense how hard you’re pressing down on the key

After getting it to work on the Arduino, I went on to make the actual schematics using the ATTiny85 as my MCU, attaching two status LEDs that indicate how hard you press each of the sensors.
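
In sketch form, the status LEDs just mirror the analog readings; pin assignments below are assumptions (on the ATTiny85, digital pins 0 and 1 support PWM):

```cpp
// Two sensors in, two status LEDs out: brightness follows how hard
// each sensor is pressed.
const int FSR_PIN = A1;  // pressure
const int POT_PIN = A2;  // finger position
const int LED_FSR = 0;   // PWM-capable on the ATTiny85
const int LED_POT = 1;   // PWM-capable on the ATTiny85

void setup() {
  pinMode(LED_FSR, OUTPUT);
  pinMode(LED_POT, OUTPUT);
}

void loop() {
  analogWrite(LED_FSR, analogRead(FSR_PIN) / 4);  // 0-1023 -> 0-255
  analogWrite(LED_POT, analogRead(POT_PIN) / 4);
}
```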

I took the time to fully brush up my understanding of Eagle’s wireless nets, so my designs can be modular and I don’t have to decipher where lines are going once the project becomes bigger. For the MCU, I finished the ATTiny85 design we started in class and adjusted its size a bit so it fits. After finishing the schematics, I made the board and tried to tidy it up so it’s small, but not too small (the taste of a bad experience with small boards on the Othermill still remains).

Next, I went on to printing and making vinyl toner-transfer sheets in order to start my board.

The toner transfer went pretty well; out of the 8 boards I designed, only 4 came out right, and the rest had at least one issue somewhere. Given that, it made no sense placing the problematic boards in the acid bath (that would just take time for nothing), so I cut them off using the band saw and went over to acid-etch, with the help of the wonderful David Lockard.

After about 30 minutes in the acid, I took the board out, let it cool in water, and started placing the parts.

Drilling holes
One of the final boards

Interactive Music – Midterm

For the midterm project in Interactive Music, I decided to continue developing a project I started working on this semester with Scott Reitherman. The project grew out of a set of meetings in which we discussed stochastic approaches to music creation and our shared interest in rethinking how a composition could be created using intuitive methods and tools, enabling essentially anybody (regardless of prior musical knowledge, that is) to compose intuitively.

As we started laying the foundations for the project with a virtual reality HMD in mind, I decided to focus on building a ‘demo’ scene that could emphasize, or rather outline, our main objectives for the experience while it is still a work in progress.

Realizing the gesture

One of the main aspects of the assignment was the use of gesture, and since I was using a virtual reality headset, I would need a ‘3D agent’ to bridge the virtual and the physical. I decided to use the Oculus Touch controllers, which convey a hands-like feeling during the experience.

The experience

The experience consists of a virtual world in which you get to spawn spheres that trigger sounds on collision with the floor. The main focus, from an interaction standpoint, is your interaction with the physics engine that controls the spheres’ movement in space after they are created. To demonstrate that point, I created a physical model in which the bouncing spheres maintain their energy, or simply put, they bounce forever.

In the above examples, spheres are spawned at the mouse’s X and Y screen coordinates, at a fixed distance from the player. After realizing that part, I went on to the VR integration.

The demo above demonstrates ball spawning and also the ability to ‘pause’ the balls in mid-air (which in turn pauses their sound).

A link to desktop versions of the demo can be found here, for both Win/Mac.

What next?

  1. Adding interactions with the physical model, enabling users to further understand the connection between the physical characteristics of the world and the composition they create (e.g. a weaker gravitational force makes repetitions less frequent, which results in a slower overall composition).
  2. A sound selection GUI that enables the user to both change and audition different musical components.
  3. Figuring out a way to deal with non-rhythmic sounds (e.g. drones, pads, ambient components).

Homemade Hardware – Scale board

For this week we had to start prototyping week 8’s board. We had to decide on a concept and start prototyping the board and components for it. After some thought, I decided to go for either a musical instrument or a weight scale. Given the time scope of the assignment, I decided to go with the weight scale board.


Load cells

Since I decided to go with a weight scale board, I started researching load cells with Arduinos. I used SparkFun’s guide to load cells and amplifiers and this Instructable to get comfortable with the idea of using a load cell. I ordered an HX711 amplifier breakout board and a 100-gram load cell. As the board came without header pins, I started by soldering header pins onto the breakout board.


After soldering the HX711 amp breakout board, I downloaded the HX711 library for the Arduino, connected the load cell to the amp board and the amp board to the Arduino, and started measuring. Unfortunately, the load cell didn’t come with threads (screws), and it has to be mounted before it can be used accurately, because the measurement hits the amplifier (I ordered 3mm threads so I can mount the scale between two laser-cut panels I’ll cut).
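
For reference, reading the cell through the amp is only a few lines; this sketch assumes the widely used bogde/HX711 Arduino library, assumed wiring pins, and a placeholder calibration factor you would find with known weights:

```cpp
#include "HX711.h"  // bogde/HX711 Arduino library

const int DOUT_PIN = 3;  // HX711 data out (wiring is an assumption)
const int SCK_PIN = 2;   // HX711 clock

HX711 scale;

void setup() {
  Serial.begin(9600);
  scale.begin(DOUT_PIN, SCK_PIN);
  scale.set_scale(420.0f);  // placeholder calibration factor
  scale.tare();             // zero with nothing on the cell
}

void loop() {
  // Average 5 readings, converted by the calibration factor.
  Serial.println(scale.get_units(5), 1);
  delay(250);
}
```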

Designing a board

To get started with designing the board, I found SparkFun’s GitHub repo with Eagle schematics for the HX711 breakout board, so I downloaded it and started deleting the parts I don’t really need for my board. I also inserted an ATTiny85 into the board and got a basic header layout for power and ground, and for the load cell connections.

Initial board design schematic in Eagle

Homemade Hardware – The story of the R

For this week’s class we had to make an LED letter controlled by a sensor. After some class debate, I got the letter R and started working on the design. This blog post recaps my whole process in creating my LED letter. To make it short and concise, here is a rundown of everything that happened over the last week:

  • I cut my finger 3 times
  • I broke a 1/32″ drill bit
  • I drilled 3 R’s, soldered them entirely, and only then realized I had a problem

When I came out of class, the first thing I did was cut last week’s board (kind of getting my feet wet before cutting this week’s assignment). This process went surprisingly smoothly (which says nothing about what was to come next).

Design Process

I started by designing the R. I wanted something classic, but also something that could fit all the components nicely without having to struggle to get everything on top of it.

One important thing I realized during the making process is that it’s probably better (at least for my soldering skills) to use an end mill and make very wide traces (1/64″ in the picture above).

Fab, Fab & Fab

I started by using an engraving bit, which in turn led to some really nice-looking circuits that were literally impossible to solder, plus it got me injured while putting the bit on; yes, it’s sharp, who would have thought, right?

After 4 versions of the same R, I was able to get it right using the 1/32″ end mill bit and a ton of patience. One thing this homework taught me well was how to solder, as every single one of the tryouts was actually soldered to completion. Some of the reasons the first tryouts didn’t work: using an engraving bit, using a double-sided board (both sides are conductive, I know that now), and poor soldering skills, which led some of the boards to essentially not work.

Coding and making it light up

I decided to use a pot to drive the LEDs. Instead of having it turn the LEDs on and off, I decided to continue with the breathing effect from the previous week and add an off feature when the chip reads a pot value of less than 10; all other values control the ‘breathing speed’ of the effect. Here is how the part that controls the LEDs looks:
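
(The original snippet isn’t embedded in this post; this is a reconstruction of the logic described above, with assumed pins.)

```cpp
// Pot below 10 = LED off; anything above sets the breathing speed.
// Pin choices are assumptions (pin 0 is PWM-capable on the ATTiny85).
const int POT_PIN = A1;
const int LED_PIN = 0;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int pot = analogRead(POT_PIN);
  if (pot < 10) {
    analogWrite(LED_PIN, 0);  // off below the threshold
    return;
  }
  // Higher pot value = shorter step delay = faster breathing.
  int stepDelay = map(pot, 10, 1023, 30, 2);
  for (int b = 0; b <= 255; b += 5) {  // fade up
    analogWrite(LED_PIN, b);
    delay(stepDelay);
  }
  for (int b = 255; b >= 0; b -= 5) {  // fade down
    analogWrite(LED_PIN, b);
    delay(stepDelay);
  }
}
```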

Demo

Here is a demo video of the R in action.

Interactive Music – Rethinking the score

For my score realization, I chose to focus on making generative music using ‘unconscious’, or rather ‘subconscious’, interaction. I started researching how I could use browsing history to generate a score based on the user’s choice of online content. After reviewing the ideas in class, getting paired up, and explaining my idea to a partner, I realized that this idea was still vague to me, and given the nature of the assignment, I decided to choose a different route with an idea that was clearer to me.

The Score

I have created a couple of music visualizers over the past years, and one thing that always attracted me was the concept of driving visual effects, animations, and visual occurrences with data generated from audio analysis and/or MIDI.

A web audio visualizer

One thing that always intrigued me was the ability to ‘reverse engineer’ the audio-reactive approach used to create visualizers into something best described as ‘sonification of graphics’: essentially, using graphics to generate audio.

The Execution

For my realization, I chose to focus on a graphical simulation system that generates the score. More specifically, I used a ‘metaball‘ simulation to define the behavior of a 3D sphere in space, which is also responsible for generating the score. As I had a certain style I was aiming for, I used samples played through a granular sampler built with Tone.js. That way, the generation of the score is to some extent pre-determined, but the user, via the simulation, still controls the texture and the turning on and off of the samples.

The result is a granular-sampler-based composition where the user gets to ‘spawn’ metaballs into the simulation, which in turn triggers different samples in the composition. The position of a metaball in space changes the pan of its sound in the composition.

Homemade Hardware – Week 02

For this week’s homework, we had to finish soldering our chip-programming shield for the Arduino and design our circuit in Eagle. I started by practicing soldering, time and time again (and got burned multiple times in the process), and after finishing the shield we started in class, it looked something like this:

Shield with ATTiny85 socket for the Arduino Uno

During the weekend, I got my soldering station delivered (thanks, Amazon) and decided to re-watch the videos in the lab, plus the NASA soldering tutorials, and practice more. To do so, I used an Arduino proto shield and soldered two sockets, one for the ATTiny85 and one for the 328, so I have scaling options in the future.

As these proto-boards are double-sided, I kept most of the wiring on the bottom, resulting in a clean shield that is easy to carry and to put on and take off. One thing I would improve is having female headers on the I/O, so that I can keep the shield on while I use the Arduino as a power source for other circuits. Even though it could use a couple of further mods, the shield actually saved me a lot of time in programming the chips, since using it feels more like ‘plug and play’ instead of reorganizing the breadboard whenever something doesn’t work.

After the soldering, I started practicing Eagle in order to create the board for the next class. I actually found the workflow of typing commands to Eagle quite meditative: you dive into a long board design session (with the headphones on) and discover a couple of hours have passed (oh no). As discussed in class, I started with the schematic view and looked for an ATTiny85 schematic symbol online, but all the ones I found were surface-mount, and as we will be covering that topic down the road, I decided to use a standard 8-pin socket, which in terms of holes fits perfectly with the ATTiny85 socket.

My circuit in schematic view in Eagle

After thinking about how the sensor would connect, I suddenly realized that since it has its own logic board, it would probably make the most sense to have headers that connect to the appropriate place in the circuit but allow me to mount the sensor’s logic board in a different location (nice! now to the board design part).

Board view for my circuit

After trying many different configurations, I landed on this one, which fits all the electronics needed in the circuit into a rather small form factor while giving appropriate space between lines. One thing I did realize while designing in board view is that my VCC and GND were just symbols, meaning I didn’t have any way to ‘wire’ them up, so I had to go back into schematic view and create headers where power and ground actually connect to the circuit.

Hello Homemade Hardware – Week 01

After this semester’s round of confusion, class swaps, and drops, I joined the Homemade Hardware class. Prototyping, building, and thinking about logic in hardware was something I really enjoyed doing last semester, so it seemed like a direct line along which I could keep pursuing knowledge in ‘hardware-land’. Enough talking, off to building!

I started by reviewing some of the labs, essentially working out my soldering, drawing, and ‘Arduino’ing’ skills that had faded a little since winter break started. I chose to use the ATTiny85 chip and decided I would use a moisture sensor to drive two status LEDs: one that shows whether there is enough moisture in a plant’s soil, and one to show that it needs more water. After burning the bootloader and following the tutorial on programming the ATTiny using the Arduino as the ISP, I was able to get the Blink demo sketch running (YAY, great success, well, kind of).

Blinking LED using the ATTiny85 chip

After getting it to work, I decided to test my setup, and so used the Arduino to program a small application that reads a moisture sensor on analog port 1 (A1) and creates a smooth ‘breathing’ LED effect on one of two status LEDs, depending on the readings.
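
A sketch of that logic (the LED pins and the threshold are assumptions; the real threshold came from watching the serial monitor):

```cpp
// Moisture above the threshold breathes the "happy" LED,
// below it breathes the "needs water" LED.
const int SENSOR_PIN = A1;
const int LED_OK = 0;       // PWM-capable on the ATTiny85
const int LED_DRY = 1;      // PWM-capable on the ATTiny85
const int THRESHOLD = 500;  // placeholder, calibrated per sensor/soil

void breathe(int pin) {
  for (int b = 0; b <= 255; b += 5) { analogWrite(pin, b); delay(10); }
  for (int b = 255; b >= 0; b -= 5) { analogWrite(pin, b); delay(10); }
}

void setup() {
  pinMode(LED_OK, OUTPUT);
  pinMode(LED_DRY, OUTPUT);
}

void loop() {
  int moisture = analogRead(SENSOR_PIN);
  if (moisture > THRESHOLD) {
    digitalWrite(LED_DRY, LOW);
    breathe(LED_OK);
  } else {
    digitalWrite(LED_OK, LOW);
    breathe(LED_DRY);
  }
}
```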

After getting the application to function appropriately and measuring the moisture sensor using the Arduino’s serial monitor, it was time to build the circuit using the ATTiny85.

The design for the circuit I wanted to build

After all that was in place, I only had to implement the code using the correct naming conventions for the ATTiny85 chip’s I/O (which sounds trivial but took me a reasonable amount of time to figure out), upload the code using the Arduino as an ISP, and assemble everything on the breadboard.

Here is a video demonstrating the final circuit doing its thing.

“This song ain’t my song” – Manifesto

Growing up, I found my way into the musical world through classical piano studies and, down the road, percussion and drums. Later on, I found myself cleaning and operating a rehearsal room, which was a dream come true: I get to work helping people create music? YES, YES, and YES!

After some time, I started studying cinema and shifted my interest in music into a broader interest in sound in cinema and media, which would lead me to experiment with programming and audio-reactive art, and also to write a thesis paper about interactivity, listening modes, and the similarities between cinema and interactive experiences from a sonic point of ‘listening’, which I ended up naming “The choices we hear”. Original, right?

*Disclaimer: I have tremendous respect for the craft of writing, producing, and recording music; everything mentioned in this post is just my ‘five cents’ on personal aspirations for experiments in music creation, publishing, and distribution.

So what’s wrong?

From a creation, publishing, and distribution point of view, even though all three topics mentioned have transformed as the web developed (just one example out of many), the points listed below are still things I personally think are worth investigating:

  • To some extent, we have lost fandom; artwork, for instance, hasn’t been ‘reinvented’ to accommodate the new mediums available.
  • Even though a major portion of music is consumed in a streaming model, it is still a one-sided dialogue; the listener has no ability to affect, control, or ‘personalise’ their listening experience.
  • A general note: since the digital distribution services brand themselves as the messenger, I personally feel that the gap between the artist and the listener is actually getting bigger; “it’s just a catalogue, look me up”.
  • One major point from a creation point of view is that one of the bigger trends in music creation has been emulation, by which I mean emulating analogue gear and dynamics processors. While I do think this rise has brought many interesting and well-made tools, I rarely come across experimental tools that aim to break the paradigm of ‘music-making’. Whether it be algorithmic composition, experimental sound design, or bizarre sound processing, these are way outside the mainstream, leaving little-to-no financial reason for developers to investigate these options.

Sound Particles is a prime example of one of the few experimental and interesting sound processing and design tools. The Roli Seaboard and the Therevox (http://therevox.com/) are prime examples of a wild reimagining of the fundamental concepts of a keyboard.

So what’s next?

During this semester I would like to create musical and sound experiments using the following as guidelines:

  • Inclusion, Inclusion, Inclusion – use the web for what it’s good at: making things accessible to the masses. Design, implement, and code things so they can be used intuitively; music COULD be for everyone.
  • Avoid sticking to the creation<->publishing<->distribution paradigm; the experiences should mix these elements into one cohesive piece, where creation is part of distribution.
  • Continuing point one, examine how music can be made using inputs other than music-theory knowledge (continuing the line of my ICM final, Forever).
  • Examine possibilities for sound creation and shaping using a 3D environment (the opposite of audio-reactive, maybe graphics-reactive synthesis?)
  • Try getting good sleep, because it helps explain yourself to yourself

With this tone in mind, I look forward to a semester full of sonic experimentation.