pcomp

All posts in this category.

DMS – Physical Computing & Csound Final

For my Physical Computing final project, I started by sketching ideas for things I am interested in making (more on this in this post). After some thought, I decided to mix two courses that really inspired me this semester: 'Physical Computing' at ITP, and 'Software Synthesis' at NYU's Music Technology department. After brainstorming ideas (special thanks to Dror Ayalon & Roi Lev for that), I decided to reimagine two concepts I am fairly interested in that touch both courses: synthesis and modularity. That's how DMS – Different Modular System – was born (at least conceptually).

Sketches of early stage ideas for the project
  • Conceptualisation:

I started by examining the features I think make modular synthesizers (and modular systems in general) powerful, both in terms of synthesis and interaction, alongside their downsides, and so the list began.

From http://gizmodo.com

Advantages:

  • Flexibility – modular systems are, by definition, flexible instruments, as they tend to allow more than a single configuration with the same components. This carries into the audible realm too, as you can process signals through the same components in a different order and get different sonic outcomes (e.g. filtering a signal before a delay vs. filtering after it).
  • Interaction – modular systems tend to invite more interaction from the user's side, which can, and at times does, translate into a sense of customized ownership over the device. Simply put, rather than 'a synth', it's 'my special recipe using the synth'.

Disadvantages:

  • Terminology – This is a point I'll expand on when discussing user play-testing, but it is worth mentioning here that the synthesizer arena in general, and the modular one specifically, tends to keep terms close to (at times) their mathematical and electrical origins, which sounds cryptic to most people and distances some users from even trying these systems.
  • Form Factor – As powerful as modular synthesizers are, they tend to be rather big, installed devices. Thinking about these machines in a portable form factor was part of what I wanted to explore.

*It is worth mentioning that the list presents my perspective and does not imply that the current state of modular synthesizers is wrong or not valid; rather, it is me trying to reimagine it in a different way.

Some of the references I used conceptually were:

  1. littleBits Korg edition – a 'bare-bones' modular synth kit meant to serve as both an educational and a musical modular system.
  2. Palette Gear – a modular controller ecosystem for software and MIDI control.
  3. Korg MS-20 Mini – a semi-modular synth 'do it yourself' kit.
  4. Moog Werkstatt – another semi-modular synth 'do it yourself' kit.
  • Design:

This project imposed some big fabrication challenges for me, so I decided to start the 'making' process with the design and fabrication of the units.

Magnets, magnets and magnets!

Upon play-testing and discussing the project with both Danny Rozin and Ben Light, I decided I would start by building two modules that work: the main hub (i.e. 'the Brain') and a second effect unit (i.e. 'the Mouth'). In the first iteration I had around 6 magnets on each of the boxes, and after a Eureka moment I decided to have an Arduino in each box and let them communicate over serial. The initial bill of materials looked something like the following:

I started the process by cutting acrylic top panels, leaving holes the exact diameter of the magnets that would later on be glued to the top. I used the protective plastic rings that came with the magnets (i.e. separators) to elevate the magnets 1mm above the plastic panel so they always connect without disturbance (thanks, Ben Light).

Final drilling of the cables soldered to a spring that pushes against the magnets

And finally, I soldered the wires onto springs that push against the glued magnets and pass the electrical current. One note on this: as Ben Light mentioned to me, soldering directly to a magnet tampers with the magnet's properties and therefore might introduce unknown interference into the electrical signal. The spring method, however, works fairly well; just remember to use epoxy.

As I mentioned before, terminology is something I really wanted to simplify into icons and language people feel comfortable interacting with (or that at the very least doesn't scare them away). For the synth's interface I went through multiple iterations, each time changing the icon or the text and letting people react to it. This is how the final interface looks before the actual cutting and etching process:

Last but not least, here are the two units after the finished fabrication process

Physical computing process:

The main reason for choosing the Arduino Mega for the brain module was its ability to both communicate over serial with the computer (in my case a Csound application) and use an additional hardware serial port (pins 16/17) to communicate with another, smaller Arduino in the mouth module.

The code below demonstrates the communication function on the brain module for sending events to the computer. I use single-byte messaging and divide the 0-255 range among all the synth parameters.
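The real communication function is in the Arduino code linked below; as a sketch of the idea, in plain C++ with made-up sub-ranges, each parameter owns a slice of the 0-255 byte range, so a single byte carries both which parameter changed and its new value:

```cpp
#include <cstdint>

// Illustrative sub-ranges: each of four imaginary parameters owns a
// slice of the single-byte space. The real parameter list and bounds
// are in the actual synth code; these are placeholders.
const int kBounds[5] = {0, 64, 128, 192, 256};

struct Decoded {
    int param;    // which parameter the byte addresses (0..3)
    float value;  // its value, normalized to 0.0-1.0
};

// Decode one received byte into (parameter, normalized value).
Decoded decodeByte(uint8_t b) {
    for (int p = 0; p < 4; p++) {
        if (b < kBounds[p + 1]) {
            int lo = kBounds[p];
            int span = kBounds[p + 1] - lo - 1;
            return { p, (float)(b - lo) / span };
        }
    }
    return { -1, 0.0f };  // unreachable: every byte falls in some slice
}
```

On the receiving side (Csound here), the same table of bounds reverses the mapping, which is what keeps the protocol down to one byte per event.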

A feature that really helped me, related to both fabrication and implementation, was putting a serial switch in the second module. That way I can change between module-to-module communication and uploading new code to the second module without disconnecting the serial lines.

  • Building the synth:

As this is my final project for both Physical Computing and Software Synthesis, I used Csound, a synthesis library and engine, to build all the actual sound-generating logic.

I started by laying down the communication logic over the serial port, using Csound's serialBegin and serialRead opcodes (opcode being the Csound name for logic functions built into the library).

After laying down the serial communication, I started building the oscillators using Csound's poscil opcode, while storing the actual wavetable values for the waveshapes in an ftable.

During the making of the actual synth I discovered some functionality I was really missing; for example, Arduino's map() function didn't have any sibling on the Csound side, so I challenged myself to extend Csound and build two mapping functions, which I used to map the incoming Arduino serial values to different parameters in the Csound instrument I created.
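The actual opcodes live in the Csound program linked below; the math they share is the same linear rescaling Arduino's map() performs, shown here in floating point (the name mapf is mine):

```cpp
// Floating-point sibling of Arduino's map(): rescale x from the range
// [inMin, inMax] to [outMin, outMax] linearly. This is the core of what
// a custom Csound mapping opcode needs to compute; opcode registration
// boilerplate is omitted.
float mapf(float x, float inMin, float inMax, float outMin, float outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```

For example, an incoming serial byte (0-255) can be rescaled to a filter cutoff in Hz with mapf(value, 0, 255, 20, 20000).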


After a lot of explanations, here is an image and a demo composition using the final synth:

This is my final composition made entirely with the synth using Ableton Live as a looper and a MIDI controller hooked up to the synth.

  • Links:

Link to the Arduino Mega and Uno code

Link to the Csound program

  • Moving forward – things I would like to improve next:
  1. I would like to try to implement the synth on an embedded system, so that everything can run from the box itself.
  2. I would like to build more modules to demonstrate more aspects of the synthesis process (e.g. 'the Spine' as an arpeggiator).
  3. Implementing a better method to detect when the modules are connected to and disconnected from the brain.

*Special thanks to Tom Igoe and Jean-Luc Cohen; this process has both taught me a lot and challenged me throughout. Your help and guidance made this possible!

Physical Computing – Final Project / Proposal

For my final project in Physical Computing, I started by brainstorming all the ideas I had put away during the course and some inspirations that actually grew from class readings and discussions.


List of possible ideas included:

  1. DJ Midi controller
  2. Ferrofluid art installation
  3. Modular synthesizer – Building a different modular synth that allows you to assemble the instrument of your liking physically (e.g by joining pieces together).
  4. Jambot (Audio to midi) – I was thinking of creating a small listening bot that allows you to jam, and picks up audio that gets converted into MIDI notation in realtime.

I decided to keep only options 3 and 4, and after breaking down the devices into abstracted interactions, I found myself leaning towards the modular synthesizer idea.

Concept

I am fascinated by modular synthesizers, but they always struck me as devices that were actually designed 'to feel complicated', and by way of that, they tend to scare away people who are not 'into' synthesis.

My list of reference sources included:

Playtesting the idea:

I started designing what I refer to as 'DMS' – Different Modular Synthesiser. As we had to play-test our chosen idea, I decided to laser cut a simple interface and a couple of 'modules', let people intuitively try to connect things, then write down the most common patterns and try to figure out which modules make sense and which don't.


My initial idea for the system was having:

  1. A brain module that holds the basic functionality (wave shape, master volume).
  2. Control modules that magnetically link up to the sides of the brain and control elements like envelope, LFO (low-frequency oscillation) and effects. The control modules were essentially variable resistors in the form of potentiometers, sliders and FSRs (force-sensing resistors) that could connect to each of the parameters on the sides of the brain in whatever way suits the player's liking.

Let’s playtest

Playtest notes:

  1. Generally, people didn't understand words such as envelope, LFO and wave shape types. Devise a new terminology that doesn't insult experienced users but doesn't scare away hobbyists and non-musicians.
  2. Use iconography to emphasize sound concepts as something visual (perhaps use metaphors that people can relate to).
  3. People didn't understand immediately that the modules are controls; think about using the modules as sound components instead of control values 'on the brain'.
  4. Danny Rozin mentioned fabrication as a main point: fabricating modular systems is a tough thing to do (I actually want my final to deal with fabrication; it's challenging for me).

**Update – After speaking with Ben Light about the fabrication of the device, I decided to build two modules (i.e. a brain and one more) which will connect using neodymium magnets to transfer electricity between the two.


Physical Computing – Midterm


For our midterm project, I was paired with Amanda MJ Lee. We sat down and started sketching out ideas, and after a while something started looking very obvious: most of our ideas dealt in some way or another with sound & music. So we decided to make a musical instrument: luma, a color-reactive audio synthesizer that uses color plates (or discs) to create musical patterns in real time.

Sketches of different ideas after brainstorming
  • Conceptualisation:

One of the points that kept coming up while discussing ideas was the ability to take a well-known music consumption tool (e.g. a turntable, gramophone or phonograph) and repurpose its interaction to create a musical instrument. This discussion turned into our project, which uses the turntable as a symbolic relation to music, but repurposes its interaction to serve as a musical creation tool.

We decided to build upon this idea, but instead of having the whole device motorized, divide the instrument into two decks, rhythmic and melodic, in which only the rhythmic one is motorized, requiring the participant (or better yet, musician) to learn and develop their own way of playing it.


  • Design:

With the aforementioned in mind, we started thinking about visual references. We wanted the instrument to feel timeless (classic), yet have a very organic feel, since the color plates would introduce playfulness to the visual experience of using the instrument. Braun was a good visual reference.

Visual reference – Braun personal turntable

With that in mind, we started assembling a 'look' and decided we were going to use a wooden box and a beige acrylic cover to convey this 'classic look'.

  • From sketches to fabrication:

What we used (i.e our bill of materials):

  1. Container Store drawer organizer, bamboo (6″ x 15″ x 2″ h) – $7.99 – link
  2. Arduino Mega Rev 3 – $45.95 – link
  3. 2x color sensors – $9.99 each – link
  4. Linear potentiometer (B50K) – $2.99 – link
  5. 60mm slider (Phidgets 1112) – $11.00 – link
  6. 5 sheets of acrylic in different colors – $60.00
  7. Some basic breadboard wires

After we got all the parts, we started thinking about how we would assemble our enclosure. During this discussion we realized there were a couple of challenges everybody around us seemed to be facing too, and so Amanda suggested it would be cool to try to tackle these as well, for the benefit of everyone. And so to the 3D printer we go!

Measuring all the components and prototyping design ideas with cardboard

We 3D modeled all the mounts inside the case that hold the components, but we also decided to use the 3D printer to tackle design issues. For instance, as we needed to drill a USB hole in the wooden box to connect the Arduino, we realized drilling a square hole was mission impossible for us, so we decided to design a circular USB Type-B adapter that would fit inside the hole the drill press created (the adapter can be found in Dror Ayalon and Mint's awesome project Video Manipulations too – YAY, we helped ITP).

* All 3D models used in the project are available here

We used a hot glue gun to glue the 3D printed mounts into the enclosure and started placing the sensors and components using double-sided (really sticky) tape. Following that, we started designing the interface and controls and laser cutting the actual discs. One lesson learned from that process: when you prepare down to the smallest detail, it actually becomes a very enjoyable one.


  • Coding & Implementation:

We chose to have the Arduino analyze all the inputs from the sensors, potentiometers and sliders, and communicate with the computer over the serial port. With that in mind, we placed all the logic on the Arduino side first, and later on moved to creating the synthesizer.

Arduino breadboard illustration

We started coding with only one deck assembled, as we were multithreading design and code, trying to touch up and implement together. The fact that we had only one deck available at the start actually gave birth to a cleaner coding approach, where we broke the functionality into small functions that each deal with one part of the functionality chain. Here is our final loop function, with comments, which lets us control every step of the process and even scale very fast (one deck or two is just a matter of commenting or uncommenting a function).

Another thing we implemented at this point was the ability to tell when a color has just started. We did this with the same logic as a button state change, just using a color range as the thing being changed into/out of, which is handled by matchLastColorState();
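The actual matchLastColorState() is in the linked Arduino code; the pattern it describes can be sketched like this (names are mine), mirroring the classic current-state vs. last-state button check:

```cpp
// Edge detection over a color range: report true only on the pass where
// the sensed color first enters the target range, not on every pass it
// stays there - exactly the button "state change" pattern, with a range
// match taking the place of a pin read.
bool lastInRange = false;

bool colorJustStarted(bool nowInRange) {
    bool started = nowInRange && !lastInRange;
    lastInRange = nowInRange;
    return started;
}
```

This is what lets a disc trigger a note once per colored region instead of retriggering continuously while the sensor sits over it.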

A couple of things we realized on the way were:

  1. If you code individual blocks of logic, it is easier to debug them separately.
  2. If you communicate over serial with binary data, it is useful to have a function you can switch on and off to debug with strings instead – that way you can actually read it.
  3. When using sensors that are affected by ambient factors (light, sound, etc.), prepare to test extensively (and then test some more).
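Point 2 can be sketched like so (the flag and function names are mine, and the serial port is modeled as a returned string so the logic is testable off-device):

```cpp
#include <string>
#include <cstdint>

// A debug switch for serial output: send the raw byte normally, but flip
// one flag to get a human-readable string instead. On the Arduino the
// return value would go to Serial.write()/Serial.print().
bool debugMode = false;

std::string sendValue(const char* label, uint8_t value) {
    if (debugMode) {
        // Readable form for the serial monitor, e.g. "red: 142"
        return std::string(label) + ": " + std::to_string(value);
    }
    // Binary form: the single raw byte the synth actually parses.
    return std::string(1, (char)value);
}
```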
  • Building the synthesizer:

As I am also taking a software synthesis course this semester in Steinhardt's Music Technology department, I suggested we use Csound, the system used in the course; Amanda was in, and so we started writing the synth in Csound. Some of the challenges we had to face were the binary serial communication between the Arduino and Csound, building interesting instruments to play, and deciding on the logic by which we trigger different notes, so the same note doesn't repeat whenever a color is detected.


We ended up building a synth that uses 10 oscillators and 5 envelopes to create rich ambient and percussive sound textures, iterating over a pentatonic scale which makes the playing experience more engaging.

The code for the Arduino app can be found here, and the code for the synthesizer can be found here

Here is the final graphic layout we made to explain the project:


Demo video:

And here is the live demo we did in class on presentation day:

Physical Computing – Lesson#7

Following Synthesis, I had so many ideas that I decided to scale up the conversation between the Arduino & p5. After reviewing the labs, I decided to build a drum machine that uses buttons on the Arduino to trigger oscillators in the p5.sound library that mimic the sound of electronic drums (inspired by the great 808 drum machine).

I started by sketching the idea of the board.


From that point, I started assembling the breadboard.


After finishing assembly I started writing the software on the Arduino. Since I am using a lot of the same component (buttons), I decided to use arrays to store the button pins, current states and last states of each button. It looks something like this:

Once I read a value that is different from the last state && has a 'HIGH' value, I serial print that button as 1, 2, 3, 4 or 5. In p5 I pick up these values and assign each of them to a specific function – in essence, the on/off trigger of a specific oscillator by a specific button on the Arduino.
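The array-based edge detection described above can be sketched like this (the freshly read states are passed in where the Arduino sketch would call digitalRead(); the function name is mine):

```cpp
// Five drum-trigger buttons tracked with parallel arrays. triggered()
// compares the fresh reads against the stored last states and returns
// the 1-based button number on a LOW->HIGH edge, or 0 when nothing new
// was pressed - that number is what gets Serial.print()-ed to p5.
const int kNumButtons = 5;
int lastState[kNumButtons] = {0, 0, 0, 0, 0};

int triggered(const int (&current)[kNumButtons]) {
    int hit = 0;
    for (int i = 0; i < kNumButtons; i++) {
        if (current[i] != lastState[i] && current[i] == 1) {
            hit = i + 1;
        }
        lastState[i] = current[i];  // remember for the next pass
    }
    return hit;
}
```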

Time to demo

Link to github

Synthesis // icm + physcomp = ♥

To kick off the long weekend we had the Synthesis workshop, and even though at first I was sure we were going to deal with sound synthesis, we actually synthesised the Intro to Computational Media and Physical Computing courses, which boils down to endless possibilities.

I was paired with Cristobal Valenzuela, and we started by researching which sensor we would like to use for our 'Physical Variable' challenge. We decided to use the ColorPAL RGB sensor and send the red, green and blue data over to p5. While figuring out how best to serve the data over serial to p5, we came up with the simple idea of making a game for children, where they get the name of a fruit and have to find something with the same color as the fruit and scan it.

Demo time:

To determine whether the correct color is being sensed, we used p5's dist() function in the following manner:
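The embedded snippet lives in the linked code; the idea is that dist() is plain Euclidean distance, so treating (r, g, b) as a point in 3D space, a scan "matches" when the sensed color sits close enough to the target color. A sketch of that logic, with the distance written out and an illustrative threshold:

```cpp
#include <cmath>

// Color match via Euclidean distance in RGB space - the same number
// p5's dist(r, g, b, targetR, targetG, targetB) would return.
// The threshold of 60 is an illustrative guess, not the value we used.
bool colorMatches(float r, float g, float b,
                  float targetR, float targetG, float targetB) {
    float d = std::sqrt((r - targetR) * (r - targetR) +
                        (g - targetG) * (g - targetG) +
                        (b - targetB) * (b - targetB));
    return d < 60.0f;
}
```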

Some of the things we didn't have time to build, but discussed and agreed could be interesting to try, are timing the actual 'search & scan' process, keeping scores and varying difficulty levels.

Link to github

Physical Computing – Lesson#3 – Observation

Overview

For this week's interaction observation assignment, I chose to focus on the LinkNYC booth that was recently installed close to my home, since I found the device to be an interesting case study that blurs the lines of its seemingly obvious main use.

link-nyc-gear-patrol-slide-1

Interaction observation:

I chose to focus my observation study on a young couple's interaction with the machine over a period of approximately one minute. The man, equipped with a camera, and the woman, holding a water bottle and a bag, approached the machine and instantly started touching the screen. After one tap a map appeared, and the woman looked at the screen, observing as the man moved his fingers and the map moved with them. After about 20 seconds of this, the man pointed over to 72nd St. and started a conversation with the woman, during which she pointed in the other direction. As they conversed, they often referred back to the screen for reference, until they started walking towards 72nd St.

My first assumption was that the couple were tourists visiting and exploring the city. Secondly, but probably more importantly, it seems the couple initially approached the device due to its similarity to a handheld smartphone in both design and interface. Since they most likely have experience with that 'interaction language', their use of the device started the moment they approached it, i.e. a very short learning curve.

Connecting this observation to Bret Victor's 'A Brief Rant on the Future of Interaction Design', which defines a tool as something that addresses human needs by amplifying human capabilities, the LinkNYC booth seems very suitable for that category. The man's ability to orient around NYC is limited by his knowledge of the city's neighborhoods, streets and routes, and in this sense, given that the booth is located right on the street, it amplifies his orientation by providing the missing knowledge. Furthermore, given Victor's emphasis on our hands as the means of interacting with future interfaces, LinkNYC gains a lot from people's familiarity with the smartphone interface and its bundled applications, making the touch screen the perfect choice in that sense.

General observation notes:

  • The machine itself is a noticeably big metal container with ad spaces on both sides facing the street, and a bevel holding a touch screen, a metal numeric keypad and a headphone jack.
  • The device has multiple features – a free WiFi distributor, a built-in tablet and a charging port, to name the most popular ones.
  • It seems the device was conceptualised to solve two main types of problems: a free NYC WiFi network and an information hub for people (maps and an internet browser were the most common uses). With that in mind, it solves a local problem for NY residents but also helps tourists orient themselves in the city.


Physical Computing – Lesson#3

For this week’s class, I kept practicing the lab exercises, plus started experimenting with Arduino code & the Arduino IDE.

Digital I/O:

I started with building simple applications – turning an LED on/off, using buttons and counting presses – predominantly learning the digitalRead() and digitalWrite() functions.

Assembling lights and buttons

Analog I/O:

Through reading and writing digital values (essentially high & low values), I kept on learning how analogRead & PWM work, and found a house project I was excited about doing that incorporated all of these elements.
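The bridge between the two is just a range conversion: analogRead() returns a 10-bit value (0-1023) while PWM via analogWrite() takes an 8-bit duty cycle (0-255). A minimal sketch of the conversion (the function name is mine):

```cpp
// Convert a 10-bit analogRead() value (0-1023) to an 8-bit PWM duty
// cycle (0-255). Integer division by 4 drops the two extra bits.
int toPwm(int analogValue) {
    return analogValue / 4;
}
```

On the board this is the one-liner that lets a potentiometer reading drive an LED's brightness directly.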

The Moisture Checker:

I decided to create a moisture checker that lights up one of three state LEDs to notify you whether the soil moisture is good, OK or bad. Since my actual plants are on my balcony, having an LED on at all times would both waste power and possibly annoy the neighbors, so I decided to incorporate distance sensing to turn the LEDs on or off.


I started by defining the goals and the scenario in which the thing would be used, and once that was done I moved on to sketching the schematic for the assembly.


For the LEDs themselves, I decided it would be calmer to use a 'breathing' effect, having them fade from on to off in a wave-like way. With that in mind, I used a loop that iterates between 0-255 and turns the correct LED on and off.

void ledEffect(int moistureState) {
  // Mimic a sine-like breathing effect for the LEDs.
  // Is there any way I can use an actual wave generator to not delay the loop?
  if (counter == 0) {
    // Fade up: the loop exits with counter == 255
    for (counter = 0; counter < 255; counter++) {
      analogWrite(moistureState, counter);
      delay(10);
    }
  } else if (counter == 255) {
    // Fade down: the loop exits with counter == 0
    for (counter = 255; counter > 0; counter--) {
      analogWrite(moistureState, counter);
      delay(10);
    }
  } else {
    counter = 0;
  }
}

The LED breathing effect

Reading the moisture sensor over serial:

Questions for class based on assignment:

  1. I am using a 'for' loop to iterate over values ranging from 0-255 in order to create the breathing effect for the LEDs. That said, I am using the delay() function, which delays the whole loop while the LEDs are fading. Is there any other way of achieving this effect – perhaps a sine wave generator I can map to the values above without delaying the loop?
  2. I find the sensor readings to be very jumpy; what would be a good technique to 'filter' the values I get back from the moisture sensor?
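One answer to the first question is to derive the brightness from elapsed time instead of looping with delay(); a hedged sketch (the function name and period are mine, and ms stands in for millis()):

```cpp
#include <cmath>

// Non-blocking "breathing": compute the LED brightness directly from
// elapsed milliseconds each time through loop(), so nothing blocks.
// A full breath takes periodMs; output stays in the PWM range 0-255.
const double kPi = 3.14159265358979323846;

int breathingLevel(unsigned long ms, unsigned long periodMs) {
    double phase = 2.0 * kPi * (double)(ms % periodMs) / (double)periodMs;
    // (1 - cos)/2 ramps smoothly 0 -> 1 -> 0 over one full period.
    return (int)std::lround(255.0 * (1.0 - std::cos(phase)) / 2.0);
}
```

In loop() this becomes analogWrite(pin, breathingLevel(millis(), 2000)); and the loop stays free to read sensors between calls.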

Physical Computing – Lesson#2

For the second lesson of P.Comp, I kept on practicing and investigating everything covered both in class and in the labs. By drawing ideas, trying them and failing every other time, I tried assembling different combinations of LEDs using resistors, switches, a potentiometer, a slider (fader) and even an attempt at using a transistor & light sensor.


    • Basic LED Board:
      I started out by just making a very simple and straightforward LED circuit using a 220Ω resistor, powered by +5V from the Arduino R3 board.

    • LED & Button Switch:
      As this week's assignment involved using switches, I took some time to investigate some of the logic behind switches and made a very simple button switch that turns on an LED while pressed. Around this point I also started working with external power supplies in the shop, which was frightening at first (a big white block on my table?) but ended up being very easy and convenient to work with.

    • LED & Potentiometer:
      After building a simple button switch, I moved on to experimenting with potentiometers and sliders, and made an LED controlled by both a Bourns B203 potentiometer and a Phidgets 1112 slider.

To sum this week up in a visual manner, here is a compilation of the different circuits I tested and learned from throughout the week:

Physical Computing – Lesson#1


We kicked off Physical Computing with an in-class discussion about physical human interaction, following an in-class imaginative exercise. After following that discussion with further reading of Crawford's 'The Art of Interactive Design' (Ch. 1-2) and Bret Victor's 'A Brief Rant on the Future of Interaction Design', I arrived at a rather different perspective on human-computer interaction, which at its core is somewhat of a synthesis of the two.

Starting with Crawford's definition of interactivity and his systematic ranking of the levels of engagement, and ending with Victor's definition of a tool and its use case, it seems to me there is a factor missing from human-computer interaction in the aforementioned texts: the machine's ability to learn patterns and mimic human decision-making strategies that alter the conversation's characteristics.

Interactivity

In essence, I think Crawford's argument on interactivity as 'conversation' makes total sense, but it overlooks machine learning's ability to improve its 'listening' and 'speaking' abilities, customising and growing over time, resulting in a better 'conversation companion', which in turn increases human feelings like empathy and comfort (this of course applies to the negative aspects as well). That being said, the discussion around machine learning currently seems software-based, and I can only assume we will see more physical applications of hardware transformations driven by machine learning software. With that in mind, I would like to further study and research the possible applications this world opens up for musical instruments, composition and conventions further down the line.