mi meets Copenhagen

Jérôme and James recently had the opportunity to meet, work (and drink beers) with researchers from Aalborg University in Copenhagen. During a week spent mostly within the Multisensory Experience Lab, we exchanged ideas about physical modelling and creative interaction in VR environments, presented our mass-interaction tools, experimented with some resolutely weird stuff (e.g. Silvin’s infamous 4D string)… and even made a small appearance in an “SMC by Night” performance!

For the occasion we decided to test the multitouch Sensel Morph devices that were available in the lab to drive sound-generating or visual mass-interaction physical models (resulting in some very last-minute patching and coding). We documented the process with a couple of videos, shown below:

James does visual stuff (and it's a first)

In this model, a propagation mesh, composed of 37,000 3D oscillators and springs and running in Processing, is excited by the pressure applied to the Morph device at any number of points. The multitouch data is sent to the physical model over OSC.
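
For the curious, here is a minimal Processing sketch of what such an OSC-to-force path can look like, using the oscP5 library. The address pattern (“/touch x y pressure”), the port number, the mesh dimensions and the force scaling are illustrative assumptions on our part, not the exact patch from the video:

```processing
// Hedged sketch: receiving multitouch data over OSC and turning it into
// per-node forces on a mesh. Requires the oscP5 library.
import oscP5.*;
import netP5.*;

final int MESH_W = 192;          // hypothetical mesh dimensions
final int MESH_H = 192;
final float FORCE_SCALE = 0.05;  // arbitrary force gain

OscP5 osc;
float[][] forceZ = new float[MESH_W][MESH_H];  // external force buffer, one cell per node

void setup() {
  size(800, 600, P3D);
  osc = new OscP5(this, 9000);   // listening port is an assumption
}

void oscEvent(OscMessage msg) {
  // assumed message layout: /touch <x> <y> <pressure>, all normalised floats
  if (msg.checkAddrPattern("/touch")) {
    float x = msg.get(0).floatValue();
    float y = msg.get(1).floatValue();
    float p = msg.get(2).floatValue();
    int col = constrain(int(x * (MESH_W - 1)), 0, MESH_W - 1);
    int row = constrain(int(y * (MESH_H - 1)), 0, MESH_H - 1);
    forceZ[col][row] += p * FORCE_SCALE;  // consumed by the solver each frame
  }
}

void draw() {
  background(0);
  // ... step the mass-interaction model here, applying and clearing forceZ ...
}
```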

The physical properties of the model can be changed in real time:

  • Oscillator stiffness (how fast each one returns to its equilibrium)
  • Oscillator damping (the regime when returning to equilibrium: under-damped, over-damped, etc.)
  • Mesh spring stiffness (how fast energy propagates to neighboring elements)
  • Mesh spring damping (how fast energy dissipates during propagation)

Using a MIDI controller to alter the parameters in real time, we can go from wave-like ripples to “plastic” engraving effects, or anything in between. The visualisation can also be configured to show only motion (i.e. the elongation of the mesh springs), for a more dynamic/abstract rendering.
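
On the MIDI side, here is an equally hedged sketch of how such a mapping can be wired up in Processing with The MidiBus library. The CC numbers and parameter ranges below are placeholders, and the four variables stand in for whatever the model actually exposes:

```processing
// Hedged sketch: mapping MIDI controller-change messages to the four
// physical parameters listed above. Requires The MidiBus library.
import themidibus.*;

MidiBus midi;

// hypothetical parameter variables, read by the physics loop each frame
float oscStiffness = 0.01;
float oscDamping   = 0.0001;
float sprStiffness = 0.02;
float sprDamping   = 0.0005;

void setup() {
  size(400, 200);
  MidiBus.list();                 // print available devices to the console
  midi = new MidiBus(this, 0, 0); // device indices are assumptions
}

void controllerChange(int channel, int number, int value) {
  float v = value / 127.0;  // normalise the CC value to [0,1]
  switch (number) {         // CC assignments and ranges are arbitrary choices
    case 20: oscStiffness = map(v, 0, 1, 0.0001, 0.1);  break;
    case 21: oscDamping   = map(v, 0, 1, 0.0,    0.01); break;
    case 22: sprStiffness = map(v, 0, 1, 0.0001, 0.2);  break;
    case 23: sprDamping   = map(v, 0, 1, 0.0,    0.01); break;
  }
}

void draw() {
  background(0);  // the physics/render loop would live here
}
```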

Jérôme plays physical models (and it's a first)

This second experiment is composed of seven distinct mini-models of beams (740 modules in total) running in Processing. Each one differs in its geometry and parameters and, while they all rely on the very same elementary concept, their differences open up a large variety of sounds. These beams are controlled via the Morph device (handled in another Processing sketch and mapped through OSC) and via the laptop trackpad.

Some of the mapping strategies are (see the sketch after this list):

  • The Morph device surface is virtually divided into seven columns, each one dedicated to one virtual beam model
  • The pressure applied to the Morph is directly mapped to the amount of energy repeatedly “pulsed” into a beam
  • The vertical position of fingers on the Morph controls both the internal viscosities of the beams (making those sweet transitions between wooden and metallic sounds) and the position where they are excited
  • The laptop trackpad controls a free mass that can interact with, bend, mute, or excite each of the beams
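
Here is a rough illustration of the first three strategies, again assuming a “/touch x y pressure” OSC layout and arbitrary ranges; the three arrays stand in for the actual model state:

```processing
// Hedged sketch of the column mapping: the Morph surface split into seven
// zones, pressure driving injected energy, vertical position driving both
// viscosity and the excitation point. Requires the oscP5 library.
import oscP5.*;
import netP5.*;

final int NUM_BEAMS = 7;

OscP5 osc;
float[] pulseGain = new float[NUM_BEAMS];  // energy "pulsed" into each beam
float[] viscosity = new float[NUM_BEAMS];  // wooden <-> metallic transition
float[] excitePos = new float[NUM_BEAMS];  // where along the beam to excite

void setup() {
  size(700, 200);
  osc = new OscP5(this, 9000);  // assumed listening port
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/touch")) {
    float x = msg.get(0).floatValue();  // normalised [0,1]
    float y = msg.get(1).floatValue();
    float p = msg.get(2).floatValue();

    // which of the seven columns was touched?
    int beam = constrain(int(x * NUM_BEAMS), 0, NUM_BEAMS - 1);

    pulseGain[beam] = p;                            // pressure -> energy
    viscosity[beam] = map(y, 0, 1, 0.00001, 0.01);  // y -> damping (range is arbitrary)
    excitePos[beam] = y;                            // y -> excitation point
  }
}

void draw() {
  background(0);
  // ... step the seven beam models here, reading the arrays above ...
}
```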

The audio post-processing is done in Ableton Live.