Saturday 22 March 2014

LiveLight | Collaboration with Oz Collective (Part Two)



Demonstrating the final version of LiveLight

In order to progress from Prototype Two to the final version (above), I had to find solutions to a few problems.

16-note Polyphony

To accommodate more users, I set out to increase the number of notes that could be played at the same time. I tried 24 and 32 instances of my synth patch - synthjuan - but it seemed that only 16 would work smoothly!
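For anyone curious what that voice pool amounts to in plain code, here is a rough Python sketch of the idea - a fixed pool of 16 voices, with extra notes dropped when every voice is busy. SynthVoice is a hypothetical stand-in for one instance of synthjuan:

```python
# Rough sketch of a fixed 16-voice pool, analogous to running 16
# instances of a synth patch. SynthVoice is a hypothetical stand-in
# for one instance of synthjuan.

class SynthVoice:
    def __init__(self):
        self.note = None                  # None = the voice is free

    def note_on(self, note):
        self.note = note                  # start sounding this note

    def note_off(self):
        self.note = None                  # silence and free the voice


class PolySynth:
    def __init__(self, n_voices=16):
        self.voices = [SynthVoice() for _ in range(n_voices)]

    def note_on(self, note):
        # Find a free voice; if all 16 are busy, drop the note
        # rather than overloading the audio engine.
        for voice in self.voices:
            if voice.note is None:
                voice.note_on(note)
                return

    def note_off(self, note):
        for voice in self.voices:
            if voice.note == note:
                voice.note_off()
                return
```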


Stuck Notes

Some triggered MIDI notes would not receive a note-off message, continuing to play as a constant tone even as new notes were triggered. This was probably due to the large number of notes being triggered. After hunting through the Max library, I finally found the solution in the flush object, which sends a note-off message to every MIDI note that has been triggered. I integrated it on a timed switch (metro 2000), so that it would 'flush' every 2 seconds:
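In plain code, the metro + flush combination amounts to something like this Python sketch (send_note_off is a hypothetical stand-in for the actual MIDI output):

```python
import time

active_notes = set()                      # notes currently sounding

def send_note_off(note):
    """Hypothetical stand-in for the real MIDI output."""
    print(f"note off: {note}")

def flush():
    """Equivalent of Max's flush object: note-off for everything."""
    for note in list(active_notes):
        send_note_off(note)
    active_notes.clear()

# Equivalent of [metro 2000] driving [flush]: fire every 2 seconds,
# so no note can stay stuck for longer than that.
while True:
    time.sleep(2.0)
    flush()
```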


Quick Fix Filter!

I used a low-pass filter to remove the audible 'clicks' that would occasionally occur when the intervals between triggered notes were very short. Naturally, the LPF softened the final sound output by reducing its overall brightness, as well as reducing the audible distortion that would occur when multiples of the same synth note were triggered concurrently:
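As a point of reference, the smoothing a low-pass filter provides can be sketched in Python as a simple one-pole design (the actual patch used one of Max's filter objects, and the cutoff value below is purely illustrative):

```python
import math

def one_pole_lowpass(samples, cutoff_hz=2000.0, sample_rate=44100):
    """Very simple one-pole low-pass filter. By attenuating high
    frequencies it rounds off the sharp edges - the 'clicks' - that
    come from notes starting and stopping abruptly, at the cost of
    some overall brightness."""
    # Standard one-pole coefficient derived from the cutoff frequency
    a = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y = 0.0
    out = []
    for x in samples:
        y = (1.0 - a) * x + a * y         # each sample leans on the last
        out.append(y)
    return out
```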


Preventative Measures

One design request made by ZY was to find a solution to the potential 'problem' of the audience shining light right into the camera lens from a close distance. Doing this would prevent other people within the space from playing, since the whole image would be over-exposed. Prototype One had the solution! I integrated the jit.3m object (plus a few operations) to calculate the total RGB value at regular intervals:
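In text form, that brightness check amounts to something like the following Python sketch (numpy stands in for Jitter here, and the threshold value is illustrative rather than the one actually used):

```python
import numpy as np

BRIGHTNESS_THRESHOLD = 0.8                # illustrative value

def total_rgb(frame):
    """Mean brightness of an RGB frame (floats 0-1) - the kind of
    figure jit.3m's mean output reports for a Jitter matrix."""
    return float(np.mean(frame))

def overexposed(frame):
    # True when someone shines a light straight into the lens and
    # washes out the whole image.
    return total_rgb(frame) > BRIGHTNESS_THRESHOLD
```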


Next, the past object (mentioned previously) was used to produce a soft crescendo of pink noise whenever the RGB value exceeded a certain threshold.
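The logic of that trigger, sketched in Python (the threshold and ramp values are illustrative, not the ones in the patch):

```python
class CrescendoGate:
    """Sketch of the threshold trigger: when brightness rises past
    the threshold, ramp the pink noise in gently; when it falls back
    below, ramp it out again."""

    def __init__(self, threshold=0.8, ramp=0.02):
        self.threshold = threshold
        self.ramp = ramp                  # gain change per frame
        self.gain = 0.0

    def update(self, brightness):
        target = 1.0 if brightness > self.threshold else 0.0
        # Move the gain a little towards the target - a soft
        # crescendo rather than an abrupt switch.
        if self.gain < target:
            self.gain = min(target, self.gain + self.ramp)
        else:
            self.gain = max(target, self.gain - self.ramp)
        return self.gain                  # scale the noise by this
```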


Variations

To give more structure to the experience of sound within the piece, I decided to integrate variations in tempo, note scale and delay effects, determined by randomly generated numbers at regular timed intervals (see / listen to the video above):
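As a rough Python sketch, the variation mechanism amounts to re-rolling a handful of parameters at each interval (the pools of values below are illustrative, not those used in the patch):

```python
import random

# Illustrative pools of values - the ones in the patch may differ.
TEMPI_BPM = [80, 100, 120, 140]
SCALES = {
    "minor pentatonic": [0, 3, 5, 7, 10],
    "major pentatonic": [0, 2, 4, 7, 9],
    "whole tone":       [0, 2, 4, 6, 8, 10],
}
DELAY_TIMES_MS = [125, 250, 375, 500]

def new_variation():
    """At each timed interval, roll fresh random values for tempo,
    scale and delay - the changing 'sections' of the piece."""
    return {
        "tempo": random.choice(TEMPI_BPM),
        "scale": random.choice(list(SCALES)),
        "delay_ms": random.choice(DELAY_TIMES_MS),
    }
```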


The resulting composition - although somewhat simplistic in its treatment of materials - aspires towards being somewhere between generative and interactive music. Though these variations are programmed, there are still unique moments to be heard within the piece, since each audible state depends on both how the user plays with it and the random numbers produced at each interval.
 
Conclusion + Future Development

I feel this project was a success, especially given the amount of time available to execute the work. I very much enjoyed working with Oz Collective, although ideally it would have been better to work more closely together - in the physical sense. Hopefully we can do this if / when we work together again!

It was certainly a challenge to delve deeper into Max and incrementally expand my vocabulary. As with any language (programming or otherwise), each step towards fluency steadily increases one's ability to generate ideas and find solutions to the problems encountered.


This was a back-up solution for 'stuck notes' - producing 1s and 0s to turn signal gates on / off for each triggered synth note.

Although the overall feeling from this project is a positive one, there are still some things that could be improved in future versions. On the sonic side, I would certainly look at going deeper into the 'variation' aspect of the work, thinking about how to make dynamic changes dependent on user input, as opposed to randomly generated numbers. I would also like to broaden the sound palette so that instead of solely using staccato notes at varying tempos & pitches, there would also be variation in note durations (perhaps in relation to how long the visual feedback lasts) and more harmonic development (considering chord progressions, etc.). It would also be worth considering how the RGB data could be used independently to form a tighter relationship between sound and image (i.e. each colour could have a corresponding sound palette).

Peripheral light from the road (white / yellow-ish blobs on either side of the blue lattice)

In terms of the space, one problem encountered was that the road parallel to the installation produced light that was continually picked up by the camera. This meant the 'canvas' was never blank and, likewise, there were always a few audible tones present, even before any user input.

During the testing phase (within the space itself), it would have been good to measure the approximate height and width of the camera's view at different depths within the space. This would have provided a more accurate way to demarcate the space for the audience, and likewise precise limits for the y values, as determined by the floor and/or the height reachable by the audience.
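For what it's worth, given the camera's field of view those dimensions could also be estimated rather than measured - a quick Python sketch (the FOV values below are made up; the real ones depend on the camera and lens used):

```python
import math

def view_size(depth_m, h_fov_deg=60.0, v_fov_deg=45.0):
    """Approximate width and height of the camera's view at a given
    distance from the lens, derived from its field of view."""
    width = 2.0 * depth_m * math.tan(math.radians(h_fov_deg) / 2.0)
    height = 2.0 * depth_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return width, height

# e.g. at 4 metres from the lens:
w, h = view_size(4.0)
print(f"view at 4 m: {w:.1f} m wide x {h:.1f} m high")
```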

It's all a learning process!

*Installation Documentation Video to follow*
