Friday, February 10, 2006
Only the mystically inclined would argue for an absolute correlation between sound and vision. There are those, such as the Russian composer Alexander Scriabin, who believed in a strict parallel between musical pitch and colour, with C being red, D yellow and so on. However, in searching for new dialogues between picture and audio it soon becomes apparent that there are any number of ways of creating correspondence.
Moving images have been accompanied by music and sound almost from the start of cinema, and the history of visual counterpoints to music is as old as that of dance. A lot of recent work in max/msp/jitter has focussed on gestural movement as a means of controlling synthesis. By using motion detection, for example, the movement of a performer's arm can trigger a sequence of notes or treat a sound in some way. Whilst this is all driven by a laudable attempt to bring some real-time interactivity to the performance of electronic music, such performances can often look like amateur choreography, or amount to little more than unwieldy instrument interfaces.
The motion detection abilities of max/msp/jitter can, however, also be used to interact with movie files, so that motion within or between frames can trigger notes or samples. I have been experimenting with a proto-app to do this (codename Instant Kitten, after the Matching Mole song of the same name), and it is interesting how, with straightforward instrumentation, the results resemble those of a jazz/improv pianist. Here then is an example, or study. Some images of the Elephant and Castle were sourced from the net and cut together in an impromptu fashion. The resulting mov file was then fed into Instant Kitten, which supplied the "musical" accompaniment. To see some Instant Elephant look here
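For the curious, the general idea can be sketched outside of max/msp/jitter. The Python below (numpy standing in for Jitter's matrix operations) shows frame-to-frame motion detection driving note choices; the function names, thresholds and note mapping are purely illustrative and are not Instant Kitten's actual logic:

```python
import numpy as np

def frame_motion(prev, curr):
    """Mean absolute pixel difference between two greyscale frames (0-255)."""
    return np.abs(curr.astype(int) - prev.astype(int)).mean()

def motion_to_note(amount, low=48, high=84, max_motion=64.0):
    """Map a motion amount onto a MIDI note number in [low, high]."""
    scaled = min(amount / max_motion, 1.0)
    return int(low + scaled * (high - low))

def accompany(frames, threshold=2.0):
    """Emit a note for every frame whose motion exceeds the threshold."""
    notes = []
    for prev, curr in zip(frames, frames[1:]):
        m = frame_motion(prev, curr)
        if m > threshold:
            notes.append(motion_to_note(m))
    return notes

# Toy example: two identical frames (no motion), then a brighter one.
still = np.zeros((8, 8), dtype=np.uint8)
moved = np.full((8, 8), 32, dtype=np.uint8)
print(accompany([still, still, moved]))  # → [66]
```

A real patch would of course pull frames from the movie file and send the notes to a synth in real time; the point here is only that busier frames push the "pianist" higher up the keyboard.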