Live Algorithms Concert

A concert held as part of the Live Algorithms workshop at AISB 2014

Council Chambers, Deptford Town Hall, April 2nd, 18.45-19.45

Paul Hession and Alex McLean. Code Duo.

Hession/McLean is a free improvising duo that confounds the usual classifications of musical genre. Hession has worked mostly within the jazz and free improvisation scenes and is now exploring live electronics; McLean has developed an improvised approach to electronic music, largely within dance-oriented genres, via live coding. In this duo, Hession's drum set is extended using both analogue and digital technology, while McLean works as a live coder, writing software to generate music during the performance. This performance provides context for the collaboration, focussing on the development of Hession's practice and his associated research programme; in particular, the role of live algorithm techniques in providing a surrogate playing partner, combined with analogue technology inspired by the pioneering work of percussionist Tony Oxley. The focus is on the musical considerations at play, and on how live algorithms may sit within a mélange of physical, analogue and digital technology, and within gestural/instrumental and symbolic/linguistic approaches to improvisation.

Ryo Ikeshiro. Construction in Kneading

Construction in Kneading is a live “audiovisualisation” in which the same data and process generate both the sound and the moving image in real time, without either one following the other, as is the case in most VJ practice and visualisations.

The generative system behind the work is the Mandelbox fractal, one of several recent multi-dimensional fractals inspired by the famous Mandelbrot set. The recursion formula on which this escape-time fractal is based is similar to that of the so-called Baker's Map, and resembles the action of kneading dough in bread making. Through this relatively simple process, carried out in three dimensions and controlled through real-time manipulation of its variables, complex patterns arise from which all of the audio and visuals are generated. The work is implemented in the programming environment Max/MSP/Jitter.

The simultaneous use of audio and visuals serves a didactic purpose in allowing the processes behind the work to be perceived. However, many of the various changes occurring in a performance can be triggered by either the human performer or the generative system, so it is often difficult to determine which of the two is responsible for each “decision”. In effect, the two conceptually fuse into a combined cyborg-like or machinic entity.

Richard Hoadley. Quantum² Canticorum

Quantum² Canticorum is the latest in a sequence of musical compositions in which dance and music interact using body-tracking technologies and bespoke sensing devices. Expressive movement is converted into data that trigger and modulate expressive algorithms, which generate, in real time, both audio material and music notation that is performed live.

Thanks to AISB for sponsorship, and to Tom Mudd for sound engineering.
