
Algorave at Sound Thought

I shared an algorave spot at Sound Thought with Claire Quigley, which was streamed as part of the TOPLAP 15th anniversary stream. You can see her set here and mine is here, although I have to say I'm not that happy with how my performance turned out on the day. The rehearsal was better:

As you can see, combining my livecoding work with my passion for table tennis! This was, in fact, a new technical discovery made just the day before: it is possible to use Atom to livecode simultaneously in Hydra and SuperCollider, using a plugin for SC. I'm not sure this is intended behaviour, but in practice, if you have Hydra running in Atom and then switch to a page of SC code, the Hydra visuals carry on running behind that code.

Amongst other things, Claire is a colleague of mine at the Royal Conservatoire of Scotland, where she teaches coding to the music education students. We're hoping to do more work together in the future.

Callout for Scottish algo-ravers 16-2-2019

I'm looking for music and/or visual artists working with live code who are interested in joining me for an improvised algorave as part of Sound Thought 2019.

‘Livecoding’ is a practice where creative artists who work with computer code perform live, often producing music and/or visuals, with the audience typically able to watch the evolution of the code on a projected screen. ‘Algorave’ is a subgenre where the aim is to produce beat-driven music and/or visuals for dancing.

Examples of the kind of software we're talking about include:

Music: for example SuperCollider or TidalCycles

Visuals: for example Hydra

More about algorave at https://algorave.com/ and livecoding at https://toplap.org/

Drop me an email at js.vanderwalt@rcs.ac.uk if you're interested!

ICLC 2019 Madrid

Some reflections on the International Conference on Livecoding 2019 in Madrid.

The play-and-tell workshop that I helped put together with Evan Raskob and Renick Bell was, as intended, a low-key and informal way for people to share their individual practices in livecode. Of particular interest to me was Dimitris Kyriakoudis showing how he uses heavily customised keyboard shortcuts in Emacs as a way to be completely fluent when performing: as he put it, 'typing should not get in the way of livecoding performance'. There were also, to my mind, some very interesting links between his Timelines system – 'all music is a function of time' – and Neil C Smith's 'AMEN $ Mother Function' performance, which worked by chopping up a wavetable as a function of time.
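
The 'function of time' idea translates readily into SuperCollider terms. Here is a minimal sketch of the wavetable-chopping notion (my own illustration, not Smith's actual system; the sample path "amen.wav" and the SynthDef name \slice are made up for the example):

(
b = Buffer.read(s, "amen.wav");  // hypothetical break sample
SynthDef(\slice, { |out = 0, buf, start = 0, dur = 0.25, amp = 0.8|
    var sig = PlayBuf.ar(1, buf, BufRateScale.kr(buf), startPos: start);
    var env = EnvGen.kr(Env.linen(0.005, dur, 0.01), doneAction: 2);
    Out.ar(out, sig * env * amp ! 2);
}).add;
)

// which slice to play is simply a function of the beat number
(
Pbind(
    \instrument, \slice,
    \buf, b,
    \idx, Pseries(0, 1, inf) % 8,
    \start, Pkey(\idx) * (b.numFrames div: 8),
    \dur, 0.25
).play;
)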

As well as that session, I also had input to a paper entitled 'Towards Improving Collaboration Between Visualists and Musicians at Algoraves', co-authored by – deep breath – Zoyander Street, Alejandro Albornoz, Renick Bell, Guy John, Olivia Jack, Shelly Knotts, Alex McLean, Neil C Smith, Atsushi Tadokoro, J. S. van der Walt and Gabriel Rea Velasco. The creation of this paper was itself an interesting process, beginning with a conversation in Sheffield and continuing with us writing the paper collaboratively in a shared online space. Guy presented the paper; you can see that here.

A stand-out performance for me was Maia Koenig's 'RRayen', performed on some sort of hand-held games console. Great energy! I can't seem to find a video of her performing at ICLC, but here she is doing the piece elsewhere.

Of the many new livecoding systems presented, I was rather taken by Balázs Kovács' slightly bonkers Makkeróni 'web-based audio operating system', which is something like an online shared bash shell that plays music.

Also very interesting were Alejandro Franco and Diego Villaseñor presenting their Nanc-in-a-Can Canon Generator. The cultural background was fascinating, with an intention to reclaim Nancarrow as a Mexican composer, as explained in the talk.

I'd never been to Madrid before, but found it an easy place to be: dry, cold, comfortable and easy to get around. ICLC 2020 is to be in Limerick, fairly local for me, so I'll be looking to present or perform there as well.

werk stadig

Here is the piece I contributed to the Sounding Nature project on Cities and Memory:

https://clyp.it/btdilbxd

It is a reworking of an audio file called ‘093 SOUTH AFRICA savannah polyrhythms’. As someone who spent part of their childhood in South Africa, I find the bird sounds in the source recording very familiar: most particularly the distinctive monotonous call of Streptopelia capicola, the Ring-necked Dove or, as I used to call it, the Cape Turtle Dove, the name given in the edition of Roberts’ Birds of Southern Africa that I owned at the time. In the current edition of Roberts the call is transliterated as ‘work harder’, but in the older volume it is given in Afrikaans as ‘werk stadig’, which, given the slightly harsher sound of that language, actually works rather better.

I always thought ‘werk stadig’ meant ‘work steadily’, but it seems a more accurate translation would be ‘work slowly’. Either way: for several years now I have been working steadily, or slowly, through a process of learning the SuperCollider programming language. This composition is to some extent a study in that language: yet another attempt to use livecoding approaches as a means to develop a fixed piece. New ideas in this work include FFT as a means of cleaning up the original recording, and the use of a Routine to script JITLib objects in time.
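
In outline the two ideas look something like this (a minimal sketch only: the file name "savannah.wav", the threshold value and the Ndef name \clean are placeholders, not the actual piece):

(
b = Buffer.read(s, "savannah.wav");  // stand-in for the source recording
Ndef(\clean, {
    var sig = PlayBuf.ar(1, b, BufRateScale.kr(b), loop: 1);
    var chain = FFT(LocalBuf(2048), sig);
    chain = PV_MagAbove(chain, 10);  // keep only bins above a magnitude threshold
    IFFT(chain) ! 2
});
)

// a Routine scripting JITLib objects in time: fade the processed layer in, wait, then fade it out
(
Routine({
    Ndef(\clean).fadeTime = 8;
    Ndef(\clean).play;
    32.wait;
    Ndef(\clean).stop;
}).play;
)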

Interview at Sheffield Algorave

A short interview that Reverb Magazine did with me at the Sheffield Algorave 01/09/2018 – talking about combining livecoding, gamelan samples, and trumpet playing.

Raving the netbook again

Once again happily proving to myself how possible it is to work with open-source software on basic hardware. Just upgraded to Ubuntu Studio 18.04 on a refurb 11" Dell Inspiron netbook, and built SuperCollider 3.9.3 from source. Here's an algorave-ish test track made using this setup:

https://clyp.it/5d3lo4na

Some new code idioms:

Plazy({Pseq((0..15).scramble,4)}).repeat(inf)

is easier to type than

Pn(Plazy({Pseq((0..15).scramble,4)}))

and similarly

Pseq([2,6,4,7],inf).stutter(32)

instead of

Pstutter(32, Pseq([2,6,4,7],inf))

also

Pseq((0..15).scramble,inf).clump(3)
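
(The last one has no longhand equivalent listed above, but .clump(3) is likewise shorthand for wrapping the pattern in Pclump.) To give a sense of how one of these idioms might sit inside a pattern, here is a minimal made-up example, not the actual track:

(
Pbind(
    \instrument, \default,
    \degree, Plazy({ Pseq((0..15).scramble, 4) }).repeat(inf),
    \dur, 0.125
).play;
)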

Livecoding Erraid

On a number of occasions I have used sounds collected at a particular location as a coherent set of resources for a livecoded set. For the last week I've been on retreat with the community on the Isle of Erraid, which has been a welcome break from the city!

One of the features of the island is the 'observatory'. This is a circular tin structure, about two metres across by three high: a restored remnant from the construction of the Dubh Artach lighthouse, which took place there between 1867 and 1872.

The sound world inside this unusual structure is distinctive. I took some recordings (available on freesound.org, or they will be once they finish uploading) that I am going to use in a livecoded SuperCollider improvisation this Monday, during one of the 'Sonic Nights' series at the Royal Conservatoire of Scotland, where staff and students diffuse new electroacoustic works on a multi-channel sound system. If it seems practical, I may stream the performance as well.

Not algorave

I'm now interested in taking the SuperCollider livecoding techniques that I've developed in the context of algorave and applying them to the creation of fixed media sound works. Here is one, using some prepared piano samples that Dr Kurt James Werner has been kind enough to put online.

pylon-country.mp3

It's not perfect: there is still a strong element of improvisation in this way of working, and there are places in this track where, on listening back, I wish I had performed differently. A compromise, perhaps, between the raw and the cooked.

Livecode improvisation with Anne-Liis Poll

As part of the team that organised the third METRIC Improvisation Intensive at the Royal Conservatoire of Scotland, I did not have as much time as I might have liked to improvise myself. I was pleased, however, to be joined for an impromptu livecoded session by Anne-Liis Poll, Professor of Improvisation at the Estonian Academy of Music and Theatre:

This did not quite turn out the way I had intended! In recent work I have been looking for a way to respond in code to live human improvisations; this session turned instead into more of an algorave-ish groove built up from mechanical trumpet sounds, over which Anne-Liis worked with her voice. Even so, it was quite successful. I hope to do more playing with other people along these lines.

Livecoding again

Back at the livecoding again. A couple of weeks ago I ran a quite successful workshop for the students on the Interactive Composition module at the Royal Conservatoire of Scotland. Coming up: a couple of things. In March there is going to be another long-form online algorave, to which I'll be contributing a half-hour set on Friday 16th at 1330 GMT. In April the METRIC Intensive III at the RCS sees staff and students converge on Glasgow for a week of improvisation: as well as leading some gamelan improvisation, I expect to be SuperColliding there too.

Below, a more-or-less unedited trial run of some new stuff tonight: specifically, a collection of samples made purely from mechanical sounds of my trumpet, close-miked: springs, valve noise, slide pops and so forth.