That syncing feeling

To work up sketches for ‘Perang Gagal’ for ICLC in Limerick, I wanted to use Logic to compose demos of the material for the live players, which I could then improvise with in SuperCollider. This posed the problem of how to sync pulse and tempo between the two programs, which proved annoyingly difficult to accomplish!

Ideally, I would have liked to set the tempo in SuperCollider for Logic to follow. A straightforward way to do this would have been for SC to send MIDI clock and have Logic follow but, annoyingly, Logic does not support slaving to MIDI clock.

According to the Logic help, it should be possible to sync to an audio click from Logic. I couldn’t get this to work, and nobody on the Logic Users Group seemed to be able to help either.

The eventual solution was less than perfect: I used Logic to send MIDI clock, and had SuperCollider slave to that, using the MIDISyncClock class from H. James Harkins’ ddwMIDI quark. Not elegant, but it got the job done.
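For anyone trying the same thing, the setup looks roughly like this. A minimal sketch, assuming the ddwMIDI quark is installed and Logic is sending MIDI clock on a bus SuperCollider can see (such as the IAC bus); the pattern itself is just an illustrative placeholder:

```supercollider
(
// Minimal sketch: make SuperCollider follow incoming MIDI clock from Logic.
// Assumes the ddwMIDI quark is installed and Logic is sending MIDI clock.
MIDIClient.init;
MIDIIn.connectAll;
MIDISyncClock.init;  // from H. James Harkins' ddwMIDI quark

// Any pattern played on MIDISyncClock now follows Logic's tempo.
// This Pbind is only a placeholder so you can hear the sync working.
Pbind(
    \degree, Pseq([0, 2, 4, 7], inf),
    \dur, 0.5
).play(MIDISyncClock);
)
```

Because MIDISyncClock behaves like any other clock, existing patterns can be moved onto it without rewriting them.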

What did work very well indeed was the recently released BlackHole tool for passing audio between Mac applications. I’d definitely recommend it as a replacement for Soundflower!

Livecoding gamelan

The work that I will be taking to ICLC 2020 in Limerick is entitled ‘Perang Gagal: a Series of Inconclusive Battles’, and is a collaboration with Professor Mel Mercier at the Irish World Academy of Music and Dance. I will be livecoding in SuperCollider as part of a small gamelan ensemble led by Mel. Here’s the demo video I submitted to the conference call:

I had thought that the eventual piece would be straightforward to devise, but it is proving trickier than expected. There are several limitations. The first is that we won’t have access to a full gamelan for the conference. The second is that I had hoped to visit Limerick to work with the players in advance, but that has not proved possible: instead, I am going to send sketches of the material I am working on to Mel, and we will put the piece together during the conference.

The third limitation is around pulse. I want this piece to be rhythmic, but I do not feel confident about trying to get SuperCollider to follow the tempo and pulse of a live ensemble of gamelan musicians. Consequently, I am having to devise material where SuperCollider establishes some sort of groove that the live players will follow.

So far I have four potential sections for the piece. As ever, I am reworking existing materials. ‘fibblesticks’ and ‘Adrift & Afloat’ are ‘counting pieces’ that employ numerical frameworks to allow performers to play together in time, while leaving pitch indeterminate: or rather, when working with the gamelan, projecting the entire complement of available notes, pelog in this case.

I have on a number of occasions performed a sort of quasi-Javanese gamelan texture in SuperCollider, using samples of the Spirit of Hope instruments here in Glasgow. For Limerick, I have reworked this by adding a balungan part for the live players.

The fourth section for the piece is new, and is based around a couple of musical ideas that occurred to me in a dream and that were still in my head on awakening:

Many of my musical ideas originate in this way!

Livecoding brass

As 2019 draws to a close, I’m spending some time getting ready for the International Conference on Livecoding in February in Limerick. I put in two proposals. The first of these was to be called The ‘All-Pressure No-Method’ System, and would have involved me working with four live brass players. I say ‘would have’: this has had to be abandoned, as we were not able to fund the travel and accommodation for the players.

The central idea, however, is one I’d like to return to. Inspired particularly by the work of Kate Sicchio in livecoding dancers, the intention was to livecode the brass players by means of a repertoire of typed and projected instructions. Here’s a demo video of the concept:

Radio Automata

Here’s the collaboration that Bill Whitmer and I did for Radiophrenia:

Algorave at Radiophrenia

This week I’m taking my algorave work in a new direction. Bill Whitmer and I are going to be presenting a half-hour show called ‘Radio Automata Live in the Studio’ as part of Radiophrenia, a temporary art radio station broadcasting from the CCA in Glasgow.

The idea for the show is: if the last remaining creative decisions in broadcast radio were entirely automatic, would anyone notice? Bill has been experimenting with algorithmically generated text and chatbots for the spoken part of the show. For my part, I’m going to be creating cut-up mashups using the slicing techniques I’ve been developing in SuperCollider.
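The slicing idea can be sketched roughly as follows. This is a toy example rather than the actual show code: it uses a sample bundled with SuperCollider, and the slice count and rhythm are arbitrary:

```supercollider
(
// Toy cut-up sketch: chop a buffer into eight slices and fire them
// off in random order on a steady pulse. Not the actual show code.
b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");

SynthDef(\slice, { |out = 0, bufnum, start = 0, dur = 0.25, amp = 0.5|
    var sig = PlayBuf.ar(1, bufnum, BufRateScale.kr(bufnum), 1, start);
    var env = EnvGen.kr(Env.linen(0.01, dur, 0.01), doneAction: 2);
    Out.ar(out, (sig * env * amp) ! 2);
}).add;
)

// Run this once the buffer has loaded
(
Pbind(
    \instrument, \slice,
    \bufnum, b,
    \start, Pwhite(0, 7, inf) * Pfunc { b.numFrames div: 8 },
    \dur, 0.25
).play;
)
```

Swapping the buffer, slice count, or the `\start` pattern is enough to turn this from a steady stutter into a proper mashup.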

In previous work along these lines, I’ve always used source material that was either explicitly open source, or, at least, grey-area material that I was unlikely to be sued for, like old TV themes and MIDI module demo songs.

In this show, for the first time I’m taking a so-sue-me approach, using… well, I won’t give the game away, but some *very* well known material indeed, arising from ‘suggestions’ ‘made’ by the bots Bill has been working with. In early experiments this is sounding very interesting indeed. Watch this space, or rather, listen to this wavelength!

  • 23rd May 2019
  • 1100-1130
  • 87.9FM across Glasgow

Livecoding with Robert Henderson

Improviser Núria Andorrà visited Glasgow in March to teach on the International Collaboration in Contemporary Improvisation module at the Royal Conservatoire of Scotland. My colleague Una McGlone took the opportunity to organise a gig for her at Hairdressers, in collaboration with a number of Glasgow improvisers.

I did a short set alongside trumpet player Robert Henderson, whom I have known for many years: in fact, I know him from the period around twenty years ago when I myself was active as a gigging trumpet player! For this performance, I used a bank of sounds that I created using purely mechanical sounds from a trumpet: the metal, valves, valve slides and so forth. It’s always slightly problematic livecoding alongside an actual analogue musician, as it is not easy to respond particularly rapidly to another player. However, Robert and I enjoyed playing together and managed to create some satisfying musical gestures.

No recording made, unfortunately!

nuria gig flyer.png

Algorave at Sound Thought

I shared an algorave spot at Sound Thought with Claire Quigley, which was streamed as part of the TOPLAP 15th anniversary stream. You can see her set here and mine here, although I have to say I’m not that happy with how my performance turned out on the day. The rehearsal was better:

As you can see, I’m combining my livecoding work with my passion for table tennis! And, in fact, this was a new technical discovery made just the day before: it is possible to use Atom to livecode simultaneously in Hydra and SuperCollider, using a plugin for SC. I’m not sure this is intended behaviour, but in practice, if you have Hydra running in Atom and then switch to a page of SC code, the Hydra visuals keep running behind that code.

Amongst other things, Claire is a colleague of mine at the Royal Conservatoire of Scotland, where she teaches coding to the music education students. We’re hoping to do more work together in the future.

Callout for Scottish algo-ravers 16-2-2019

I’m looking for music and/or visual artists working with live code who are interested in joining me for an improvised algorave as part of Sound Thought 2019.

‘Livecoding’ is a practice where creative artists who work with computer code perform live, often producing music and/or visuals, with the audience typically able to watch the evolution of the code on a projected screen. ‘Algorave’ is a subgenre where the aim is to produce beat-driven music and/or visuals for dancing.

Examples of the kind of software we’re talking about include:

Music:
Sonic Pi http://sonic-pi.net/
TidalCycles http://tidalcycles.org/
FoxDot http://foxdot.org/
SuperCollider https://supercollider.github.io/
Troop https://github.com/Qirky/Troop
Estuary http://intramuros.mcmaster.ca:8002/
Gibber http://gibber.cc/
ChucK http://chuck.cs.princeton.edu/
Pd https://puredata.info/
Overtone http://overtone.github.io/

Visuals:
Hydra https://github.com/ojack/hydra
LiveCodeLab https://livecodelab.net/
VEDA https://veda.gl/
PraxisLIVE https://www.praxislive.org/
Processing https://processing.org/
fluxus http://www.pawfal.org/fluxus/
The Force https://videodromm.com/The_Force/
LiveCoder http://livecoder.net/

More about algorave https://algorave.com/ and livecoding https://toplap.org/

Drop me an email at js.vanderwalt@rcs.ac.uk if interested!

ICLC 2019 Madrid

Some reflections on the International Conference on Livecoding 2019 in Madrid.

The play-and-tell workshop that I helped put together with Evan Raskob and Renick Bell was, as intended, a low-key and informal way for people to share their individual practices in livecode. Of particular interest to me was Dimitris Kyriakoudis showing how he uses heavily customised keyboard shortcuts in emacs as a way to be completely fluent when performing: as he put it, ‘typing should not get in the way of livecoding performance’. There were also some very interesting links, to my mind, between his Timelines system – ‘all music is a function of time’ – and Neil C Smith’s ‘AMEN $ Mother Function’ performance, which worked by chopping up a wavetable as a function of time.

As well as that session, I also had input to a paper entitled ‘Towards Improving Collaboration Between Visualists and Musicians at Algoraves’, co-authored by – deep breath – Zoyander Street, Alejandro Albornoz, Renick Bell, Guy John, Olivia Jack, Shelly Knotts, Alex McLean, Neil C Smith, Atsushi Tadokoro, J. S. van der Walt and Gabriel Rea Velasco. The creation of this paper was itself an interesting process, beginning with a conversation in Sheffield and continuing with us writing the paper collaboratively in a shared online space. Guy presented the paper; you can see that here.

A stand-out performance for me was Maia Koenig’s ‘RRayen’, played on some sort of hand-held games console. Great energy! I can’t seem to find a video of her performing at ICLC, but here she is doing the piece elsewhere.

Of the many new livecoding systems presented, I was rather taken by Balázs Kovács’s slightly bonkers Makkeróni ‘web-based audio operating system’, kind of like an online shared bash shell that plays music.

Also very interesting were Alejandro Franco and Diego Villaseñor, presenting their Nanc-in-a-Can Canon Generator. The cultural background was fascinating, with an intention to reclaim Nancarrow as a Mexican composer, as explained in the talk.

I’ve never been to Madrid before, but found it was an easy place to be: dry, cold, comfortable and easy to get around. ICLC 2020 is to be in Limerick, fairly local for me, so I’ll be looking to present or perform there as well.

werk stadig

Here is the piece I contributed to the Sounding Nature project on Cities and Memory:

https://clyp.it/btdilbxd

It is a reworking of an audio file called ‘093 SOUTH AFRICA savannah polyrhythms’. As someone who spent part of their childhood in South Africa, the bird sounds in the source recording are very familiar to me: most particularly the distinctive monotonous call of Streptopelia capicola, the Ring-necked Dove, or, as I used to call it, the Cape Turtle Dove, the name given in the edition of Roberts’ Birds of Southern Africa that I owned at the time. In the current edition of Roberts the call is transliterated as ‘work harder’, but in the older volume it is given in Afrikaans as ‘werk stadig’ which, given the slightly harsher sound of that language, actually works rather better.

I always thought ‘werk stadig’ meant ‘work steadily’, but it seems a more accurate translation would be ‘work slowly’. Either way: for several years now I have been working steadily, or slowly, through a process of learning the SuperCollider programming language. This composition is to some extent a study in that language: yet another attempt to use livecoding approaches as a means to develop a fixed piece. New ideas in this work include FFT as a means of cleaning up the original recording, and the use of a Routine to script JITLib objects in time.
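The Routine-scripting idea can be sketched like this: a minimal example of the general technique, with placeholder sounds and timings rather than the code of the piece itself:

```supercollider
(
// Minimal sketch: a Routine scripting JITLib NodeProxies in time.
// The sounds and timings are placeholders, not from the piece.
p = ProxySpace.push(s);

Routine {
    ~drone = { SinOsc.ar([110, 111], 0, 0.1) };
    ~drone.play(fadeTime: 4);
    8.wait;                 // let the drone establish itself

    ~pulse = { Impulse.ar(4) * 0.2 };
    ~pulse.play(fadeTime: 1);
    8.wait;

    ~drone.end(8);          // fade both proxies out over 8 seconds
    ~pulse.end(8);
}.play;
)
```

The appeal of this approach is that the proxies remain live while the Routine runs, so you can still intervene by hand at any point, which keeps the livecoding feel even in a fixed piece.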