Sound from SuperCollider to OBS on macOS

The principle here is that sound is sent from SuperCollider to the virtual audio driver BlackHole, and from BlackHole to OBS. You then listen to the sound through OBS.

Install https://github.com/ExistentialAudio/BlackHole


Set the Mac sound output to ‘BlackHole 16ch’ using the widget in the menu bar.

Boot or reboot the server in SuperCollider. This is a key step: the SC server will not pick up a change of audio device without a fresh boot or a reboot.
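If you prefer to do this from code, you can instead point the server at BlackHole explicitly before booting. A minimal sketch, assuming the device name matches what macOS reports (check Audio MIDI Setup if in doubt):

    // Optional alternative: select BlackHole as the output device in SC itself,
    // then (re)boot the server so the change takes effect.
    Server.default.options.outDevice = "BlackHole 16ch";
    Server.default.reboot;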

After boot, check that the SC post window says ‘“BlackHole 16ch” Output Device’.

(If you’re using Tidal, you’ll need to run SuperDirt.start again at this point.)

In OBS, click on ‘Settings’ and go to the ‘Audio’ pane. Set ‘Mic/Auxiliary Audio’ to ‘BlackHole 16ch’ and ‘Monitoring Device’ to ‘Built-in Output’.


In OBS, click the gear icon in the Audio Mixer panel and select ‘Advanced Audio Properties’.


Under ‘Audio Monitoring’, select ‘Monitor and Output’.


Play a sound in SuperCollider. In OBS, turn up the Mic/Aux slider in the Audio Mixer panel, and you should see its level meter picking up the sound.
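Any sound will do for this test; for example, a quiet stereo test tone:

    // quick test tone: 440 Hz sine, duplicated to both channels
    { SinOsc.ar(440, 0, 0.2).dup }.play;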

Sound should now be playing from SC into OBS and out to your speakers/headphones. Adjust the volume as usual from the widget in the menu bar.

That syncing feeling

In order to be able to work up sketches for ‘Perang Gagal’ at ICLC in Limerick, I wanted to use Logic to compose demos of the material for the live players that I could then improvise with in SuperCollider. This poses the problem of how to sync pulse and tempo between the two programs, which proved annoyingly difficult to accomplish!

Ideally, I would have liked to set the tempo in SuperCollider for Logic to follow. A straightforward way to do this would have been for SC to send MIDI clock and have Logic follow but, annoyingly, Logic does not support slaving to MIDI clock.
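For reference, sending MIDI clock from SC is simple enough. Here is a rough sketch, assuming MIDIOut’s realtime messages (midiClock, start, stop) and a virtual destination such as the IAC Driver; the device and port names are placeholders. The catch is purely that Logic will not follow it.

    (
    // rough sketch: send MIDI clock at 120 BPM to a virtual MIDI port
    MIDIClient.init;
    ~out = MIDIOut.newByName("IAC Driver", "Bus 1");

    ~clock = Routine {
        ~out.start;            // MIDI start message
        loop {
            ~out.midiClock;    // one tick; 24 ticks per quarter note
            (1/24).wait;
        };
    }.play(TempoClock(120/60));
    )

    // ~clock.stop; ~out.stop;   // to stop sending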

According to the Logic help, it should be possible to sync Logic to an audio click. I couldn’t get this to work, and nobody on the Logic Users Group seemed to be able to help either.

The eventual solution was less than perfect. I used Logic to send MIDI clock, and had SuperCollider slave to that. This involved using the MIDISyncClock extension from H. James Harkins’s ddwMIDI quark. Not ideal, but it got the job done.

What did work very well indeed was the recently released BlackHole tool for passing audio between Mac applications. I’d definitely recommend this as a replacement for Soundflower!

Livecoding gamelan

The work that I will be taking to ICLC 2020 in Limerick is entitled ‘Perang Gagal: a Series of Inconclusive Battles’, and is a collaboration with Professor Mel Mercier at the Irish World Academy of Music and Dance. I will be livecoding in SuperCollider as part of a small gamelan ensemble led by Mel. Here’s the demo video I submitted to the conference call:

I had thought that the eventual piece would be straightforward to devise, but it is proving trickier than expected. There are a few limitations. The first is that we won’t have access to a full gamelan for the conference. The second is that I had hoped to visit Limerick to work with the players in advance, but that has not proved possible: instead, I am going to send sketches of the material I am working on to Mel, and we will put the piece together during the conference.

The third limitation is around pulse. I want this piece to be rhythmic, but I do not feel confident about trying to get SuperCollider to follow the tempo and pulse of a live ensemble of gamelan musicians. Consequently, I am having to devise material where SuperCollider establishes some sort of groove that the live players will follow.
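To give a flavour of what I mean, here is a hypothetical sketch of a fixed groove in pelog, not the actual material of the piece:

    (
    // hypothetical groove for the live players to follow;
    // degrees and durations are illustrative only
    Pbind(
        \scale, Scale.pelog,    // built-in pelog scale
        \degree, Pseq([0, 2, 1, 4, 3, 2, 5, 4], inf),
        \dur, Pseq([0.25, 0.25, 0.5, 0.25, 0.25, 0.25, 0.25, 1], inf),
        \amp, 0.2
    ).play(TempoClock(90/60));
    )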

So far I have four potential sections for the piece. As ever, I am reworking existing materials. ‘fibblesticks’ and ‘Adrift & Afloat’ are ‘counting pieces’ that employ numerical frameworks to allow performers to play together in time while leaving pitch indeterminate: or rather, when working with the gamelan, projecting the entire complement of available notes, pelog in this case.

I have on a number of occasions performed a sort of quasi-Javanese gamelan texture in SuperCollider, using samples of the Spirit of Hope instruments here in Glasgow. For Limerick, I have reworked this by adding a balungan part for the live players.

The fourth section of the piece is new, and is based around a couple of musical ideas that occurred to me in a dream and were still in my head on awakening.

Many of my musical ideas originate in this way!

Livecoding brass

As 2019 draws to a close, I’m spending some time getting ready for the International Conference on Live Coding in February in Limerick. I put in two proposals. The first of these was to be called The ‘All-Pressure No-Method’ System, and would have involved me working with four live brass players. I say ‘would have’: this has had to be abandoned, as we were not able to fund the travel and accommodation for the players.

The central idea, however, is one I’d like to return to. Inspired particularly by the work of Kate Sicchio in livecoding dancers, the intention was to livecode the brass players by means of a repertoire of typed and projected instructions. Here’s a demo video of the concept:

 

Algorave at Sound Thought

I shared an algorave spot at Sound Thought with Claire Quigley, which was streamed as part of the TOPLAP 15th anniversary stream. You can see her set here and mine is here, although I have to say I’m not that happy with the way my performance turned out on the day. The rehearsal was better:

As you can see, I’m combining my livecoding work with my passion for table tennis! And, in fact, this relied on a technical discovery made just the day before: it is possible to use Atom to livecode simultaneously in Hydra and SuperCollider, using a plugin for SC. I’m not sure this is intended behaviour, but in practice, if you have Hydra running in Atom and then switch to a page of SC code, the Hydra visuals carry on running behind that code.

Amongst other things, Claire is a colleague of mine at the Royal Conservatoire of Scotland, where she teaches coding to the music education students. We’re hoping to do more work together in the future.

Callout for Scottish algo-ravers 16-2-2019

I’m looking for music and/or visual artists working with live code who are interested in joining me for an improvised algorave as part of Sound Thought 2019.

‘Livecoding’ is a practice where creative artists who work with computer code perform live, often producing music and/or visuals, with the audience typically able to watch the evolution of the code on a projected screen. ‘Algorave’ is a subgenre where the aim is to produce beat-driven music and/or visuals for dancing.

Examples of the kind of software we’re talking about include:

Music:
Sonic Pi http://sonic-pi.net/
TidalCycles http://tidalcycles.org/
FoxDot http://foxdot.org/
SuperCollider https://supercollider.github.io/
Troop https://github.com/Qirky/Troop
Estuary http://intramuros.mcmaster.ca:8002/
Gibber http://gibber.cc/
ChucK http://chuck.cs.princeton.edu/
Pd https://puredata.info/
Overtone http://overtone.github.io/

Visuals:
Hydra https://github.com/ojack/hydra
LiveCodeLab https://livecodelab.net/
VEDA https://veda.gl/
PraxisLIVE https://www.praxislive.org/
Processing https://processing.org/
fluxus http://www.pawfal.org/fluxus/
The Force https://videodromm.com/The_Force/
LiveCoder http://livecoder.net/

More about algorave https://algorave.com/ and livecoding https://toplap.org/

Drop me an email at js.vanderwalt@rcs.ac.uk if you’re interested!

werk stadig

Here is the piece I contributed to the Sounding Nature project on Cities and Memory:

https://clyp.it/btdilbxd

It is a reworking of an audio file called ‘093 SOUTH AFRICA savannah polyrhythms’. As someone who spent part of my childhood in South Africa, I find the bird sounds in the source recording very familiar: most particularly the distinctive monotonous call of Streptopelia capicola, the Ring-necked Dove, or, as I used to call it, the Cape Turtle Dove, the name given in the edition of Roberts’ Birds of Southern Africa that I owned at the time. In the current edition of Roberts the call is transliterated as ‘work harder’, but in the older volume it is given in Afrikaans as ‘werk stadig’, which, given the slightly harsher sound of that language, actually works rather better.

I always thought ‘werk stadig’ meant ‘work steadily’ but it seems a more accurate translation would be ‘work slowly’. Whichever way: for several years now I have been working steadily, or slowly, through a process of learning the SuperCollider programming language. This composition is to some extent a study in that language: yet another attempt to use livecoding approaches as a means to develop a fixed piece. New ideas in this work include FFT as a means of cleaning up the original recording, and the use of a Routine to script JITLib objects in time.
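For the curious, those two ideas look something like this in outline. This is a hedged sketch, not the code of the piece: PV_MagAbove stands in for the FFT clean-up, and the buffer path is a placeholder for the source recording.

    (
    // sketch only: FFT 'clean-up' via PV_MagAbove, plus a Routine scripting a JITLib Ndef
    ~buf = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");

    Ndef(\field, {
        var sig, chain;
        sig = PlayBuf.ar(1, ~buf, BufRateScale.kr(~buf), loop: 1);
        chain = FFT(LocalBuf(2048), sig);
        chain = PV_MagAbove(chain, 3);    // keep only the louder bins: a crude de-noise
        (IFFT(chain) * 0.5).dup
    }).play;

    Routine {
        8.wait;
        Ndef(\field).fadeTime = 4;
        // crossfade to a darker, filtered version of the same source
        Ndef(\field, {
            var sig = PlayBuf.ar(1, ~buf, BufRateScale.kr(~buf), loop: 1);
            (LPF.ar(sig, 600) * 0.5).dup
        });
    }.play;
    )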