First public livecode

Last night I stumbled into my first public outing of some livecoding I’ve been working on in SuperCollider. The context was an improvisation night called In Tandem run by Bruce Wallace at the Academy of Music and Sound in Glasgow. I hadn’t intended to play, as I really don’t feel I’m ready yet, but I had my laptop and cables with me, they had a projector, so…!

I was jamming along with three other people, on bass, guitar and analog synth. It all went by in a blur, but everyone there seemed to think what I was doing was ok – mostly making grooves out of a random collection of drum samples, but running some algorithmically chosen chords as well.

The code is below: this is my screen exactly as I left it at the end of the night, mistakes and all. Although Toplap say ‘show us your screens’, they don’t say ‘show us your code’, but… it seems the right thing to do.

// the end!
// they still going
// if you're curious, this is SuperCollider
// music programming language
// writing code live is called, er, livecoding
// i'm just starting out
"/Users/jsimon/Music/SuperCollider Recordings/hitzamples/".openOS;

Pdef.all.clear; // clear things out
~hitzpath="/Users/jsimon/Music/SuperCollider Recordings/hitzamples/"; // a folder of samples
~hbufs = (~hitzpath ++ "*.aiff").pathMatch.collect({ |i|, i) }); // samples into an array of buffers
t = TempoClock(140/60).permanent_(true); // tempo 140 bpm
u = TempoClock(140/60 * 2/3).permanent_(true); // tempo 140 bpm * 2/3
SynthDef(\bf, { |out = 0, buf = 0, amp = 0.1, freq = 261.6255653006|
var sig =, buf,*freq/60.midicps, doneAction: 2);, sig * amp);
}).add; // this whole chunk defines a synth patch that plays samples

// Pdef.all.clear;
//"/Users/jsimon/Music/SuperCollider Recordings/".openOS;
// t.sync(140/60, 16);

(instrument: \bf, buf: ~hbufs.choose).play; // play an event using the synth called \bf
// pick a random sample from the array
(instrument: \bf, buf: ~z).play;
~z = ~hbufs.choose;

t.sync(140/60, 32); // gradual tempo changes possible
u.sync(140/60 * 2/3, 16);
v.sync(140/60 * 5/3, 16);

Pbindef(\x, \instrument, \bf, \buf, ~hbufs.choose).play(t).quant_(4);
Pbindef(\y, \instrument, \bf, \buf, ~hbufs.choose).play(u).quant_(4);
Pbindef(\z, \instrument, \bf, \buf, ~hbufs.choose).play(v).quant_(4);
Pbindef(\z, \instrument, \bf, \buf, ~hbufs.choose).play(v).quant_(4);
~g1 = {~hbufs.choose}!16; // choose sixteen samples at random = one bar full
~g2 = {~hbufs.choose}!16;
Pbindef(\x, \buf, Pseq(~g1, inf)); // play those sixteen samples chosen
Pbindef(\x, \buf, Pseq(~g2, inf)); // different sixteen, so, a variation.
Pbindef(\x, \dur, 0.5);
~d1 = {2.rand/10}!16;
~d2 = {2.0.rand/10}!16;
Pbindef(\x, \amp, Pseq(~d1, inf));
Pbindef(\x, \amp, 0.2);
Pbindef(\x, \note, Prand((-36..0), inf));
Pbindef(\x, \note, Pseq({(-24..0).choose}!16, inf)); // pitch each sample down by random amount
Pbindef(\x, \note, nil);

// hmm. blx diminished, that's just C major!
// was using \degree instead of \note; better, sounds a bit more like Messiaen now :)
~c = {var x = Scale.diminished2.degrees.scramble.keep(4).sort; x.insert(1,(x.removeAt(1)-12))};
// hexMajor thing also works beautifully now!
~c = {var x = Scale.hexMajor6.degrees.scramble.keep(4).sort; x.insert(1,(x.removeAt(1)-12))};
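If you're following along at home, it's worth evaluating that function a few times to see what it does. This check wasn't part of the performance; the exact notes are random on every call:

```supercollider
// each call to ~c returns a fresh four-note chord from the scale, sorted,
// with the second-lowest degree dropped an octave below
~c.value; // e.g. something like [0, -10, 7, 9] (random each time)
~c.value; // different again
```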

// next question might be changing \note, \dur and \root in a coordinated way
Pbindef(\k, \note, Pstutter(Prand([5,7,9,11,13]*2, inf), Pfunc(~c)),
\dur, 0.5,
\root, 3, // best option for feeling of key change
\amp, Prand((2..5)/70,inf)
);
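On that "coordinated way" question in the comment above: one hedged sketch, not something I did on the night. Pbind accepts an array of keys paired with a pattern of arrays, so \root and \dur can be drawn together from a single Pstutter and will always change at the same moment. The name \k2 and the particular value choices here are invented for illustration:

```supercollider
// a sketch, not from the performance: draw \root and \dur as pairs from one
// stuttered pattern, so key and pulse change together
Pbindef(\k2,
    [\root, \dur], Pstutter(Prand([5, 7, 9, 11, 13] * 2, inf),
        Prand([[3, 0.5], [5, 0.5], [0, 0.25]], inf)),
    \note, Pfunc(~c),
    \amp, Prand((2..5)/70, inf)
).play(t).quant_(4);
```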

Recursive synthesis


I’ve been working for a while with an improvising setup that uses what is sometimes jokingly called ‘recursive synthesis’ – that is, plugging an effect unit back into itself and experimenting with the no-input feedback sounds.

Today I’ve had some success with the next step in developing this system. I’ve written a SuperCollider patch that allows me to gate and pitchshift the feedback sounds, so that I can begin to find a way to play them musically using a keyboard. Here’s the very first run at playing this system: careful, some rather loud and uncontrolled noises here!


In the picture, you can see the work-in-progress setup. There’s a cheapo DigiTech RP55 guitar pedal feeding back through a small mixing desk. I’m using a swell pedal to control some of the parameters of the various fx from the DigiTech, particularly sweeping the pitch of the ‘whammy’ and ‘pitch shift’ functions, set up in various presets. The mixing desk is not entirely necessary, but the tone controls are useful to have in the feedback loop.

Below is the code for the SuperCollider patch. As always, my thanks to the developers of this software, and all the help received from the community on the mailing list.
~velbus = Bus.control.set(1); // not using yet
SynthDef(\pitchin, { | midinote = 60, gate = 1, amp = 0.1 |
var in, sig, env, ratio, trans, shift, sel;
trans = midinote - 60;
in =[0, 1]);
ratio = trans.midiratio;
shift =, pitchRatio: ratio, timeDispersion: 0.1);
sel = trans.abs > 0;
sig =, [in, shift]);
env =, gate, doneAction: 2);, sig * env * amp * 37); // compensate for quiet
}).add;
~notes = Array.newClear(128); // array one slot per MIDI note
~on = MIDIFunc.noteOn({ |veloc, num, chan, src|
~notes[num] = Synth(\pitchin, [\midinote, num, \amp, veloc * 0.2/127]);
});
~off = MIDIFunc.noteOff({ |veloc, num, chan, src|
~notes[num].release;
});
~cc1 ={ |val|
}, 1, 0); // cc1 on channel 0 = midi channel 1, not using yet

Patchin’ at the Hague

I’ve been at the Koninklijk Conservatorium in Den Haag for the last couple of days, working with a group of academics from all over Europe on the METRIC project – ‘Modernizing European Higher Music Education through Improvisation’. (If anyone can tell me in which language that acronym works, I’d love to know!)

While I’m here, I’ve been amusing myself with some work on a simple SuperCollider patch to pitchshift live audio, that I intend to use as part of my own improvisational practice. Particularly tickled to be able to say in future that I ‘worked on this patch at the Institute of Sonology in The Hague’, which is strictly speaking… more or less true!

Being here also brings back fond memories of another piece of software associated with this institution, ACToolbox, which I used quite extensively in the past, though not so much now.

Let’s hear it for koncon!

Here’s some test code, part of a larger project:

// Institute of Sonology, Den Haag, 18 Mar 2016 :)
SynthDef(\pitchin, { | trans = 0, gate = 1 |
var sig, env, ratio;
sig =[0, 1]);
ratio = trans.midiratio;
sig =, pitchRatio: ratio, timeDispersion: 0.1);
env =, gate, doneAction: 2);, sig * env);
}).add;
x = Synth(\pitchin);
x.set(\trans, 12); // can sound a bit comby, try timeDispersion about half grainsize
x.set(\trans, -12);
x.set(\trans, 7);
x.set(\trans, -7);;

Gamelan Composers Forum

Yesterday I was in London with the Gamelan Composers Forum, an occasional collective brought together by Aris Daryono and Rob Campion to share approaches to writing for the gamelan. This particular event in the series was entitled ‘The Intimate Gamelan’, and featured three pieces written for a gadhon-size ensemble, performing in a private house in South London.

There were three new pieces. ‘Sang Empu’ (‘The Maestro’) was Aris’ piece scored for cello and ciblon, the cello part being taken by Alice Jones. Aris explains that this piece draws both on the cello tradition within kroncong, where the cello imitates the kendhang, and on the ciblon drumming for palaran.

Rob contributed another piece involving Alice on the cello, ‘New Moon’. This featured Rob playing the slenthem with two beaters, like a giant gender: I rather like imagining the cello in this piece as a giant rebab!

My own piece was a one-off assemblage of two of my ‘openings’, flexible bits of material that I reuse in different ways for different performing occasions. The first part was based on an idea entitled ‘fibblestix’, an accelerating series of percussive clicks that eventually prompt a response from the gender. The second part of ‘Two Openings’ – as the overall piece is called – is ‘Adrift & Afloat’ adapted for one pelog and one slendro gender. I’m very happy indeed with the way Rob and Aris approached this piece, very thoroughly prepared, and sounding very convincing in the intimate setting of a private house.

A fascinating evening: I’m glad I made the effort to get down to London to take part.

Working on new music for gamelan

Over the winter break, I’ve been working on some new music for gamelan. Following on from Naga Mas’ rather spectacular success with our gamelan-in-outer-space piece Gamelan Untethered, we have plans to do something along the same lines, but this time on an underwater theme. Below is a midi demo of something I’m working on for the group: a sort of sampak/kebyar fusion piece, pulling together some of the livelier ideas from the Javanese traditions with a Balinese-inspired melody.

This may prove a little tricky to play!

Live Coding and the Body

I’ve just spent a very stimulating weekend, combining an algorave in Brighton with the Live Coding and the Body symposium at Sussex University. A packed weekend: I didn’t keep notes of exactly who and what and when, so this is a rather chaotic reflection!

The algorave was in a clubby sort of room above a pub. Code from each performer’s laptop was projected on a black wall behind them, while some livecoded visuals in IBNIZ were projected on an adjacent wall. A good atmosphere, although I found the gig sometimes painfully loud, especially with those very forward and raw sounds which you sometimes get from coded synths.

The first set I caught was Chris Kiefer, working in SuperCollider (the old, pre-IDE version, by the look of it). This was excellent: very approachable, danceable grooves, with a nice line in slowly changing fx. After that, Aneurin Barker Snook did a set in Tidal. It’s the first time I’ve seen this Haskell-based tool: to me it seemed quite firmly targeted at algorave, rather than being a more general approach to coding music performance. A very harsh set, if I remember correctly, but still musical and interesting.

From this point on, my memory is a little blurred. A very abstract and glitchy set from Norah Lorway: not sure what tool she was using as she did not have time to plug into the projector. Also Charlie Roberts playing his own web-based Gibber language, with a nice line in livecoding the ascii text. Oh, and… Mico Rex – bonkers! In a good way: cardboard boxes, SC code, and exuberant singing.

The symposium programme turned out to be a little different from the published one: once again, this will be a bit disorganised as I didn’t keep accurate notes. A fascinating range of disciplines and interests in the room, including people working in music, fine art, architecture, dance, analog synthesis, social anthropology, and more. Some of the highlights I recall:

  • Marije Baalman gave a very, very cool performance of her piece ‘Wezen’, a tour-de-force of live SuperCollider code combined with sensor gloves.
  • Renick Bell’s presentation sparked off quite a lively debate, bringing to the table his reinterpretation of the pragmatic philosophy of John Dewey. Some deep questioning of his philosophical approaches ensued, led by David Berry in particular.
  • David Ogborn gave us two treats. The first was a multi-layered coding piece, with Pbindefs in SuperCollider being generated by another piece of SC code and programmatically ‘typed’ onto the screen, leaving David free to add in a solo part on electric guitar. On day two, he talked us through a stimulating range of people, ideas and references as a follow-up.
  • Andrew R Brown talked about a spectrum of approaches to controlling algorithms, from text to gestural controllers and points in between: also a nice demonstration of his own rig, using Impromptu combined with a simple bank of sliders and switches on an iPad mini.
  • Alex McLean shared some of his work in collaboration with dancer Kate Sicchio.
  • Andrew Duff gave a lively and fun talk/demo on the modular synth scene, liberally illustrated by humorous gifs from the Muff Wiggler forums. A lot of gear porn here, really, including mention of the Pismo: ok, that’s it, I’m hanging on to mine, one day I’ll get it up and running again!
  • Cecile Chevalier drew some comparisons between livecoding and the work of Jackson Pollock, although as we unpicked that, perhaps a slightly problematic parallel…

Overall, a very pleasant long weekend in the very pleasant seaside town of Brighton. The Sussex University campus was very comfortable, as was the wonderful Dutch bike I hired from Amsterdammers.

Other things I learned about:

Final braindump: below is an archive of all the #lcatb tweets (some of the attributions here are probably wrong – I collected this list in a rather hacky way…):

Great response – mod syths @yaxu suggests that we don’t react against the digital but instead software, which is ‘generally awful’ #lcatb
berrydm 12:18pm via Twitter for iPhone

Andrew Duff showing his own personal “Orac” and modular synths at Live Coding and the Body #lcatb
thormagnusson Jul 05, 2:06pm via iOS

Here @allthesixes666 presents a graph that should resonate with many computer musicians’ experience #lcatb
Retweeted by _sshaw and 4 others
thormagnusson 11:52am via iOS

#lcatb Afternoon kicking-off with @berrydm & Nick Rothwell coding as collaborative fist class discipline
Retweeted by musicSussex and 5 others
thormagnusson 12:08pm via iOS

Live coding the modular synths #lcatb
Retweeted by musicSussex and 3 others
thormagnusson 12:08pm via iOS

@ad_tpim Does ‘lowres’ modular syth demo at #lcatb symposium @SussexUni
thormagnusson 11:52am via iOS

Andrew Duff’s great modular synth setup at Live Coding and the Body seminar. Cool! #lcatb
pauwly 12:01pm via Twitter for Android

Andrew Duff @allthesixes666 blames his modular addiction on close friend @RussellHaswell #lcatb
thormagnusson 11:52am via iOS

#lcatb Pismo, very nice, I’ve already got one :)
pauwly 11:43am via Twitter for Android

Great talk by David Ogborn #Livecoding #acousmonium #spacialisation #emodiment #lcatb
tedthetrumpet 11:35am via Hootsuite

David Ogborn offering some leads to follow #lcatb
renick 11:21am via Twitter for Android

@berrydm wish an AI were running traffic on the way to #lcatb today…
Show Conversation
pauwly 11:00am via Twitter for Android

Open questions about control through gesticulation and code in live performance #lcatb
lukechurch 10:55am via TweetDeck

Watching Andrew Brown merge gestural control via an iPad and a text programming language at #lcatb, we should add this to #DynamoBIM soon.
tedthetrumpet 10:51am via Hootsuite

#lcatb Andrew R Brown reflecting on the balance between textual and gestural control
berrydm 10:44am via Twitter for iPhone

Day 2- Live Coding and the Body on a Sunday (!?!) with Andrew Brown (gestural controllers & code description) #lcatb
TanyaMGoncalves 10:43am via Twitter for Android

#LCATB – sad that today is my last day of confrences #untilnexttime
tedthetrumpet 9:28am via Hootsuite

Day 2 of Livecoding and the Body starts soon: great to be here, meeting interesting people, and learning stuff #lcatb
TanyaMGoncalves Jul 05, 3:47pm via Twitter for Android

@d0kt0r0 performing live #LCATB
Retweeted by d0kt0r0 and 1 others
yaxu 12:31am via Twitter Web Client

Super discussions at the live coding at the body symposium, check out the #LCATB hashtag.
Retweeted by supersg559
yaxu 12:31am via Twitter Web Client

Soviet synthesizer bridged occultism and electronic music — will be of interest to #lcatb…
thormagnusson Jul 05, 2:06pm via iOS

#lcatb Afternoon kicking-off with @berrydm & Nick Rothwell coding as collaborative fist class discipline
Retweeted by giovamusic and 5 others
TanyaMGoncalves Jul 05, 5:10pm via Twitter for Android

@yaxu @yaxuprime Alex McLean #LCATB http://
shelly_knotts Jul 05, 5:10pm via Twitter Web Client

nice to have a few controversial comments – referring to live coding as a gendered practice is fairly problematic #lcatb
tedthetrumpet Jul 05, 5:04pm via Hootsuite

#lcatb interesting discussion around livecoding compared with the work of Jackson Pollock
cappelnord Jul 05, 4:58pm via Twitter Web Client

#lcatb people: Thanks for tweeting!
kunstwissen Jul 05, 4:36pm via Twitter Web Client

Micorex ask: why should a live coder be on stage? after presenting some of their practice: hitting buttons, singing, dancing… moving #lcatb
Retweeted by sicchio and 1 others
danny_bright Jul 05, 4:28pm via Twitter for iPhone

Awesome – cardboard boxes full of wires and joysticks – surely the best controllers #lcatb
Retweeted by livecodenet
kunstwissen Jul 05, 4:40pm via Twitter Web Client

@kunstwissen some responses:challenging context of performance, sharing experience, put emphasis on coding as a cultural practice #lcatb
Retweeted by livecodenet
kunstwissen Jul 05, 4:36pm via Twitter Web Client

@d0kt0r0 performs with a multi channel live coding demon which is //not evil #lcatb
Retweeted by tedthetrumpet and 1 others
kunstwissen Jul 05, 4:36pm via Twitter Web Client

A brilliant presentation by @cassieldotcom on code and choreography. #lcatb
Retweeted by livecodenet and 6 others
shelly_knotts Jul 05, 2:33pm via Twitter Web Client

next symposium: ‘Live Coding and Socks’? ;) #lcatb
Retweeted by livecodenet
pauwly Jul 05, 3:00pm via Twitter for Android

We’re flying now – forget Deleuze #lcatb
Retweeted by livecodenet
pauwly Jul 05, 2:39pm via Twitter for Android

Hester Reeve – Why might we choose to use the body live? The audience as witness and not consumer. #lcatb
Retweeted by livecodenet and 1 others
berrydm Jul 05, 3:46pm via Twitter for iPhone

Next Marije Baalman live coding a set #lcatb
Retweeted by livecodenet
TanyaMGoncalves Jul 05, 3:47pm via Twitter for Android

#lcatb David Ogborn performs – #guitar meets #SuperCollider #generative #code
Retweeted by livecodenet and 1 others
danny_bright Jul 05, 4:15pm via Twitter for iPhone

Lots of interesting people, talks, presentations and performances so far at #lcatb @musicSussex
sicchio Jul 05, 3:36pm via Twitter for iPhone

Performance #2 Marije Baalman – making the code sweat! #lcatb
Retweeted by livecodenet and 1 others
pauwly Jul 05, 4:09pm via Twitter for Android

@d0kt0r0 performs with a multi channel live coding demon which is //not evil #lcatb
Retweeted by shelly_knotts and 1 others
TanyaMGoncalves Jul 05, 3:47pm via Twitter for Android

@d0kt0r0 performing live #LCATB
berrydm Jul 05, 3:46pm via Twitter for iPhone

David Ogborn playing guitar & live coding set – interesting use of live generated code #lcatb
sicchio Jul 05, 3:36pm via Twitter for iPhone

Trying to keep up with Live Coding and the Body tweets #lcatb
tedthetrumpet Jul 05, 3:31pm via Hootsuite

#lcatb surrounded by all these coders, I should probably just *ask* someone why my Pbindef won’t .play until I .stop it :)
thormagnusson Jul 05, 2:06pm via iOS

Hester Reeve – Why might we choose to use the body live? The audience as witness and not consumer. #lcatb
Retweeted by berrydm and 1 others
pauwly Jul 05, 2:39pm via Twitter for Android

A brilliant presentation by @cassieldotcom on code and choreography. #lcatb
Retweeted by readywriting and 6 others
thormagnusson Jul 05, 2:06pm via iOS

A very provocative slide #lcatb
Retweeted by livecodenet
TanyaMGoncalves Jul 05, 1:49pm via Twitter for Android

Chats about #livecoding, I’m having a little too much fun right now. #LCATB
Retweeted by livecodenet
pauwly Jul 05, 1:52pm via Twitter for Android

Here is the programme for the conference Live Coding and the Body 2014 #lcatb
Retweeted by goldsmif and 2 others
tedthetrumpet Jul 05, 12:14pm via Hootsuite

#lcatb Renick Bell is trying to get the body *out* of performance (!?)
Retweeted by livecodenet
tedthetrumpet Jul 05, 12:14pm via Hootsuite

Now we have Renick Bell on pragmatic aesthetics ad live coding #lcatb
Retweeted by livecodenet
berrydm Jul 05, 11:00am via iOS

Heideggerian phenomenology might give some useful notions for conceptualisation of live coding practice see #lcatb
Retweeted by livecodenet
berrydm Jul 05, 11:54am via Twitter for iPhone

@livecodenet new discipline love coding? #lcatb
Retweeted by livecodenet
berrydm Jul 05, 11:48am via Twitter for iPhone

“@_TheTerminator_: @berrydm I am a cybernetic organism. Living tissue over a metal endoskeleton.” #lcatb
Show Conversation
kaoskorobase Jul 05, 11:45am via Twitter for iPad

I warned you.. @_TheTerminator_: @berrydm Skynet became self-aware at 2:14am Eastern time, August 29. #lcatb
Show Conversation
pauwly Jul 05, 11:38am via Twitter for Android

Now discussing livecoding blind & connecting machine directly to brain function..its a little after 11am ;D #lcatb
Retweeted by livecodenet
pauwly Jul 05, 11:38am via Twitter for Android

So @thormagnusson outlines his scary brain control live coding ideas. Skynet will soon be here via ixi lang #lcatb
berrydm Jul 05, 11:24am via Twitter for iPhone

Live coding and contextual computing – interesting possibilities for new frameworks for creativity and sonic arts #lcatb
livecodenet Jul 05, 11:22am via Twitter for Android

Marije Baalman providing analysis of love coding interaction loop. Wonders about live coding with eyes closed #lcatb
thormagnusson Jul 05, 11:01am via Twitter Web Client

A fantastic group of people have convened in the Creativity Zone at Sussex to discuss Live Coding and the Body #lcatb // @livecodenet
Retweeted by livecodenet and 6 others
berrydm Jul 05, 11:15am via Twitter for iPhone

How obduracy of the computer can be used to feedback electromagnetic waves into music & sound- Laptop Music (Reuss) – v. Cybernetic #lcatb
TanyaMGoncalves Jul 05, 11:06am via Twitter for Android

Thanks for prompting hashtag @berrydm ;D Here we go, going to be great! #lcatb
berrydm Jul 05, 11:03am via Twitter for iPhone

First speaker is Marije Baalman #lcatb
thormagnusson Jul 05, 11:01am via Twitter Web Client

“Michel [Waisvisz] was only interested in code if he could make it sweat” (Sally-Jane Norman) #lcatb
tedthetrumpet Jul 05, 11:02am via Hootsuite

#lcatb so far I’ve learned how to pronounce ‘NIME’ and ‘Gibber’
tedthetrumpet Jul 05, 11:01am via Hootsuite

Sally Jane Norman introduces the first panel #lcatb
Retweeted by thormagnusson
berrydm Jul 05, 10:58am via Twitter for iPhone

We have a hashtag for the Live Coding and the Body conference #lcatb @SussexUni
Retweeted by thormagnusson
berrydm Jul 05, 10:59am via Twitter for iPhone

About ‘Why Scotland, Why East Kilbride’

I’ve been calling myself a ‘composer’ for about twenty years now, starting from when I went back to music college as a trumpet player but got sidetracked into writing music instead of playing it. On my website you’ll find early pieces like ‘Rate-limiting Step’ for cello & harp, or ‘Studies of Nucleate Boiling in Thin Liquid Layers (Part 1)’ for chamber ensemble. This is what people usually think of when I say I’m a ‘composer’: someone who writes contemporary score-based music for classically trained instrumentalists to play.

Yes, I’ve done a lot of that: but over the years it’s come to seem less and less interesting to me. For a number of years now, I’ve been working in a way which is much more akin to devised theatre, or even performance art. For a start, I hardly ever write the music down now, or if I do, it’s just fragments, starting points. By convention, a contemporary classical composer is expected to put everything in the score: the written score is the piece of music. You could post it off to an ensemble anywhere from Albuquerque to Zhytomyr, turn up a week later, and boom, there’s the piece being played.

Or at least, that’s the theory. But, take the very first work in my catalog, ‘The Knowing of Things Together’. It’s ‘scored’ for didjeridu, three flutes, three trombones, conga drums & two wine bottles (red). Already, there’s a problem: that’s not a standard ensemble! So, actually, no-one is ever going to play this piece, apart from the people I was working with back then. So, really, the music is what the musicians and I actually did on that occasion, rather than anything I might write down.

Fast forward. The piece that’s about to come to fruition, ‘Why Scotland, Why East Kilbride’, started life as a dream. This happens to me a lot: I’m a musician, after all, and I have dreams where I hear a piece of music. Quite often when I wake up I can remember at least some of it: I rush to write it down or even just sing it into my phone before it goes.

The piece I heard on this occasion – it was the morning of the 3 June 2012 – seemed to be for… a rock band? Plus a section of orchestral French horns?!?

Made some notes, forgot about it. Couple of days later, aimlessly surfing around as you do, I stumbled upon this 1972 public information film about East Kilbride. Suddenly a whole chunk of memory descended upon me. I remembered the two different times in my life when I lived in East Kilbride, particularly a period in the early 80s where I was busy dropping out of an ill-advised science degree at university, listening to a lot of Hawkwind, and teaching myself to play guitar. Suddenly, the dream made sense: I knew what I was listening to, I knew what it was all about.

And the Ted Edwards stuff? A misdirection, perhaps. Edward ‘Teddy’ Edwards is a fictional alter ego of mine. He’s the person I would have liked to have been if I’d been born forty years earlier: he’s Raymond Scott, he’s Daphne Oram, he’s my dad’s golfing buddy who owned a radio shop, he’s Erik Satie. Or maybe he’s a complete unknown: he’s Ziggy Elman, he’s the guy who first came up with the npn-pnp astable multivibrator circuit, he’s some guy who’s into birdwatching.

He’s a useful vehicle: someone I can have in my show, someone who might have been in East Kilbride in 1972, might have played in a rock band, might have had a day job as a chemist at the National Engineering Laboratory. Someone who might, after all, be a composer.

Kind of.


Electronics nostalgia

Very happy today. Been working on some electronics for the show. For me as a creator, this is a big part of what Why Scotland, Why East Kilbride is about: exploring my nostalgia for teenage evenings spent with a soldering iron, a cup of coffee, and a Hawkwind album.

Some pics below: trying to get my SN76477 prototyping station back up and running.

(Photos: SN76477 overview; the board; the front panel.)


Some of the publicity put out for the show has perhaps been unintentionally slightly misleading! I did have a dream of having a French horn section in the show, but that dream has been realised through… technical means :)

Instead of live players, I’ve invented the ‘Horn-a-Tron’. This is a Pd patch which plays back midi horn sounds alongside video clips of sixteen horn players – with thanks to Steve Park for the horn vids. I’m not going to put up a clip of this working just yet – that would spoil the effect. But here’s the patch:

Pd 'Horn-a-Tron' patch

Why Scotland, Why Strathclyde Uni?

A couple of interesting connections between the show and the University of Strathclyde. The first is – and I didn’t spot this till the other day – that it’s actually mentioned in the original movie. At around 16 minutes, during the segment where Bruce and Mark visit the National Engineering Laboratory, there’s this exchange:

NEL Employee: We’ve got about three hundred and forty engineering graduates and scientific workers here, so we can tackle all sorts of engineering problems.
Bruce: Is there any basic research?
NEL Employee: Well at the moment it’s about twenty percent, I suppose, it used to be more but we’re now tackling industrial problems. We have close links with Strathclyde, which is one of Britain’s leading technical universities.

The other connection is of course Dr Steven Ford, who, as well as being one of my bassists, is going to be doing the live chemistry. Steve’s day job is as a Research Fellow at Strathclyde’s Cancer Research UK Formulation Unit. For the show, he’s having fun trying to mock up some outdated (and dangerous!) liquid handling practices, as well as making some interesting chemical smells. As someone who nearly became a scientist myself – I started a BSc at Edinburgh – I’m delighted to be able to get this kind of work into the project.

(In fact, I almost went to Strathclyde myself… I remember going to their open day, and a lot of my friends ended up studying there. And of course, many, many years after that came the Invention Ensemble, which arose out of Strathclyde’s wonderful BA in Applied Music course… )