I posted an obscure photo last week showing the set-up for a lecture I gave the following morning. I’m interested to know whether anyone else has tried anything like this in music education, because it seems like a big technological uphill climb to get there (though “there” is quite geekishly cool). I’m thinking I might write at length and give this some context within the literature, if someone doesn’t just say “Humberstone, you idiot, that’s not innovative, that’s just a stupid way of doing something”.

The problem set was to take a mish-mash of devices, such as many of us have individually and in our classrooms, and see if they could be used to improvise, perform and compose. I decided to go with Ableton Live because I’d seen Tim Shiel’s amazing LiveSchool presentation about how he redeveloped Gotye’s last album for performing in Live, and I figured that even if I don’t have Tim’s chops (by a long way), at least I knew the software could do it. Here’s a (hyperlinked) list and some photos of the gear I had on hand:
- Ableton Push (Akai)
- Akai LPD8
- Arturia BeatStep
- M-Audio Axiom Pro 49
- Launchpad Mini (Novation)
- Steinberg UR22 audio interface
- D-Link 7 port USB Hub
- Apple Airport Extreme running a private network
- MacBook Pro running OS X 10.9.3 “Mavericks”, 16GB RAM, 1TB SSD, with Ableton Live Suite configured to see the Lemur Daemon and touchAble client as MIDI ins
- 15 iPads (a mix of iPad 2s and iPad Airs) connected to the wireless network, with the apps (Lemur and touchAble) installed
Obviously the hardware controllers are easy to set up: each appears in Live’s MIDI dialog in settings as a separate MIDI in, so you can easily assign each to a single track (and therefore set that track to one sound). You can arm or disarm each track, so students can improvise together while only some of them record new material.
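The one-controller-per-track routing is really just a lookup table. Here’s an illustrative sketch of that bookkeeping in Python — the instrument assignments are hypothetical examples, and this isn’t Live’s actual API, just a way of writing the plan down:

```python
# Illustrative sketch only: one hardware controller per Live MIDI track,
# each track holding a single instrument. Instrument names are made up.

controller_to_track = {
    "Ableton Push": {"track": 1, "instrument": "Drum Rack"},
    "Akai LPD8": {"track": 2, "instrument": "Percussion Kit"},
    "Arturia BeatStep": {"track": 3, "instrument": "Bass Synth"},
    "M-Audio Axiom Pro 49": {"track": 4, "instrument": "Grand Piano"},
    "Launchpad Mini": {"track": 5, "instrument": "Pad Synth"},
}

def track_for(device_name):
    """Return the track number a device's MIDI input is routed to."""
    return controller_to_track[device_name]["track"]
```

Writing this table out before class (even on paper) saves a lot of “whose sound is that?” confusion once everyone is playing at once.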
Getting the iPads to work is a little more fiddly. Once you’ve done the set-up it works, but I found that if they’d slept for a while or been reset there was a good chance that they’d lose their MIDI in in Live, or if you connected one to a virtual MIDI in that another was using, it would be “knocked off”. This means if students go mucking around with settings you’ll have a load of interruptions. It also means you’ll probably be setting up in class time, which you’d normally avoid. This is the problem I’d really like to solve. Anyway, if I haven’t put you off giving it a go, here are the steps:
Set up Lemur by:
- Downloading and installing the Lemur daemon (app) on your Mac
- Dragging the LiveControl document (this will have been copied to your Apps folder on your Mac) to the Lemur app on your iPad via iTunes on your Mac
- Going to settings in Lemur on your iPad and selecting the LiveControl settings from the Project List
- Tap More Settings and, under MIDI 0, select the same Daemon Output and Input number from your Mac. *Note:* do this methodically, and work out some way to keep track of which iPad has which MIDI number, such as putting a numbered sticker on the back of each iPad – so you know which iPad is which “instrument” in Live.
- Tap Done, then the Play pad
- Repeat the previous four steps (from dragging the LiveControl document onwards) for up to 8 iPads (because you have Daemon inputs 0 to 8).
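To keep the sticker-on-the-back bookkeeping honest, it helps to write the mapping down in one place. A throwaway sketch, with hypothetical numbers — the point is only that each iPad’s sticker should resolve to exactly one Daemon slot:

```python
# Hypothetical bookkeeping: physical sticker number -> Lemur Daemon MIDI number.
# Keeping this in one place makes it obvious which iPad is which "instrument"
# in Live when one drops off the network and needs reconnecting.

sticker_to_daemon = {1: 0, 2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 7}

def daemon_input(sticker_number):
    """Look up the Daemon Input/Output number for an iPad by its sticker."""
    if sticker_number not in sticker_to_daemon:
        raise KeyError(f"No Daemon slot recorded for iPad {sticker_number}")
    return sticker_to_daemon[sticker_number]
```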
Set up touchAble by:
- Downloading and installing the touchAble client on your Mac
- Running touchAble on your iPad, and tapping on your Mac client
- Tap on the session button twice to maximise it
- Repeat for any remaining iPads (which will share session control)
Set up Ableton Live by:
- Going to preferences and making sure that Live sees all of the various MIDI ins and that they’re all switched on under Track.
- Adding a MIDI track for each instrument – that means one for each hardware instrument and one for each of the 8 Lemur Daemon inputs (if you have 8 iPads running).
- Dragging a different Live instrument to each MIDI track, so each person is improvising with a different sound. Spend some time thinking about how the sounds will interact texturally, and choose sounds that work over a wide pitch range (high to low) so they’re not competing in the same frequency range. You can also pan etc. to create “space”.
- Test all of the inputs! Each one should play a separate sound. touchAble can also be set to play an instrument, but because it doesn’t have the multiple ins that Lemur has, I used it to control the whole session view, so some students were “DJing” the whole thing rather than performing, improvising or recording. We allocated a few tracks to each DJ.
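The “test all of the inputs” step is the one most worth being systematic about, given how easily the Lemur connections drop. Here’s a minimal sketch of that sanity check — the port names are illustrative and hard-coded (in practice you’d get the live list from your system, e.g. `mido.get_input_names()` if you use the mido library):

```python
# Sanity check sketch: compare the MIDI inputs we expect against what the
# system actually reports. All port names here are illustrative examples.

expected_inputs = {
    "Ableton Push", "Akai LPD8", "Arturia BeatStep",
    "M-Audio Axiom Pro 49", "Launchpad Mini",
} | {f"Daemon Input {n}" for n in range(8)}

def missing_inputs(seen_ports):
    """Return the expected ports that are absent, e.g. iPads that slept."""
    return sorted(expected_inputs - set(seen_ports))

# Example: two Lemur iPads have dropped off the network.
seen = expected_inputs - {"Daemon Input 3", "Daemon Input 6"}
print(missing_inputs(seen))  # ['Daemon Input 3', 'Daemon Input 6']
```

Running a check like this just before class starts tells you which iPads to wake and reconnect, rather than discovering a silent “instrument” mid-jam.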
Now you’re ready to jam! The process I used was first to show students how to do drum programming on the Push, using the duplicate feature to create a series of drum patterns, and then to add live drumming from the Push and other drum pads like the BeatStep and LPD8, quantizing as we went. So we now had 4 or 5 live percussionists either playing live or recording loops and adding to them as they went (I facilitated this in Live).
Having created a load of percussion sounds, we agreed on a key/mode for the Lemur and other pitched controllers (e.g. the MIDI keyboards), and went around the room, each student recording a series of ostinati in that mode one after another (again, I facilitated this in Live).
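For anyone unfamiliar with the term, “quantizing as we went” just means snapping recorded note times to the nearest grid division. Live does this for you, so this tiny sketch is only to show the idea, with times measured in beats:

```python
def quantize(beat_time, grid=0.25):
    """Snap a time in beats to the nearest grid division
    (grid=0.25 beats = 16th notes when a beat is a quarter note)."""
    return round(beat_time / grid) * grid

# A slightly late hit at beat 1.27 snaps back to 1.25 on a 16th-note grid.
print(quantize(1.27))  # 1.25
```

With beginners it’s worth quantizing to a coarse grid (8th notes, even quarters) so their loops lock together straight away, then loosening it as they get tighter.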
Next the DJs took over, running existing material and experimenting with it in different combinations, as well as arming and disarming tracks for additional live improvisation over the top (like conductors of an ensemble). To be honest, it wasn’t what I’d call amazing, but it was a great experiment, and I think with more sensitive use of the technology (and more bullet-proof MIDI-in from Lemur) something really unique could be achieved. Here’s a really short soundbite of what the students created….