Phil Durrant is familiar to most Reaktor users as Sowari on the NI messageboards, a helpful and knowledgeable presence there. He’s also a renowned experimental musician, having performed with, among others, Ticklish, Trio Sowari and MIMEO, the Music In Movement Electronic Orchestra – a collective that includes Christian Fennesz, Peter Rehberg and Keith Rowe. Phil is currently working on his PhD dissertation, which involves building virtual instruments for improvised laptop performance. I interviewed Phil by email shortly after Trio Sowari performed a series of shows in France. Phil not only answered my questions but shared screenshots of the Reaktor instruments he uses in live performance, with detailed explanations. Here’s what he had to say.
Peter Dines: Tell me about your PhD project. What is the guiding philosophy of the tools you’re creating in Reaktor and Max? What are the criteria for a successful instrument in this endeavour – e.g., to facilitate live performance, to offer novel avenues for composition, etc.?
Phil Durrant: My project conceives of the musician-and-instrument as a single entity, focusing on the potential for interaction between various musician-instrument entities in live improvised performances. My aim is to build virtual software instruments for performance, informed by analysis of practice among improvising musicians and theoretical explorations of body-machine interaction.
So what do I mean by that?
In terms of improvising with other musicians, I feel I need to really know my instrument inside out, to such an extent that it is part of me. I don’t see the musician and instrument as separate things.
I will be focusing on making instruments that are flexible and easy to manipulate, so that I can concentrate on how I (as a musician-instrument-entity) can interact (with other musician-instrument-entities) in improvised music performances.
In the past, I used a MIDI fader controller to change parameters in Reaktor. But for one important gig, I forgot to bring a vital part of that setup and couldn’t use it. Since then, I have concentrated on programming flexibility into my performance Reaktor Ensemble, where I use only qwerty keys and the trackpad as controllers and triggers. Because of this, I have become increasingly interested in how normal everyday physical interactions with a computer can be utilized to create an expressive virtual instrument.
I will also be exploring various concepts of flow (between musicians, and between musicians and instruments and also within the instrument itself), and of course concepts of the real and/or the virtual. Theorists such as Deleuze and Guattari and Brian Massumi (among others), will inform my research on these topics.
In practical terms, I will build 3 virtual instruments and share them with the digital community. The first will be built with Reaktor. The second instrument will use Max/MSP to increase the possibilities and potential of using (all of the) qwerty keys and the trackpad as the only controllers. The third will incorporate the use of widely available USB games controllers. For this instrument, I will also use Max/MSP to convert the data into usable information to control parameters in Reaktor. Throughout this process, I will release ‘lite versions’ and betas of the instruments to get feedback and comments from the wider digital community.
Peter: About the laptop as tactile interface – I know! It’s too bad that people often neglect what’s right in front of them. Laptop trackpads are especially delightful for use with an XY or mouse area module. But maybe the ubiquity of laptops – you have to have one even to run Reaktor in the first place – makes them less exciting as performance devices. They lack snob appeal.
Phil: Seven years ago it was really hip to play laptop at gigs; now it is really unhip. Part of my PhD is to prove that the laptop can be a really responsive and expressive instrument. It is all down to programming and practice. The trackpad is just so important, and so are the page keys for small movements.
Peter: What are some of your musical influences?
Phil: My influences are varied and reflect a diverse career: I have performed as an improvising violinist (sometimes with the use of hardware electronics); worked as an artist/producer/sound designer on the London techno/breakbeat scene; written and performed with Ticklish (an ‘Electronica’ audio/visual project); and I currently work with the post-reductionist group Trio Sowari and the international electronic orchestra MIMEO. So some of my influences include guitarist Derek Bailey, trumpeter Axel Dörner, Microstoria, early Stockhausen (especially Kontakte), and many of the concepts of John Cage. However, the biggest influence on my music has always come from the people that I work with.
Peter: What are you trying to accomplish musically when playing live? To convey a mood or state of mind, to mind meld with the other performers and improvise, to present previously composed and orchestrated material…?
Phil: I hate the old modernist ideal of having to express one’s own inner feelings to the audience. I agree with Deleuze and Guattari (arch post-modernists) when they say that art is about desire. By that they mean the simple wish to produce something that excites the person who is creating it. In terms of performances, I also want to create space so that other musicians can do their thing as part of a group sound, so that as a group there is no single hierarchical structure. I feel that we should all be equal no matter which instrument is being played. The material that I present in a performance can be quite old or quite new… but it has to work in the context of the performance, and has to be changed/morphed to fit that context.
Peter: That actually reminds me of the way a guitar player would approach a jam – there are certain standard moves, even personal signature moves, in the player’s vocabulary, and he adapts them to the situation. But in the case of an electronic musician, only part of the signature style resides in the player. And then you have to wonder how much comes from the technology. On the other hand, with a programmable system like Reaktor or Max, the technology takes on much of the flavor and intent of the builder / composer. You address some of this in the form of your Stockhausen in my System ensemble, where your instructions ask “…but who is really in control?”
Phil: For me, all instruments impose their own “technology” on the player, whether it is a violin or a laptop. In improv, many musicians have tried to impose their will on the technology. With a guitar, the obvious one is the use of FX. In addition, quite a few guitarists (Keith Rowe, Jim O’Rourke and Fred Frith among others) have put the guitar on a table (with loads of FX) to change the interaction possibilities, and therefore make different sounds with a hybrid instrument rather than guitar plus effects. For me, they are builders/improvisers/sound designers. In some ways I am following in that tradition. Before I bought my first PowerBook, I would do gigs with a table full of stuff including small samplers (Zoom, Yamaha SU10) and FX (Ensoniq DP/4, TC Electronic Fireworx, Sherman Filterbank). The laptop enables me to create a modular structure that can incorporate some of the systems that I developed with my hardware rig.
Re: Stockhausen in my System. I have actually used it at gigs a few times, and it is something that will probably influence my building in the future. However, new versions will definitely have the choice of whether to stay on a patch and/or randomise (the Mouse Area, with its left/right/double-click choices, is great for that). But the creation of this instrument (which, like the Steve Reich thing, was part of my MA) was partly about looking at my personal issues with randomisation. I tend not to like the idea of pressing a button and everything changing. The Stockhausen Ensemble just changes (with a fresh click) to a different patch that has been pre-programmed. My comment about “who is in control” was partly about these issues with randomisation, and also jokingly asking whether machines are controlling our lives. In addition, it had a little bit to do with musicians putting some unstable/dangerous elements into the equation. Feedback is an element that loads of musicians use, but you at least have a chance of controlling it. These are the sorts of randomising elements that interest me.
Peter: How does Reaktor factor into your playing and composition? Do you have one major ensemble or group of ensembles that you use consistently, or do you build new contraptions as part of the compositional process?
Phil: Reaktor has been, and continues to be, the main factor in my performances and in the creation of my sounds. I have recently purchased Max/MSP, which will take on a more prominent role in terms of how qwerty keys, trackpads and USB games controllers can be used to control parameters, but I still see these parameters being Reaktor ones.
For my ‘Free Improv’ gigs, I do have one major ensemble. A few years ago I used 3 or 4 ensembles with different characteristics. However, I realised I was not getting to know them well enough, so I decided to concentrate on just one ensemble. That particular ensemble has evolved over 7 or 8 years.
The current version consists of two (heavily modified) samplers that were part of the Reaktor 2 Essential Library. Plasma uses the Grain Resynth sampler and Triptonizer uses the Grain Pitchformer sampler. [note: unmodified versions of Plasma and Triptonizer are available on the Reaktor 5 install disk - look in the "legacy library" folder and try hacking them yourself - Peter D.] Both samplers loop grains within an audio sample, so you can “scratch” a sound file and loop just a tiny fragment. In addition, I have added effects to both of these samplers, as separate instruments. The effects include a pitch-shifter and a grain delay, plus filtering, bit reduction and ring modulation.
Here is a screenshot of my modified Plasma:
On top of the waveform is a Mouse Area. The red line shows the current position within the sample, and the blue line shows the grain size. At this position the size is very small, but if I scroll down so that the blue line sits at the bottom of the waveform, it becomes quite large, and as a result you can clearly hear a repeated part of the sample over one second long. To make things more interesting, Plasma has random jitters that cause fluctuations in the position within the sample and the size of the grain. To enhance this, I have integrated a modulation envelope (with just attack and release controls) that is triggered by a geiger oscillator. You can see the controls in the Geiger Env Jit macro. This adds further modulation possibilities for sample position and grain length.
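[The grain-looping behaviour Phil describes is easy to model outside Reaktor. Here is a minimal Python sketch – my own illustration, not Phil's actual patch – of how a grain looper picks the fragment it will repeat, with optional random jitter on both position and size, the way Plasma's Mouse Area and jitter controls do. All names are hypothetical. - Peter D.]

```python
import random

def grain(sample, position, size, pos_jitter=0.0, size_jitter=0.0):
    """Return the slice of `sample` that a grain looper would repeat.

    `position`, `size` and the jitter amounts are fractions of the
    sample length (0.0 to 1.0), loosely mirroring Plasma's Mouse Area:
    the red line is `position`, the blue line is `size`.
    """
    n = len(sample)
    # Random jitter nudges the grain start and length on every trigger.
    pos = position + random.uniform(-pos_jitter, pos_jitter)
    length = size + random.uniform(-size_jitter, size_jitter)
    # Clamp to the sample boundaries; a grain is always at least 1 frame.
    start = int(max(0.0, min(1.0, pos)) * n)
    count = max(1, int(max(0.0, min(1.0, length)) * n))
    return sample[start:min(n, start + count)]

# A tiny grain near the middle of a 1000-frame "sample":
frames = list(range(1000))
tiny = grain(frames, position=0.5, size=0.01)       # 10 frames at 500
jittery = grain(frames, 0.5, 0.01, pos_jitter=0.05)  # wanders each call
```

Looping `tiny` over and over is the "scratch a sound file" effect; raising `size` towards 1.0 is the equivalent of dragging the blue line to the bottom of the waveform.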
Here is a screenshot from view A of this instrument:
This concept of “Scenes” is something I learnt from my favourite Reaktor builder, Martin Brinkmann. On the left side you can see 10 macros labelled S 1 to S 10. Each macro can be triggered by the 10 buttons or by qwerty keys (Q to P). That means I can use qwerty keys (or MIDI notes) to select any one of the 10 samples. The macros labelled Pch1 to Pch8 select different transpositions of the selected sample. Thus I can create transpositions that are not governed by the pitch of a specific MIDI note. These pitch “scenes” are triggered by qwerty keys Z to M. Finally, Rnd1 to Rnd8 are different random amounts for the offset position of the sample. So with a combination of using the Mouse Area to control 2 parameters and the various key presses, it is possible for me to achieve a lot of variety just by using my laptop, without any external controllers.
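[The key-to-scene mapping Phil describes is essentially a lookup from qwerty rows to scene banks. A quick Python sketch of the idea, with names of my own invention – note the bottom row Z to M only holds seven keys, so in this toy model the eighth pitch scene would need a key Phil doesn't specify. - Peter D.]

```python
# Top letter row selects one of 10 sample scenes (S 1 .. S 10);
# bottom row selects a pitch-transposition scene (Pch1 ..).
SAMPLE_KEYS = "QWERTYUIOP"
PITCH_KEYS = "ZXCVBNM"  # only seven keys for eight Pch macros

def scene_for_key(key):
    """Map a qwerty key press to a (scene_bank, index) pair, or None."""
    key = key.upper()
    if key in SAMPLE_KEYS:
        return ("sample", SAMPLE_KEYS.index(key) + 1)
    if key in PITCH_KEYS:
        return ("pitch", PITCH_KEYS.index(key) + 1)
    return None  # key not assigned to a scene bank
```

So a press of Q recalls sample scene 1, P recalls scene 10, and Z recalls the first pitch scene – the whole left hand becomes a scene selector while the right hand works the trackpad.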
Here is a screenshot of the Slider Instrument:
Again there are scenes, 16 of them chosen by the Sc Sel Mouse Area. These scenes store different positions of 8 sliders. The sliders control position and grain length; transposition; random amounts for position and grain length; sample select; and pitch and rate of the pitch-shift effect. So potentially, I can morph 8 different parameters by choosing different scenes just with the trackpad. With the 6 buttons, I can choose whichever combinations of the 8 sliders I want to activate or not. The sliders idea is taken from Tim Exile’s Vectory [included with Reaktor 5], and Joachim Schneider’s touch matrix.
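[Phil's scene-morphing idea – stored slider snapshots recalled by the trackpad, with buttons gating which sliders actually follow the recall – might be modelled like this. A rough sketch under my own assumptions, not the actual instrument. - Peter D.]

```python
class SliderScenes:
    """Toy model of the Slider Instrument: 16 scenes, each storing
    8 slider positions; an activation mask (the buttons) decides which
    sliders jump to the recalled scene and which hold their value."""

    def __init__(self, num_scenes=16, num_sliders=8):
        self.scenes = [[0.0] * num_sliders for _ in range(num_scenes)]
        self.current = [0.0] * num_sliders
        self.active = [True] * num_sliders  # toggled by the buttons

    def store(self, index, values):
        """Snapshot a full set of slider positions into a scene."""
        self.scenes[index] = list(values)

    def recall(self, index):
        """Recall a scene; only active sliders take the stored values."""
        for i, value in enumerate(self.scenes[index]):
            if self.active[i]:
                self.current[i] = value
        return list(self.current)
```

Selecting scenes with the `Sc Sel` Mouse Area then amounts to calling `recall` as the trackpad moves, so all eight parameters morph at once while deactivated sliders stay put.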
Going back to Plasma, the XY labelled Pitch/Rate changes parameters of the pitch-shift effect, which can also be morphed by sliders 7 and 8. Pre-recorded movements of these pitch-shift parameters can be triggered by qwerty key 6. Thus I can use the trackpad to morph the position and grain length of a sample, and trigger morphings of the pitch-shift effect at the same time. Playing the Mouse Area with the trackpad (left mouse button) causes a sample to be played with no effect. Playing the trackpad with Control held down (right mouse button) causes the playback of a sample to be routed through the pitch-shifter effect.
In terms of composition and sound design, I do use a variety of ensembles, including the ones that I have uploaded to the User Library. Most of my uploads reflect the more ‘electronica’ side of my musical personality and have either been used when I work with Ticklish, or when I compose music and do sound design for my collaborations with choreographers.
Peter: What would you like to see in Reaktor 6?
Phil: I would like to see the reintroduction of the split library. Reaktor 3 had the Premium Library, which consisted of the cutting-edge, bells-and-whistles creations. However, in Reaktor 3 I also loved the Essential Library, which had a whole bunch of uncomplicated designs that could be easily modified by budding builders.
Peter: Yes, absolutely. I think R5 suffered from the lack of easily hackable learning structures. And by suffered, I mean the community of builders lost some potential growth. The factory library became brilliant but inaccessible. I got into Reaktor at the tail end of the R4 era and most of my learning was done on the R3 instruments, the very simple Soundschool VA synth, and some user uploads. I first began using sequenced tables by cutting the sequencers out of your own Anthony is 12 ensemble.
I’m fascinated by the way your personal gear is an archaeological cross section of Reaktor history – you’ve got basic designs from R3, parts from R5 ensembles like Vectory, and scene changing ideas from Martin Brinkmann. There’s something appealingly organic about that. It’s a way of using Reaktor that I think should be promoted more.
Phil: As I said in the Reaktor Forum recently, I feel recent developments (since Reaktor 5) are aimed either at the hardcore DSP people or at artists/producers who want ready-made sonic devices. The recent uploads by Stephan Schmitt are great, but they are aimed at users who need very high-class filters and oscillators, and at users who want to get really deep into programming with Core. On the other hand, there are the all-singing, all-dancing music-making tools for people who don’t want to have to build stuff. Their appetite for more things along these lines has been steered towards Kore and the Soundpacks, rather than towards developing Reaktor.
I feel that the people in between are being ignored. These are creative artists/producers who like designing their own sounds. Sometimes that means building their own instruments, but it also means changing instruments made by other builders. NI should have developed the Classic Modular Macros concept (which could also include Core macros) by expanding the library and providing proper documentation to show how these macros can be used. Building in Reaktor should be easy, in a “lego” style, and more needs to be done in this direction. The rb_macro series was created by Reaktor users to help with this.
I would like to see a much bigger collection of Core macros, and decent documentation about how to use them. In fact, the whole of the Reaktor documentation needs improving. [Actually, that's part of what I'm trying to do here at Noisepages! - Peter D.] Max/MSP version 5 has brilliant embedded documentation and tutorials.
Peter: What are some of the unknown gems in the Reaktor user library?
Phil: One of my favourite effects is Shred, by Christophe Lenz. I use it in my live ensemble. It is a lovely, weird, dubby granular delay. Another great effect is Ingo Zobel’s herMan FiLterbank. Lilthree by Tim Schwerdtfeger has 3 great effects, and I also love Freaky Shifter, by Sean Costello.
In terms of noise machines, I like LazyFish’s StereoDigitalRandomNoiseSource [highly recommended but be careful of your ears and speakers - Peter D.], Metalon by Chris Malcolm, and Madbox by Jo Anning.
However, there are quite a few builders who I really respect, but I will just mention three long-time favourites: Martin Brinkmann, from whom I have learnt so much whenever I reverse engineer his stuff; Dieter Zobel, who more than anyone else combines great ideas and building with a rare musicality; and Siegmar Kreie, who has really pushed the Reaktor envelope. Check out his wonderful Organol and the classic Weedbacker (also known as Weedwacker). [Weedwacker is also included in the Reaktor 3 library available on the Reaktor 5 install CD - Peter D.]
Whew – is your head spinning, reader? Mine is. After you’ve digested all that, be sure to check out the library instruments Phil recommends – I’d never tried Organol and it’s a real treat. I’d love to use this wide ranging interview as a series of jumping off points for further exploration. Try out the Plasma and Triptonizer instruments – are you interested in learning to hotrod them yourself? What do you think of using the laptop as an instrument in itself? Let us know in the comments!