
BT goes deep on his plugins: “There isn’t a producer on the planet that won’t want this”
Words: Sam Willings, Oliver Payne
BT is truly a music production polymath. That word gets thrown around a lot, but how many other Grammy-nominated producers and audio engineers are writing code for their own plugins, starting software companies, actively working for a fairer future for musicians, and writing critically acclaimed music and soundtracks for films and video games?
The 52-year-old has more projects on the horizon, too — new music, new plugins and new technologies are expected from BT, real name Brian Transeau. We caught up with Transeau for the first time in a few years, after his appearances on our podcast and in an earlier interview, to see what he’s been up to. Alongside his latest release with Shingo Nakamura, Lifeforce on Monstercat, it turns out he’s been up to quite a lot.

It’s been four years since we last spoke to you about music and synthesizers — what’s new?
Without too much of a reveal, we’ve spent the past 18 months starting a new software company, SoundLabs.
We closed our pre-seed round and have been diligently building some wildly new technologies, both traditional DSP and machine learning/AI, into a suite of plugins. I literally cannot live without them at this point. They are totally game-changing technologies, none of which is currently available to producers. To say I’m excited about them is the understatement of the century.
Image: Lucy Transeau
But, in addition to these three plugins, I’d say another plugin I can’t live without is Oeksound Soothe2. It’s one of the most useful plugins I’ve ever used. Every new version of that is an essential level-up in production for me.
Another essential I should mention is my collaboration with CableGuys. We’ve been working on a plugin for almost seven years (can’t believe it) that has changed shape a couple of times. The final (new) release candidate is something so simple and elegant that there isn’t a producer on the planet that won’t want this, regardless of the type of music you make.
You’ve got a lot of studio gear. At what point in your production process are plugins making an appearance? Is it when creating the music or only in editing, mixing and mastering?
My DAW autoloads/templates have hundreds of plugins in them — from bus mastering to channel strip, limiting, compression, and spectral sidechain configurations. So they’re there from the start.
Here’s an example: the Roland Jupiter-8 is normaled through an EAR 660 compressor into a Lynx Aurora(n) interface. I can break this normal, but they are such a lovely pairing. In the autoloads, there’s a Softube and Plugin Alliance channel strip configuration and starting EQ points. This channel normals to the synth bus with a clip limiter, some saturation plugins and out to stereo compression (outboard) and finally to a Trackspacer with spectral side chain normaled to my drum bus.
Image: Lucy Transeau
The setups are so complex and refined that I really can sit down and write, and it just sounds like a record.
Once things are taking shape — I frequently do stem mixes — I turn to RX for spectral artefacts, high-pass filtering and general audio repair and design, and then flip all stems into a clean (different) autoload that is audio only. This is where more creative sound design plugins come into play.
My Cubase/Vienna Ensemble Pro setup is an interview in itself. It’s 3,200 instances of Kontakt and a plethora of software instruments summed to a 32-track stem template. It’s taken about 12 years to build to this point. It would be a fun one to show at some point.
You and Shingo Nakamura have created this incredible airy, elevated sensation in ‘Lifeforce’. Can you tell us more about its creation?
Shingo is lovely and such a talent. We had as much fun hiking, eating and talking as we did in the studio and it’s really reflected in the sound of that record. It was fairly effortless.

Shingo had a sketch we fleshed out with vintage synthesizers and then I wrote a new drop progression with some Spitfire Felt pianos (and crazy outboard signal path – I remember an Ensoniq DP4, Roland SRV-330 and 737 compressor were involved). I spent a lot of time on this one doing hand editing to final stems in RX before we had a master-ready copy we were excited about. Some of my secret plugins were used on this one as well.
You’re an advocate for – and collaborator with – iZotope. Which are your go-to iZotope products, aside from your own?
For me, hands down, RX and Stutter Edit 2. I absolutely LOVE the new Trash as well and am so excited they made that. I use it all the time.
Tell us about BreakTweaker and Stutter Edit. What was your role in the creation of these plugins and how do you use them yourself?
Well, this is a long story but my role, broadly, was showing up at former iZotope CEO Mark’s door with a million-line code base and two finished products. I developed these to completion myself and then licensed them to iZotope.
Image: Lucy Transeau
We’ve had a wonderful and fruitful relationship, and we worked closely again when I went off to do the same with Stutter Edit 2. I love developing music software: ever since I started making music professionally, I’ve been finding holes in my own creation process where a new ‘thing’ is needed. It’s become one of the thrills of music creation for me, creating the tools to augment the process.
I’ve been programming since I was a kid — starting with BASICA and, believe it or not, Fortran. I studied Csound (my main prototyping language, and what I have built all my generative music blockchain projects in) under Dr. Richard Boulanger (formerly of MIT) at Berklee. I regularly prototype in Csound, Max/MSP, Pure Data and sometimes now go directly to JUCE.
I’ve also gone pretty deep down the Python wormhole over the last three years working on our AI plugins, so I can hack around in there pretty well now too.
Point is, in creating plugins, I’m not just whiteboarding and delegating. I like to get my hands dirty doing the fun stuff too — coding.
iZotope RX in BT’s Studio. Image: Lucy Transeau
How are you feeling about the rise of AI implementation in music production tools?
I want to give a very nuanced answer here. I’m a strong believer that the future of music is human. I also strongly believe in consensual, ethically trained AI (across the board), and there are very few companies respecting intellectual property rights in this space. Some are flagrantly flaunting their first-to-market status from training text-to-music models (which we don’t believe is a product) by scraping Spotify and YouTube.
I find this an abhorrent misuse of a technology which, I believe, when used responsibly, will unlock infinite creative potential in the next generation of music creators.
Our large label music partners told us a story about a CEO who came to see them (probably a service you have heard of) whose company is clearly in violation, training on IP-protected works just to speedrun a product to market. They asked him how he had trained his models and he said, “We would rather ask for forgiveness than permission”. This kind of thinking and irresponsibility could destroy music. Full stop. We must rally against it in the development community.
Image: Lucy Transeau
Okay, so now the good stuff. Through a lot of new laws, litigation and the music industry sticking together, artists who have a large corpus of work will have a completely new ancillary revenue stream unlocked. This will come through allowing consensual training on their work and a fractionalised revenue-share model for different types of tasks: engineering, patch-making, and other real friction points in the music creation process that take us out of “flow” while we create.
I’m thrilled about tools that will fairly reward the artists they are trained on and that unlock brand-new possibilities for young and seasoned artists alike.
Finally, the tools we are building, and things I see people building behind closed doors, are completely mind-blowing. There are unimaginable future technologies that all artists, singers, producers, and engineers will love to use because they do groundbreaking things they currently don’t have access to.
So my measured TL;DR answer is: the future is bright, and we as a community of musicians must proactively control and be involved in the narrative of what is acceptable and what is ethical, all couched in reverence and respect for the large bodies of existing work that effective new tools need to train on. There is a lot to look forward to.
What are your favourite examples of AI-integrated plugins?
None that are currently available have impressed me, aside from the obvious things in software like RX (dereverb). The things coming, though — WOW!
Image: Lucy Transeau
Check out what else BT is up to at btmusic.com.
The post BT goes deep on his plugins: “There isn’t a producer on the planet that won’t want this” appeared first on MusicTech.
