Chris Randall: Musician, Writer, User Interface Designer, Inventor, Photographer, Complainer. Not necessarily in that order.
April 6, 2018
by Chris Randall
Audio Damage will once again have a booth at Superbooth this year, and I'm leaving for Berlin on April 27. That means I have about 3 weeks to finalize everything. The product we're unveiling is in feature-freeze, and it's time to start figuring out how to demonstrate it, as well as the rest of our product line. (This isn't a last-minute thing; I've been thinking about it for some time.)
The entire process is both easier than in years past and more complicated. At the last two Superbooths we attended, we only showed Eurorack, and didn't show our software products at all. This year, it will be the other way around. We have no Eurorack; we're showing our desktop and iOS software. The iOS part is easy: a 12.9" iPad Pro with the usual dongling, job done. For the desktop software, the needs are somewhat odder. The product we'll be unveiling is MPE-capable, so we wanted to have at least one MPE controller at our booth. Since the iOS and desktop versions are identical in that regard, it seemed wise to have one MPE-capable controller for each, and of those, at least one that civilians could play. Roli very generously loaned us a Rise 25 for the booth, so that will attach to the iPad, and my Linnstrument will go as well, for the desktop stuff.
All but a couple of our products are now vector-graphics-based and HiDPI-capable, so I wanted a 4K monitor to show this off. My own monitor is too big, so I purchased a Dell 23" 4K for the show. Now the trouble starts. You would think it would be a fairly simple matter to sling a computer at this problem and be done with it. You'd be wrong. The software we're unveiling is nearing completion, but it is in no way finished, so it isn't as optimized as we'd like. I don't want to use my main work laptop for a trade show, for obvious reasons, so I pulled out a mid-2013 MacBook Pro, and between pushing a 4K monitor and this fairly heavy-duty plugin, it was screaming for baby Jesus. So, we decided to run Windows instead of OS X for the booth. That being the case, we needed a Windows machine capable of fairly heavy lifting in the graphics department while shouldering the CPU burden of a DAW. Plus, we had a fairly thin budget and weight considerations to contend with. It was a surprisingly tough nut to crack.
After a whole lot of reading of computer gear blogs and fretting, I decided that I had enough time to take a small risk, and I ordered up one of the new Intel Skull Canyon NUCs. As it is a bare-bones system, containing only a CPU/GPU, I also grabbed 16GB of RAM, an NVMe drive, and a key for Windows 10 Home 64-bit. All of that showed up this morning.
I'll admit that while I knew intellectually how small the Skull Canyon NUC was, I was unprepared for the reality of it. It is just about the same size as a VHS tape. The power brick is almost as big as the computer itself. Anyhoo, it comes with a pretty good selection of ports (including, improbably, a Thunderbolt 3 port, if you want to run a graphics card in an external chassis, or a UAD Apollo or whatever). After building a Windows installer on a USB drive with the Media Creation Tool on my big machine, it took me about 30 minutes in all to set it up, from opening the boxes to signing in to Windows.
We'll be using Bitwig Studio 2 to demonstrate our products at Superbooth (for two reasons: a native understanding of MPE, and a good HiDPI implementation on Windows), so I popped over to the Bitwig site and slammed that on the machine, as well as a few of our plugs and some samples. I've been pressure-testing it for several hours now, and I have to say that, taking its size into account, this computer is an amazing value. I wouldn't use it as my main workstation, but for situations like this, or for a secondary machine or media center, it is fucking amazeballs. With the caveat that I still have a lot to do before I'll call it Show Ready™, I'm of the opinion that this is a solid machine for the money. Two thumbs tentatively up.
March 29, 2018
by Chris Randall
Long-time readers will know how I generally feel about "alternative MIDI controllers" when they come down the pike. I have two metrics for a new MIDI controller:
• Is it actually better than a piano-style keyboard?
• Would you look like a giant fucking douchebag on stage playing it?
The first one isn't so hard to overcome, I don't think. It can further be divided into subcategories that are context-sensitive. Modern music making breeds Jacks-of-all-trades, and if the device has buttons that have notes in some semblance of order, anyone who can push buttons in a rhythmic fashion can make music with one. It's just a matter of learning where the notes are. So then the question is: are the buttons in some sort of order that makes sense to me? Speaking strictly for myself, I've played keyboards and guitar for at least an hour or two nearly every single day of my life since I was in my tweens. (Honestly, you'd think I'd be better at it, but I plateaued somewhere in the mid-80s.) I'll be 50 here in a couple months, and my hands _hurt_ when I play a keyboard for more than a minute or two. So ergonomics are a big consideration for me, whereas they might not be for a 20-year-old who has full command of his or her digits.
The second point is harder to deal with. It's called a "show" and not a "hear" for obvious reasons, and looking like a giant fucking douchebag is going to negatively impact shareholder value, as far as live performance is concerned. The number of swing-and-a-miss controllers I've seen at NAMM, where I don't even bother to get a demo because whatever the device is instantly triggers my "man, I'd look like a douche playing that" sensors... Definitely in the hundreds. Lasers and spheres and light-up rings and any number of other douchey configurations. Of course you can use any one of these things to make music, and time + dedication = virtuosity, but ultimately you want to look at least a little bit cool doing it. I'm old enough to accept that "cool" is a moving target, one I can't necessarily hit any more, but even so...
Anyhow, let's go ahead and get to the point. Starting early last year, people began writing the Audio Damage info line asking for MPE versions of our synths. Despite being deeply entrenched in music tech, I only had a vague notion of what MPE was ("something for alternative controllers or something" was my general understanding.) After I'd received several of these, I knuckled down to learn about the format, and was intrigued enough to drop Roger Linn a line and ask if I could borrow a Linnstrument for experimenting. I know Roger pretty well, having had a booth next to him at many trade shows, and he very kindly sent a Linnstrument 128 to me. The only MPE-capable synths I owned at the time were Madrona Labs Kaivo and Aalto (as well as the Animoog synth for iOS) so I booted them up, figured out how the hell to get Live to pass MPE, and sat down to experiment. For reasons lost to the dark past, I decided to film my very first play-about with the Linnstrument and Aalto.
As you can see, it clicked pretty quickly. After a few days with it, I decided to move my Kontrol 49 off the desk and see how this felt as my main controller. After a couple weeks, the Kontrol 49 went in the Closet Of Forgotten Toys, and I wrote Roger to tell him I'd be buying this one. A couple more weeks, and I'd talked Adam into buying one too, and now, the product we're unveiling at Superbooth is fully MPE-aware. There's no zealot like a convert.
I took to the Linnstrument pretty quickly because the notes generally follow a guitar layout, so I knew where everything was, and it was only a matter of getting used to the dynamics. Having three axes of control once you've hit the note is remarkably expressive, and since, at the end of the day, MPE is just MIDI Plus, it more or less works with everything, while synths that are designed to take advantage of the format (e.g. the aforementioned Kaivo and Aalto) really shine in new and interesting ways. I won't bother giving a technical description of MPE; Reverb has already done a fairly good breakdown of that, and there's no reason for me to reinvent the wheel. Long story short, picture a pad controller with aftertouch (like the Push 2), and make it so that after you whack the note, you can move your finger on the X or Y axis as well, and send MIDI CCs with that. Then give each note on the synth its own MIDI channel, so you can apply those MIDI CCs to an individual note without modding the entire patch.
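That per-note-channel business is easy to sketch in code, if you're the type who thinks that way. Here's a toy Python model of the idea (the class, the message tuples, and the channel-rotation policy are all my own invention for illustration, not any particular synth's or controller's actual behavior): every note-on claims its own member channel, so pitch bend and the Y-axis CC74 can go out per-note.

```python
# Toy model of MPE's core trick: one MIDI channel per held note, so
# per-note expression messages only touch that note. Hypothetical
# message format; a real implementation also negotiates zones via RPNs.

class MpeAllocator:
    """Hands out member channels 2-16 (channel 1 stays the master)."""

    def __init__(self):
        self.free = list(range(2, 17))   # member channels, lowest first
        self.held = {}                   # note number -> channel

    def note_on(self, note):
        ch = self.free.pop(0)            # voice-stealing omitted for brevity
        self.held[note] = ch
        return [("note_on", ch, note)]

    def bend(self, note, value):
        # Per-note pitch bend: sent only on that note's own channel.
        return [("pitch_bend", self.held[note], value)]

    def slide(self, note, value):
        # Y-axis movement is conventionally CC74 in MPE.
        return [("cc", self.held[note], 74, value)]

    def note_off(self, note):
        ch = self.held.pop(note)
        self.free.append(ch)             # recycle the channel
        return [("note_off", ch, note)]
```

With two notes held, bending one leaves the other completely alone, which is the whole point of the format, and why a non-MPE synth listening on a single channel can't do it.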
There are really only four MPE controllers worth talking about right now, in my opinion. They are the Roli Seaboard series, the aforementioned Linnstrument, the Madrona Labs Soundplane, and the Haken Continuum. So basically your choices are "guitar-like" with the Linnstrument and Soundplane, or "keyboard-like" with the Seaboard and Continuum.
The upshot of all this, and my takeaway: I can fit five octaves of extremely expressive MIDI control in a space that is smaller than the typical PC keyboard, and I don't look like an idiot doing it, nor did I have to learn anything new, since I can already play guitar. This is a net win no matter how you math it out. As I hinted, we'll be unveiling an MPE-capable product at Superbooth, and we will have both a Linnstrument and a Seaboard Rise there to try out with it. (We'd have a Soundplane too, except that is somewhat larger, and our booth is small, and Randy will be there anyhow.)
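For the curious, the "guitar-like" layout boils down to very simple arithmetic. A minimal sketch, assuming the default fourths tuning (each row five semitones above the one below); the starting note and both parameter names here are just example values I picked, since the real hardware lets you configure all of this:

```python
def pad_note(row, col, low_corner=30, row_offset=5):
    """MIDI note number for the pad at (row, col), bottom-left origin.

    row_offset=5 is the fourths tuning that makes guitar shapes
    transfer directly; low_corner=30 is an arbitrary example value,
    since the starting pitch is user-configurable.
    """
    return low_corner + row * row_offset + col
```

The nice property is that every interval shape is position-independent: slide a chord one pad right and it transposes up a semitone, one row up and it transposes up a fourth. That's exactly why existing guitar muscle memory carries over.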
I'd like to hear about your experiences with MPE or alternative controllers, especially playing live. I haven't played the Linnstrument on stage yet, but I'm comfortable enough with it that I would feel pretty confident doing so at this stage.
March 27, 2018
by Chris Randall
Hey, kids! Long time no talk. And there's a reason for that. As my interaction on and with Facebook went up, my interaction with the real internet went down. That "service" -- and I use the term loosely, because they, like Google, are an advertising agency whose entire reason for having their platform at all is to sell ads -- sucks all the air out of the room, to the point where my creativity can't breathe, and I'm not doing the things I normally like to do because I'm busy arguing with some fucking dolt in East Nowheresville about constitutional law.
So last week, I deleted my FB account and the Audio Damage and Sister Machine Gun pages I administer. Adam and I had a brief discussion about leaving the AD page up, but the ROI for a Facebook page is not great unless that's your entire means of dealing with your customer base. Also, the hypocrisy of leaving the AD page there when I personally won't use the site for mostly moral reasons was not lost on us. As I told some friends yesterday, we may take a financial hit for losing that promotional avenue, but on the other hand, my quality-of-life immediately increased, in a noticeable fashion, and what is money for if not to improve your quality of life? So net gain all around.
I've had to quit other addictions in the past, and literally as soon as I pressed the "delete account" button (it really is that simple) I felt the hole that I'm used to. But like most people with addictive personalities, I knew it for what it was, and am able to deal with it in the same way I've dealt with others. Weirdly, there's a physical habit associated with it that was somewhat harder to break than the mental one. I would normally open a browser, and blam-blam-blam Twitter, Instagram, Facebook. To have one of those missing is very strange at first. I found myself scrolling to the spot where the bookmark lived fairly frequently for the first couple days, and experienced actual, palpable disappointment when it wasn't there. But this passed quickly under the weight of my self-righteousness (which, I think we can agree, is the most powerful drug of all).
In any event, those of you that follow me on Twitter have no doubt noticed my activity there has increased noticeably, and I believe you can safely expect this site to get back to its previous grandeur, with the caveat that I can't really bag on other companies' gear any more (I know too many people in this business now). So the focus of AI will be creativity and trends in music tech, which were the other two points of the Conjoined Tripod Of Success, at least as far as blogs of this sort go. No, I didn't delete my Instagram account. I am, apparently, something of a hypocrite. One final note: I plan to put my non-music-tech meanderings on Steemit. While that "service" (does that word apply to a blockchain-based system?) is a fucking epic Level 9000 circle jerk when it comes to matters cryptological, it's actually a fairly good analog for a blog, and its theoretical permanence is a desirable trait. So if you want to see me write about the other things that interest me (travel, making shit, gardening, etc.) that's where I'll be doing it.
October 30, 2017
by Chris Randall
Yesterday, I started writing a post about watershed moments in your creative lifestyle, when something (either external or internal) brings a change to how you make music (or, well, whatever it is you make.) Then I thought "meh, that probably doesn't really happen to anyone else, because everyone else seems totally together. I'm just a nut." I highlighted all the text and pressed DEL and got on with my day.
About 20 minutes ago, the above exchange took place on Twitter between Noisetheorem, myself, and DJ Empirical that made me realize that this sort of thing isn't uncommon at all, and in point of fact I'm totally norms.
For better or worse, your external environment greatly affects your creative output. Speaking strictly for myself, spending the last three years boxing and shipping Eurorack, talking about Eurorack, travelling for Eurorack, sleeping in piles of Eurorack, and generally devoting my entire existence to Eurorack, left me at a creative nadir unparalleled in my 30+ year history of making music. Earlier this year, we came to the conclusion that we were devoting too much of the company's resources to Euro, and decided to ease off and work on desktop and mobile ideas. Since the nature of the Euro market means the hockey stick is ludicrously strong, without a new Euro product we can't really justify remaking older Euro products. As a result, I got to spend the summer, which in Phoenix is like winter for the rest of the country, doing something I truly enjoy: making user interfaces, and not putting Eurorack modules in boxes.
Since confirmation bias is the name of the game these days, as you're reading this, you're only going to see the Zig Ziglar Power Words and run off to say "Chris Randall's an asshole! He hates what I do!" or "GOD DAMN RIGHT, FUCK [insert creation method here]." I can't do anything about that, but let me relate a metaphor:
I was raised by divorced parents about 50/50 in rural Oregon and New York City. The rural Oregon half of my family are, for the most part, gun nuts. I was raised around guns, and am comfortable with their existence and use-cases. I own a gun, and know how to use it. My father (the New York half of my co-parenting lifestyle) was a general contractor, so I was also raised around power tools. I am comfortable with their existence and use-cases. I own power tools, and know how to use them. In my head, a gun is basically just another tool. There are people for whom guns are a religion. I am not one of those people. I do, however, understand the motivations and mentality that lead to worshipping guns, talking about guns, collecting guns, etc, and how guns become a lifestyle and not just another tool in the box.
Anyhow, you get the point. Lots of boxes and M3 screws, creative nadir, tedious metaphor, blah blah blah. Long story short, two things happened:
1. Due to our considered opinion that iOS was finally ready for pro (or at least semi-pro) music-making, we decided to make a run at the mobile side of things, and began porting our desktop products to iOS. This forced me to purchase and become comfortable with a state-of-the-art iOS creation environment.
2. Due to customer requests for MPE versions of our synths, we needed to investigate MPE, something of which I knew very little. After pondering things for a bit, I decided the Linnstrument was probably the best source of MPE data, and since I'm friends with Roger, I dropped him a line to see if I could borrow one of the small ones to develop some test cases.
I got the Linnstrument a couple weeks ago, and the first thing I did, to test how MPE worked, was to plug it into the first synth in my collection that understands MPE. That happened to be Animoog, the polysynth that Moog made for iOS. I spent a few hours playing with this, and decided that MPE was worth exploring. So I moved the Linnstrument to my big computer and folded it into my development process. Since I don't have a ton of room on my desk, I moved my normal controller, a Kontrol S49, out of the way. The much smaller Linnstrument sat in its place.
Since it was sitting there anyhow, I ended up using it to try to play melodies when I was testing other shit. And I suddenly found myself puzzling out scales and chords on it, and my testing other shit turned into making songs. At some point that I can't exactly put my finger on, it clicked and I was able to play it. I'm not going to review the device itself because there are reams written about it. But yesterday morning, my wife pointed out that it was nice to see me making music again. I was like "huh?" And she goes "you haven't actually sat in your office and made a song in like 2 years, dude." That's when all this hit me, and I wrote Roger to tell him I'd be buying the Linnstrument off him.
It isn't, of course, as facile as that. There are other outside stimuli that are affecting things (new hobbies, the weather change, etc.) but putting in the time to get the Linnstrument to ease itself into my methodology was definitely the deciding factor in unwedging my creative block. Let's hear it, AI peoples. Do you have similar unblocking experiences?
August 16, 2017
by Chris Randall
When we started porting our products to JUCE 5, one side effect (and I mean that in the literal sense, as it was in no way a deciding factor) was that I could check a couple boxes in the project manager and poop out a standalone app and AudioUnit v3 for iOS. I was already familiar with the iOS app submission process due to my previous experiences (remember Phaedra?), and Audio Damage already had a paid-up developer account, so we had the Apple signing certificate we needed for signing our OS X installers and AAX plugins. So we figured we'd put one up and see where it went.
As it turns out, we sort of lucked into an empty socket, as only two of our desktop peers (VirSyn and Waldorf) had really taken the platform seriously, and the market was mostly owned, with a couple notable exceptions, by companies building specifically for iOS, who didn't have a lot of experience making plugins for the much more robust and demanding professional music production market. This is in no way a bad thing, as the market had some very inventive tools that you don't see in the desktop world. But we just happened to stumble (through no planning of our own) into a situation where our product line, which has heavy competition in desktop DAWs, simply didn't exist on iOS.
Speaking strictly for myself, after I gave up on Phaedra, I kind of set the iPad aside as a music-making tool, and hadn't really thought about it again until late May of this year, when I saw I could build AUv3. I then had to acquire an iPad Pro and go learn what AUv3 was, and after some trial and error, and much crossing of fingers and scratching of backstays, we shipped Rough Rider (for free) in mid-June. Since then, we've tried to maintain something of a parity between the desktop and iOS platforms. It naturally didn't turn out to be as easy as I thought it would, and usually the iOS versions require rather extensive reworking of the user interface because of the space constraints. (The notable exception is Replicant 2, where I decided to use the iOS AUv3 aspect ratio for the desktop version as well.)
Since we started building them, I've necessarily spent a lot of time testing and poking and prodding, and had to acquire most of the top-shelf iOS audio software. I've made a couple tracks now using only the iPad Pro, and I'm willing to say that, while there are a whole raft of weird little Apple-isms to deal with (better than taming OMS on OS 9, but not as bad as trying to use MIDI in OS 7), in general I'm confident you can go to the store and buy an iPad Air 2 or Pro 9.7 (my recommended devices for AUv3 hosting right now, as the Pro 10.5 and 12.9 have some nasty RAM allocation bugs), spend about a hundred dollars on software, and have something equivalent to a fully-kitted MPC. Which, as we all know, is a perfectly viable platform for making full tracks.
Can it do everything a blown-out MacBook Pro or Surface Book can do? No, it can not. And Apple spends a lot of time trying to make seamless experiences that hide the machinations we need to know about to get the most out of our machines. Forget about using huge sample libraries, as these devices simply don't have the RAM or horsepower to pull that off. But for scratchpad recording and electronic music production, it's hard to describe the vibe. "Fun," I guess? Not a word I usually use with music-making, which, as a former professional musician, I take seriously and equate to work. New and exciting, definitely, and for creativity that is important.
Anyhow, to make a long story short, yesterday we released Pumphouse, which the video above talks about. Most of the iOS DAWs don't have sidechaining, and love it or hate it, that's an important facet of modern electronic music production. (I don't make EDM at all, and I use it all the time on my pads and basslines.) So we came up with a simple workaround by giving Rough Rider a 16-step sequencer, so you could trigger an envelope to side-chain compress the input in a rhythmic fashion. We had thought this would be an iOS-only release, and had no intention of releasing it for desktops, as that effect is easily attainable elsewhere. However, we wrote the plug as a VST (simply because it is much faster to develop audio software on a desktop than on an iOS device) and it would be the work of a day or two to "back-port" it to all our supported platforms. So if this is something you're interested in, let me know in the comments. If you have any questions about iOS music production, I'll be happy to answer them to the best of my knowledge, or point you at the appropriate resource.
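For anyone wondering what that sequencer trick amounts to under the hood, here's a toy sketch of the general idea in plain Python (invented function and parameter names, not Rough Rider's actual DSP): active steps fire a ducking envelope, and the gain recovers exponentially toward unity, which is all the familiar fake-sidechain "pumping" really is.

```python
# Toy sketch: a step sequencer triggers gain reduction in rhythm,
# so you get sidechain-style pumping with no sidechain input at all.
# All names and constants are illustrative, not any product's API.

def duck_gain(steps, samples_per_step, depth=0.8, release=0.995):
    """Yield one gain value per sample. An active step slams the gain
    down to (1 - depth); it then recovers exponentially toward 1.0."""
    gain = 1.0
    for step in steps:                   # e.g. 16 booleans, one per step
        for n in range(samples_per_step):
            if step and n == 0:
                gain = 1.0 - depth       # instant duck on the active step
            gain += (1.0 - gain) * (1.0 - release)  # exponential recovery
            yield min(gain, 1.0)
```

Multiply your pad or bassline by this stream and you get the rhythmic ducking; in a real plugin the duck would map to the compressor's gain-reduction stage and the release curve would be tempo- and time-constant-aware, but the shape of the thing is exactly this.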