Chris Randall: Musician, Writer, User Interface Designer, Inventor, Photographer, Complainer. Not necessarily in that order.
Tags: Disembodied Hands
October 6, 2018
by Chris Randall
I'm trying like hell to get back to a weekly AI, but circumstances dictated otherwise for #11. I would have been only a day late: I didn't have the correct flavor of Arduino to do MIDI-over-USB, and I didn't discover this until I tried to code it. That's easily solved by Amazon Prime, but then we inexplicably had a hurricane here in Phoenix (I know, right?), and as a result my same-day delivery of a new Arduino Leonardo took four days to arrive.
All's well that ends well, though. Above is the fruit of hurricane-induced tardiness, the Magic Robotic Contact-Mic Hitter Thingie. Essentially, there are four cantilevered hammers I made on my Shapeoko that are struck by solenoids; the solenoids are driven via MIDI. In the video, I have the Arduino rigged as a class-compliant MIDI interface, and it is being driven by Bram Bos' Euclidean sequencer plugin, part of his Rozetta suite of MIDI plugins for iOS.
After building it, I see where it could be improved somewhat, but on the whole it does what I intended, and today or tomorrow I'll make some Music™ with it and see how it sits. Watch for that video shortly. Do any of you use electro-kinetic instrumentation like this in your music? If so, what are your thoughts on use cases and such?
September 22, 2018
by Chris Randall
Finally, a new Tech Time! In this episode, I make the Amazing Stethophone, using a $25 stethoscope from Amazon, a cheap lavalier mic, and $5 worth of shit from the plumbing aisle at Lowe's. I did this video having only a notion that it would work (although I did think about it quite a bit first), so you're actually watching my experimentation process. Whether this is fun or not is up to you.
In any case, further testing shows it works quite well. It is good for getting machinery noises out of the inside of small mechanical things, and it really picks up the resonance in wood well. (Like, stuff that is just a thump with a piezo mic is a nice ringing tone with this.) I think some refinement is in order, but it is a useful addition to the toolkit.
August 12, 2018
by Chris Randall
One of my favorite things on YouTube is the Rhythm Roulette videos on the Mass Appeal channel; I generally love process videos anyhow, and the particular strictures of the Rhythm Roulette format really push my buttons. If you're not familiar, the tl;dr version is that a hip-hop producer goes to a record store, puts on a blindfold, picks three records at random, and then has to make a beat from what they get. It's worth noting that they hardly ever use the records they get for the percussion parts; usually, it's Rando Trap Drums from PetaBytes O' Beatz 43, and I give a big point deduction for that. The ability to squeeze a drum kit out of any given piece of music should be mandatory.
So, for my own Poppin' Tags, I made that a rule: nothing in the song that wasn't originally on the three records. And my other rule that is a departure from Rhythm Roulette is that it needs to be a full song at the end of the day, not just a beat.
Anyhow, with all that out of the way, the above video is the result. I actually filmed the entire process, but it was almost 3 hours long, and that was silly; who wants to sit and watch me edit samples for 3 hours? So I just cut out little bits here and there, and the track (along with the other four, to make a full 5-song EP), in its finished form, will be released on Bandcamp when they're all done.
In further news, there is now an Analog Industries merch store... Not much in it right now, but as things progress, it'll get full up. Check out the offerings.
June 2, 2018
by Chris Randall
Yeah, it's been a minute since an AI video, but we're gonna get back to that now. Readers may remember a series of experiments I did back in 2011/2012 with touchscreen-based control paradigms (here, here, here, and here, with some absolutely stellar discussions about usability in the comments.) Those were admittedly somewhat early days for the entire concept; the iPad had only been out a couple months when I started those experiments, and the idea of an app-based control paradigm was a fairly new thing.
Fast forward to 2018, and shit has progressed a bit; people are generally used to using touchscreens for control. So the video above isn't really about experimenting with the control paradigm, since that's well-trod territory by now. I'm coming at things from a different angle. I've used a monome for years, and I've written a Max4Live step sequencer for that platform that is pretty much only useful to me, and that I'm very happy with. However, I was using it last weekend, and I got to thinking that it would be dope if I could record control gestures along with the beats. Obviously, the monome itself is kind of shit for that sort of thing, so I first "ported" the monome's control logic to a JUCE app, so I could run it full screen on a touchscreen monitor. Doing so let me break out all the unlabeled control buttons to dedicated, labeled buttons, and improve the pattern memory and such.
After that, I gave each lane a four-bar gesture recorder; there are three gestures in all, and the X, Y, and Z planes can be assigned to any parameter in Live. (In the quick demo above, I generally have them going to effects sends and suchlike.) The sequence memory and control is hosted in the M4L patch, but the gesture recording and playback is hosted in the JUCE app. Note this is running on a separate computer entirely from the one hosting the Live session. (It is, in point of fact, that little Intel NUC, stuck to the back of the monitor with double-stick tape. It is communicating with M4L on the host computer via OSC over my home wi-fi network.)
There are actually 10 lanes of gesture recording; in the video above, if you look closely, you can see them labeled D1 - D6 (the drum lanes), Bass, and S1 - S3. I don't actually use the non-drum ones in the video, but they're there and working. There are 8 banks of 8 patterns, and each pattern has its own gesture memory.
I could easily add enough buttons to control the session entirely from the app, with no Push2 there at all, but there's no sense re-inventing the wheel. The purpose of the experiment was to prove out fast, intuitive real-time control of a Live session from a separate computer's touchscreen, and I'm pretty happy with things so far. The next step is to try to put together a whole song (or several songs) to perform; this setup isn't great for writing, as it requires a lot of prior preparation, but for performing I think there's a lot of potential to be explored.
Side note: the two synth pads I play towards the end are both Quanta. That fairly major undertaking is reaching its final stages, and the synth is perfectly usable in a session now. So yay for that!
February 3, 2017
by Chris Randall
The second in my Sound Experiments series for the Analog Industries YouTube channel. In this one, I'm using various tape sources in lieu of my normal tools. (Although, it's worth noting, all these decks are part of my normal toolset.)
The speech, running on the small Marantz deck, is from an interview with John Cage where he is talking about a Glenn Branca performance he'd seen the night before. He calls Branca a fascist. I guess if you're John Cage it's fine to bag on Glenn Branca. I don't think I could personally get away with it.
Otherwise, I just made loops from various sources, dumped them to the decks, and performed with the decks' various controls. Of special note is the pad: I recorded two channels on the 4-track with two different chords of the same sound. I'm running the tape cue out of the 4-track to the Eventide H9 Max, set to an edited Black Hole preset, and I'm "playing" the chord by cross-fading the two send controls. This gives the nice big stereo effect.
I made the drum loop on my Eurorack, and then dumped it to Frasier, my MTR-12. I cut the one-measure loop directly on Frasier, and I'm playing it back in Edit/Dump mode (you can see me hit the dump button every time I stop the deck). The reason for this is that if the deck isn't in Edit/Dump mode, the right-hand capstan, which is supposed to keep tension on the reel, spins at a ludicrous speed. In Dump mode, that capstan stops.
This brings up an interesting point, though: when working with tape decks in a musical context (as instruments, rather than as playback/recording vehicles), each deck has its own personality. The dump button on Frasier is a good example. The Marantz deck with the Cage interview has a pitch knob (it is a dictation deck IRL), so I was able to pitch Cage's voice, which is whiny at the best of times, down to a more reasonable listening experience. Every deck has its own little tricks and features that you need to explore and exploit, and that's mildly fun in itself: "What can I do with this?"