It's possible I went a little overboard in writing this week's Discovery News piece about uPlaya.com, the Web site that lets musicians upload a track for quick evaluation by music-savvy algorithms. The whole thing deals with AIs designed to judge a song's hit potential, and it goes a little into algorithmic musical composition as well. So I thought, "Who better to give a musician's perspective on the whole deal than former programmer and overall Internet music sensation Jonathan Coulton?"
Jonathan's currently in the midst of a tour, but he was kind enough to chat with me for a few minutes and share his thoughts on technology, creativity and the prospect of algorithms composing and evaluating music.
Only a fraction of the interview made it into the published article, so I thought I'd share the complete interview here for any JoCo fans out there. If you're not already familiar with Coulton, experience his blend of folk music and geek culture in the clip below ("I Crush Everything," my personal JoCo favorite). Some of you might also know him through his work with John Hodgman, Valve Software or Rifftrax. Or read Tracy's blog post about him.
ROBERT LAMB: You have a programming background. Does any of that translate over, technically, into the creative process for you?*
JONATHAN COULTON: Yeah, I think it does in a sort of general way and a specific way. I think having a technical brain colors the way I approach songwriting. I very much think of it in terms of construction. The way I think of how to solve a problem when I'm writing code feels like the same part of the brain that thinks about writing a song. And in a more specific and direct way, I'm a guy who records everything at home, and there's so much stuff that's just me by myself. Having a technical brain, I can spend a happy couple of hours making sure all the drivers are updated. Not everybody enjoys that sort of work, but when you run your own home studio that's what you spend a lot of time doing.
As a singer-songwriter, how do you feel about -- let's face it -- robots deciding what a hit song is? What do you see as the limits of this approach to music making?
Well, you know, I am a fan of the future. I am a fan of robots and artificial intelligence. I believe that at some point we're going to have a machine intelligence that's just as capable as a human at picking out hits -- and maybe just a little bit better. That said, in my experience most humans aren't good at picking out hits. When I was writing a song a week during that whole "Thing a Week" year, every Friday I would make a prediction: "This one is going to hit it big" or "This one nobody's going to like," and I was ALWAYS wrong. And that's a guy with a niche audience and direct contact with them, you know?
So, I mean that's one issue. Another issue is, you know, I'm not even sure I know what it means for a song to be a hit anymore. The very definition is changing and maybe going away. I'm not even sure it's that important for musicians in general to have hit songs anymore. There's a very small percentage of musicians who depend on having hits, but there are a lot of musicians who get by without having anything you can clearly call a "hit." I wouldn't say that I've had a hit. I don't get a lot of airplay at all -- barely any. I have some songs that are more popular than others among my fans, but my model doesn't depend on having songs that are hits as much as it depends on having a lot of songs.
In observing your fans' appreciation of your music, have you noticed any predictable patterns? Anything that makes you think, "A good algorithm would really streamline this whole process?"
(laughs) Man, I wish that was true. You know, the creative process is just so messy, and the way that people appreciate music is messy, too. I guess you can create an algorithm that looks at the music and the physical properties of the audio waves that are created when that music is played. You can look at past hits and see what kind of trends there are, but that's just such a small part of the picture. It has so much more to do with timelessness and subject matter and attitude and all that other extraneous stuff that you can't really predict. So no, I've never identified anything in my creative process that has gone the same way more than once. (laughs) Algorithms never came up.
MIS [Musical Intelligence Solutions] CEO David Meredith frames his company's work as providing a shorter path between independent artists and the public. As an independent artist who has used the Internet to circumvent the established music industry, what do you think of the possibilities here?
Well, I'm certainly in favor of anything that helps connect musicians and fans, and I'd have to know more details on exactly what they mean by that. I think there's a danger on the creative side in thinking too much about what you're doing and whether it's going to please people. The stuff that always works best for me is the stuff that's honest and true and personal. If you write from a place of wanting to please the most people, I don't know if that's the most satisfying way to approach a creative career -- especially if you're an independent artist and just getting started.
But you know, honestly, more and more we're seeing the need for new kinds of filters. We used to have radios to tell us what to listen to, but that doesn't work so well anymore. And music has become -- culture itself has become so fragmented and compartmentalized that it's more and more important for us to have ways of finding music that we like. To the extent that a technology like this can steer fans toward artists and vice versa, I think that's a good thing and follows along with the trends of sites like Pandora and Last.fm and the Apple Genius engine that say, "If you like this, you might also like this."
Some people overreact a bit about computers getting involved in our music -- even fretting over the possibility of computers composing music as if it hasn't been happening since the mid-1950s. What are your thoughts on a possible future where AIs compose more and more of our music?
You know, I think it's going to happen, and I don't think it's a bad thing at all. Probably it will be pretty great -- or it will be as bad as any human composer. I think that, again, once you cross that threshold and have a machine intelligence that basically functions like a human intelligence, what's the difference? I mean, the answer is in the question. By definition it's a humanlike intelligence. So I don't think there's any reason I wouldn't enjoy something written by a machine, if it were a good song.
I've recently become very interested in all sorts of electronic gizmos and gadgets and composition and performance tools because you can only do so much with a guitar. And I love to play the guitar. I love to listen to the guitar, but there's really something satisfying about putting it down and picking up a ridiculous piece of equipment with a lot of buttons that's going to make a lot of noise and also inject a lot of chaos and randomness into what happens. The Zendrum in particular, when I play that, it's always a little bit different. That's because I make mistakes and some of the buttons go off by themselves, but you can feel the audience getting sort of excited when that happens. That's what live performances are about: that process by which you accidentally find something awesome. So for me that's what I love about those devices and that's what I love about technology and music: the potential to sort of shake things up and bring you to places you wouldn't otherwise get to.
Any other music tech you'd like to get your hands on? Are you familiar with the Reactable?
Yes, absolutely. I know exactly what you're talking about -- and true, I covet that. And I've been reading up a lot about the Monome, which is a grid of buttons that light up and send messages to a computer. But you watch people play and it looks like a magic trick. I covet those skills. I don't know what they're doing or how they're doing it but it sounds awesome. And they clearly have a mastery of it. It's just thrilling to watch.
The last couple of weeks I've been reading about the Eigenharp. It looks like a bassoon, and it has a thousand buttons on it. It costs like 6 or 7 grand, so I'm not going to be buying one anytime soon, but part of the manifesto behind it is that electronic instruments can and should be taken seriously in terms of something you really need to learn how to play and something you can get very good at playing. So all the buttons are touch-sensitive to these waves, and it's designed to be a very expressive kind of electronic instrument, which I think is not always something you get with electronic music.
And there you have it: my interview with Jonathan Coulton.
* For a non-technical example, see JoCo's song "Code Monkey."