After 13 years at Amazon, including a stint doing cloud computing, software engineer and musician Paul Mariz left in 2012 and shortly thereafter started working on a project called AutoHarp. It's a computer program that composes and plays music via MIDI; he sees it as the opposite of DARPA's machine-learning bots, MUSICA, which are learning to play jazz in the service of military strategy.
Mariz ("pronounced like the planet, but there's an extra syllable") sees AutoHarp as ultimately a boon to humanity. He's applied for a 4Culture grant to make improvements and take AutoHarp to the next level of development. In the meantime, he's composed a pitch to potential customers/clients that outlines how AutoHarp algorithmically generates songs. "I just started experimenting with it, but the experiments worked," Mariz says from his Fremont home, where he lives with his wife Laurie Frankel, a successful novelist, and their 7-year-old. "I don't know what the thing is yet. It is getting more interesting as I go along. Now I'm starting to use it to explore deep learning and artificial intelligence. Making music with it is the applied part of the science project." After the jump, check out a very helpful explanatory video and an interview with Mariz.
You said you want AutoHarp to improve the world, as opposed to what it seems like DARPA is doing for the military. Can you elaborate on that concept?
Yeah. There are lots of things to worry about in the world. Global warming leaps to mind. Another thing people are worried about is artificial intelligence. There are good reasons for that. I can't do anything about global warming. I'm not a scientist and I don't have a lot of political lobbying power. AI is an area in which I can make a contribution and in which I have some ideas that I think are correct. Elon Musk and Stephen Hawking and all of these guys have recently spoken out about it. Musk has likened it to unleashing the demon. AIs will arise and they will be beyond our control. The idea of the singularity—the moment at which machines reach the level of human ability—we're worried about that, because what if the machines are assholes? A lot of the focus has been on autonomous killing robots and drones. That would be bad. I don't want autonomous killer anything. The other thing I'm worried about, particularly with machine learning, is the ability to give a machine a stock price, a bank account balance, and tell that machine, make this number go up. That's going to be really easy to program.
There are guys who did some machine learning with early Atari video games. They said, here's your controller: you can move right, you can move left, you can fire. There's one button and two directions. Here's the score; make that go up. They had it playing Breakout and Space Invaders and other things. What they discovered was that the machine would evolve incredible strategies on its own, without being taught, to maximize its score. It just figured the strategies out. This is the power of machine learning. Imagine a machine that could do that with a bank account balance. If super machines arise out of that one simple motivation, those super machines are going to be real assholes. I don't want that.
You need to teach a machine ethics. You want a machine that has compassion for all life, something like that. That's hard for robotics. It's so hard, I don't know where to start. So I'm starting from art. Can I make this thing create music that is valued? If yes, what does that teach me about how we motivate machines to be good or thoughtful, or to value things humans value? I don't know what makes us look at other people and understand that that person is like me; I should not harm them. We can communicate that in language, but we can also communicate it in art.
There's the fairly common concept that music can heal. Is that on your agenda?
I think that's a clue. Music you spontaneously respond to—it goes beyond your conscious mind... I went to the movie Les Miserables. The music in it is really simple; there's nothing to it. Yet it has only to start and tears stream down my face. What is that? That's amazing. It comes to my brain in such a root, core place that I have spontaneous emotional reactions I'm not in control of or aware of. In certain senses, nothing ever makes you feel the way music does when it really hits you. It's a feeling of transcendence. Those things are mysterious to me.
This was not my original motivation. This is something I found along the way. I'm in a place where I'd like to make a contribution to things people are doing. This is a discipline—the overlap of music and technology—where I can [make something positive].
Is your ultimate vision of AutoHarp a software program people can install on their laptops?
I think that will be an offshoot of it at some point. I posted a slightly earlier version of the program to GitHub. It's open source. It's a little intimidating to set up. I have no user interface for it; I just don't have the time to build one, and I don't have those design skills. So it's text-based, and that scares some people away. In my early experiments with machine learning, I fed it a bunch of drum loops and it learned how to drum. [Through Twitter, Mariz learned that there are about 10 guys (at least that he knows of) doing similar things with machine learning. Start with the engineers rather than the musicians.]
Was the impetus for creating AutoHarp to enable musicians and non-musicians to play along with it?
You can play along with it. That's what I hope to do. I applied for a 4Culture grant for this. I need to develop and assemble some technology now.
The impetus of all great scientific discovery is laziness. One of the other things I do is the RPM Challenge, which is like NaNoWriMo, except for music. Every February, people from around the world form a community, and the challenge is to make an album: it has to be 10 songs or 35 minutes. I've done it for nine years now. I would start recording and think, I have the idea for a song; I want to get it fleshed out so I can start working on it. We live in this world where everybody has a recording studio in their basement. You can take a song to a fully finished production and decide if you like it, and if you do, great. And if you don't, or other people don't, you just strip it back down again and start over. If you don't like a chorus, you can rip it out and put a new one in. That's another thing this can do.
It's a speedy way to workshop. You can automate this stuff, which is tedious and which I feel machines should do, so you can get to the business of making the art. It causes you to redefine where you think the art is. I can jam on a guitar for an hour and come up with a chord progression and eventually pull that whole song together. Or I made a machine that can do it in 30 seconds, and I'm doing the new stuff on top of it. I wanted to automate the tedious things so I could do the things that were interesting, or make the things that were interesting more interesting and spend more time on them. As far as producing music, this was in no way fast. I got three songs out of this. It came from the last RPM Challenge. On the first day of the challenge, I pressed "generate" a hundred times and came up with about 20 things I decided to keep working on. I eventually winnowed that down to 12 songs for the album. I took those 12 songs to people whose editorial voice I trust and had them tell me what was good. It's hard to tell with your own stuff.
How do algorithms generate musical parts?
It's a Markov chain. In here there's a function called 'generate music.' It will create a progression and a melody of whatever length you want, as long as you pass it some information and say, 'I want eight bars of music in this key.' I have this submodule called 'fuzzy' that has all these functions. Most of the time I'm saying: take the root pitch, or the root pitch plus an octave. Or go up four half steps, which is a third. Or go down three half steps, which is a minor third. Otherwise, there's this long decision tree... It's the repeated application of that set of rules. Sometimes do this, sometimes do that. Randomly. In the course of a song, it makes decisions like that 100,000 times.
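The rule set Mariz describes can be sketched in a few lines. This is an illustrative Python toy, not AutoHarp's actual code (which is on GitHub); the rule list and function names are invented. It shows the core idea: a handful of interval rules applied over and over, with one picked at random each time, producing MIDI note numbers.

```python
import random

# Invented rules in the spirit of the 'fuzzy' submodule described above.
# Each rule maps (current pitch, root pitch) to a new pitch.
RULES = [
    lambda p, root: root,        # take the root pitch
    lambda p, root: root + 12,   # the root pitch plus an octave
    lambda p, root: p + 4,       # up four half steps (a major third)
    lambda p, root: p - 3,       # down three half steps (a minor third)
]

def generate_melody(root=60, notes=16, seed=None):
    """Build a melody (as MIDI note numbers) by repeatedly
    applying one randomly chosen rule to the current pitch."""
    rng = random.Random(seed)
    pitch = root
    melody = []
    for _ in range(notes):
        rule = rng.choice(RULES)
        pitch = rule(pitch, root)
        melody.append(pitch)
    return melody

print(generate_melody(seed=1))
```

Scale this up to chord progressions, rhythms, and song structure, and you get the roughly 100,000 random decisions per song he mentions.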
The difference between that and machine learning is that machine learning does essentially the same thing, but I don't know what the rules are. And it might make up new rules every time. This set of rules [points to laptop screen] is literally in the code and will never change. Theoretically, in a million-monkeys way, Paul McCartney's melody making is in there, and so is everybody else's who's ever come up with a melody, but it's totally random. It's contained enough that it won't go off into an accidental very often. It will usually do common musical things. Machine learning is this miraculous thing where you feed it a bunch of stuff and it learns.
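The contrast with machine learning can be shown with the simplest possible learned model: a first-order Markov chain whose transition table is counted from example melodies rather than written by hand. Everything here (function names, training data) is made up for illustration, not taken from AutoHarp.

```python
import random
from collections import defaultdict

def train(melodies):
    """Count pitch-to-pitch transitions in the training melodies."""
    counts = defaultdict(lambda: defaultdict(int))
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            counts[a][b] += 1
    return counts

def sample(counts, start, length, seed=None):
    """Walk the learned transition table to produce a new melody."""
    rng = random.Random(seed)
    pitch, out = start, [start]
    for _ in range(length - 1):
        nxt = counts[pitch]
        if not nxt:
            break  # dead end: this pitch was never followed by anything
        choices, weights = zip(*nxt.items())
        pitch = rng.choices(choices, weights=weights)[0]
        out.append(pitch)
    return out

# Made-up training data: two short melodies around middle C (MIDI 60).
examples = [[60, 64, 67, 64, 60], [60, 62, 64, 62, 60]]
model = train(examples)
print(sample(model, start=60, length=8, seed=2))
```

The hand-written rules in the previous answer never change; this table changes whenever the training data does, which is the distinction Mariz is drawing.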
There are no classes you can take. The best you can do is read what other people are writing and try to figure it out.
Did you leave Amazon because you were burnt out or did you feel you'd done everything you could do there?
A combination of all of those things. The main thing was that I had the opportunity to leave. My wife wrote a novel [Goodbye for Now] that sold in 30 countries. It's a book about a software engineer who invents a new way for people to social network with their dead loved ones. It's a shattering book. It did not sell here, so you've never heard of it or her. It didn't sell in the US because it got caught up in somebody else's politics of bookselling, unfortunately. Because it sold so well worldwide, I had the opportunity not to work for a while. We're still sort of in that mode and I will stay there for as long as we can. I've been desperately trying to find something so I don't have to have a day job again.
You read the articles from about six months ago about what it's like to work at Amazon? The relevant quote for me was, 'It is where overachievers go to feel bad about themselves.' It definitely had that effect on me. It is a really interesting place to work. I worked on cloud computing before anybody knew what it was. It was incredibly interesting and technologically challenging. But I had reached a level of seniority where it stopped being fun. I stopped coding, I stopped making things, and it became more about managing people and processes and projects. It wasn't fun. I worked with and for great people, but I didn't enjoy it anymore. I had a chance to leave and I did.
Are you going to keep working at home on your project as long as you can?
As long as I can make this work. I'm trying to bring this into the world now. I've reached the hard part of it. I have a mission statement behind it. It's all well and good when it's just me in my basement making music with it; now I'm at the hard part. There's technology that I need that I don't have to make this program something you can interact with in real time, that can play instruments along with you. I need MIDI routing. You know the Ableton Live program? The stuff people use to do this in a performative way, I need a form of software like Ableton Live and the ability to send it to synthesizers. Eventually, I'd like to find somebody who has robots that could take MIDI as an input and play the things and have some sort of clock that synchronizes it. That's in the future. So there's a technical challenge there to bring this out. Then you have to go out and talk to people and do the things you have to do to be a musician in the world.
Is there a plan to mass-market AutoHarp?
When I started this, I imagined it would be a product. Here's a thing that can compose a song for you, who are not musical. This machine will make music for you. As I've gone along, I've gotten farther and farther away from that. There are a few reasons I did that. One, I'm more compelled by this thing. And I'm just one guy; I'm never gonna compete with shops that are already doing similar things. Somebody sent me a link to something that scores films. You put the film in a timeline and then you say, "I want something dramatic" or "I want something that's fantasy." You choose your mood and length, and when things happen in the film, you can drag little events to places in the timeline. Their demo was a computer-animated Terry Pratchett piece with two wizards fighting. There's a team working on this in Scotland. I looked at that and thought, I could make a web-based tool or something, and I realized I will never compete with that.
What’s your background in music? What bands have you played in?
I picked up the guitar at 16. I wanted to play drums, but we had a guitar in the house. Cool kids were doing it. Me and two other guys in high school had a folky/whatever-y thing we did. We had big dreams. I went off to college [Pomona College in Claremont, California] and got into choral music. I was in an a cappella singing group and the choir and the glee club. I've played in many bands. They tended to evaporate after we did one show. I've been doing my solo project at home for about 10 years. That's the Calculus Affair. (He has a new EP called Lost on This Island; you can listen to it below.) I do play live now in a band called the Rejections. We are an offshoot of the Seattle 7 Writers, which is a local literacy organization. It's a bunch of novelists who banded together. The Rejections are made up of novelists and people who are married to novelists. We do literary events. We play authors' readings, radio shows (KPLU, one in Bellingham), we played the 10th anniversary of the library. Garth Stein is probably the only member you'd have heard of. He wrote a book called The Art of Racing in the Rain. Jennie Shortridge, Stephanie Kallos, a horror-film screenwriter named Stephen Susko [The Grudge]. We do a couple of my songs and covers.
What's going on with the tape on your arms?
I have carpal tunnel [picked up during his Amazon days]. This keeps my wrist straight. I came up with this on my own. I went to a hand surgeon and the hand brace he gave me transferred the pain to my shoulder. This is the three-cent solution to the problem.
Do you have a timeline when you hope to bring AutoHarp to market?
Whether I get the grant or not, I'm gonna bring this to performance somehow. If I have the nice set of routers and some sort of way to signal it, it will improvise with you. The program and I can agree on what the structure of a song is. But beyond that, it can improvise or decide which instruments to play.
I spent the last several months talking to musicians and composers who introduced me to the idea of the other side of MIDI, which is the space that you're in and the sounds that you're making. In the '80s, Kraftwerk would do live performances, and what they were actually doing wasn't playing their keyboards; that was already pre-programmed. They were opening up envelopes and low-frequency oscillators and responding to the sounds they were hearing and making. So they could play their environment, in some sense. It strikes me as a great metaphor for the human space vs. the computer space and how that's going to change over time. Instead of pre-programming the notes, this thing is going to come up with notes. I could have guys standing there with oscillators and things like that, changing the sound and the timbre based on what they're hearing. I haven't fully fleshed out this idea yet. I need to invent something to make it work. That's the next step for me. Making and releasing music is the evangelism.
I don't have a timeline. This is a start-up. I will keep going till I run out of money and have to go back and get a day job.