Artificial Intelligence and the Future of Music

Matthew Montfort Recording on Scalloped Fretboard Guitar

Could A.I. Take Away the Magic of Making Music?

By Matthew Montfort

I was a serious science fiction fan when, at the age of 12, I began studying the guitar with an eye toward becoming a professional instrumentalist and composer. I remember wondering whether all of my guitar practice would be a smart investment as humans evolved. Back then, the future of human evolution with computers was conceived as one where our body size would shrink and our brains would grow. I imagined a future in which I could put on a thinking cap and the notes would pour forth, powered by brain waves and electronics. But I decided that even if that happened, there would still be interest in "old school" instruments like the electric guitar as an appreciation of art, and in any case my mind would be all the more ready to use the new technology.

So I plowed ahead with my music studies, concentrating on rock, Western classical, and jazz. I was interested in creating new music by fusing different forms of music. Being an idealist, I wanted to make the music that humanity would need if it were going to survive. Why stop at fusions of Western forms? I wanted to make music that showed how ideas from different cultures could work together. I called this cross-cultural music "world fusion music."

I moved to California to study at the Ali Akbar College of Music, where my world fusion band Ancient Future was born. At that time, it was difficult to find musicians who could play the music I wrote, with all of its complicated rhythms from around the world. So for my M.A. thesis, I wrote a training manual entitled "Ancient Traditions – Future Possibilities: Rhythmic Training Through the Traditions of Africa, Bali, and India." The book has three chapters on traditional music and a "Future Possibilities" chapter urging the reader to experiment with fusing traditions. The idea was to facilitate the creation of new music by bringing the knowledge of the world's great rhythmic traditions to more musicians.

After releasing two acoustic world music albums with my band Ancient Future, I became an early adopter of digital music technology, embracing MIDI recording as a powerful tool to test composition ideas. I believed in replacing those MIDI tracks with musicians performing on real instruments for the final recording, but I was excited by the ability to hear my ideas played on an approximation of the instrument I was writing for.

But as I saw the effects of this technology in use, I started to see hidden downsides. Looping and samples have shaped music so that humans create what the tools make easy to create, rather than what is in their minds. The technology was coloring the music, and not always for the better.

A reference to my book "Ancient Traditions – Future Possibilities" recently showed up in an email to me from an academic search engine. When I checked to see what that was all about, I found that the Balinese chapter of my book had been cited in a 2005 academic paper by Godfried Toussaint, who was using the ancient Euclidean algorithm to generate traditional rhythms. To apply the algorithm to rhythm, the structure of the Euclidean algorithm, which computes the greatest common divisor of two numbers, is used to distribute a given number of beats among a given number of pulses so that the beats are spread as evenly as possible around the cycle. According to the paper, this algorithm can generate most of the important traditional world music rhythms, with the exception of certain Indian rhythms.
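
For the technically curious, the even-distribution idea is simple enough to sketch in a few lines of code. The Python example below is a minimal illustration of the principle, not Toussaint's own implementation: an accumulator spreads a chosen number of beats as evenly as possible across a cycle of pulses, and the resulting patterns match the Euclidean rhythms in the paper up to a rotation of the starting point.

    def euclidean_rhythm(beats, pulses):
        """Spread `beats` onsets as evenly as possible over `pulses` steps.

        A simple accumulator method: each step adds `beats` to a running
        total, and an onset fires whenever the total wraps past `pulses`.
        The output matches the Euclidean rhythms in Toussaint's paper up
        to a rotation of the starting point.
        """
        if not 0 < beats <= pulses:
            raise ValueError("need 0 < beats <= pulses")
        pattern = []
        accumulator = 0
        for _ in range(pulses):
            accumulator += beats
            if accumulator >= pulses:   # wrapped around the cycle: onset
                accumulator -= pulses
                pattern.append("x")
            else:                       # no wrap: silent pulse
                pattern.append(".")
        return "".join(pattern)

    # E(3, 8) yields a rotation of the Cuban tresillo (x..x..x.):
    print(euclidean_rhythm(3, 8))    # ..x..x.x
    # E(5, 16) yields a rotation of the bossa nova bell pattern:
    print(euclidean_rhythm(5, 16))   # ...x..x..x..x..x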

The YouTube video below demonstrates the use of the Euclidean algorithm in electronic music. Rather than coming first from the composer's mind, this music is the result of turning dials on a sequencer that is set up to generate Euclidean rhythms automatically. This puts the machine at the forefront of the creation of music, the exact opposite of my intention when I wrote my book, which was to place the deep source material from the world's greatest traditions into the minds of musicians so that they could use that knowledge to express their emotions. Here, the musician lets the machine create the music while playing with the dials. This is not human expression in the truest sense. It's goofing around with a machine.

Just a few weeks before learning of the citation of my book in the research applying the Euclidean algorithm to rhythm, I ran across a quote from David Cope, who was a professor of composition at the University of California, Santa Cruz. He experienced writer's block midway through his career, and developed a computer program that analyzed the music he had composed in the past and suggested new ideas in his style. It turned out that the algorithm could analyze any composer's music and create new music in that style, and he spent much of his later career working on his Experiments in Musical Intelligence (E.M.I.) software.
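
Cope's software is far more elaborate than anything that fits in an article, but the core idea, learning the patterns in a body of existing music and generating new sequences that follow those patterns, can be illustrated with a toy example. The Python sketch below is purely illustrative and is not Cope's actual method: it trains a simple Markov chain on a made-up two-phrase corpus and walks the learned transitions to produce a new melody in the same "style."

    import random
    from collections import defaultdict

    def train_markov(melodies, order=2):
        """Record which note tends to follow each context of `order` notes."""
        transitions = defaultdict(list)
        for melody in melodies:
            for i in range(len(melody) - order):
                context = tuple(melody[i:i + order])
                transitions[context].append(melody[i + order])
        return transitions

    def generate(transitions, order=2, length=12, seed=None):
        """Generate a new melody by walking the learned transitions."""
        rng = random.Random(seed)
        melody = list(rng.choice(list(transitions)))
        while len(melody) < length:
            choices = transitions.get(tuple(melody[-order:]))
            if not choices:  # dead end: restart from a random context
                melody.extend(rng.choice(list(transitions)))
                continue
            melody.append(rng.choice(choices))
        return melody[:length]

    # A made-up corpus standing in for "music the composer already wrote."
    corpus = [["C", "D", "E", "G", "E", "D", "C"],
              ["E", "G", "A", "G", "E", "D", "C"]]
    model = train_markov(corpus, order=2)
    print(" ".join(generate(model, order=2, length=12, seed=7)))

Cope's E.M.I. worked at a much deeper structural level than note-to-note statistics, but the principle of recombining patterns learned from a corpus is the same.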

David Cope was quoted in recent articles because of new programs based on his work that can create film and video scores automatically, putting human composers out of business. For example, Amper has been marketed as an A.I. music creation program for content creators. Jukedeck is a company that creates and sells computer-generated music, charging as little as 99 cents a track for a small business. Jukedeck enables users to choose the length of a piece of music, its style, the instrumentation, and climactic moments. It is possible that these new composition programs will receive input from the traditional rhythms in my book. But I wrote the book in order to train musicians, not computers. The idea was to expand the possibilities for human expression while showing how ideas from different cultures can work together, not to take humans out of the equation altogether.

After the release of Jukedeck, David Cope, the professor of composition who taught so many composers their craft, unapologetically stated in an interview on NPR's All Tech Considered that those who write soundtracks and jingles for a living should look for another job: "It's going to go that way eventually," he says. "It may be 20 years from now, it may be 50 years from now, it may be two years from now. But, no matter when it is, it's going to happen. Period." All this because he wanted to find a way around writer's block.

So how do David Cope's composition students feel about this turn of events? I asked Mariah Parker, a composer and multi-instrumentalist with two releases on Ancient-Future.Com Records who studied composition with David Cope at UC Santa Cruz in the 1980s, around the time he was developing his composing program. "I don't recall David Cope being overly focused on his work with A.I. in his classes, and I was much more interested in his own compositions, including his work on 'The Way,' a piece inspired by his love of Canyon de Chelly and Anasazi art which involved the construction of unique instruments. At the time I had mixed feelings about E.M.I., as it felt like something of a dry and soulless academic endeavor. Looking back, his seminal role in the evolution of A.I. and music is undeniable, but it certainly raises fundamental questions about what constitutes creativity and soul. It also raises the question of where we go from here and how we make intelligent use of emerging technologies. If we value human creativity and genuine artistry, we need to be wise with technology. If we farm out more and more of the role of music composition to A.I. creations such as David Cope's E.M.I. and his subsequent 'Emily Howell,' we impact musicians' ability to support themselves."

While David Cope created his composition program to aid in composing classical music, no genre is off limits. In December 2017, "Hello World," the first pop album created with the help of artificial intelligence, was released. To me, the use of A.I. in the creation of this album shows just how formulaic pop music already is. Making pop music even more formulaic with the help of computers would only escalate a sad trend.

Instead of using technology to enable our minds to freely create music and hear it instantly (as I imagined as a 12-year-old Star Trek fan), we are going in the direction of eliminating the human element altogether.

I maintain that artificial intelligence should absolutely not be used to replace the human element in the creation of music. Doing so misses the entire point of the art of music, which is to communicate deep human emotions. If we use A.I., it should be to enable humans to spend more time creating art by automating menial tasks.

Indeed, in the information age, human body size seems to be growing while brains atrophy, the opposite of what I imagined the future would be like as a teenager. Now we are at a point where humans have access to very powerful technology but lack the tools to determine how to use that technology responsibly.

Humans have more technology than they can use appropriately. Before new technologies are released into the wild, humanity needs at least to have a conversation about whether the proposed use is desirable. We need regulations to enable that discussion, because simply letting the "free market" decide after tech companies "move fast and break things" is a recipe for disaster.

Ancient Future band leader and scalloped fretboard guitar pioneer Matthew Montfort has 40 years of experience in every aspect of the music business. He received widespread media coverage for his role as a class representative for independent musicians in the Napster court case. He is currently involved with the Artist Rights Alliance (formerly the Content Creators Coalition), a dedicated group of artists, creators, and stakeholders who are forming a coalition that will allow the people who create the content that powers the web to join together and exercise their collective voice. He has recorded with legendary world music figures such as tabla master Zakir Hussain. He has performed concerts worldwide, including the Festival Internacional de la Guitarra on the golden coast of Spain near Barcelona, the Mumbai Festival at the Gateway of India in Bombay, and on national shows such as the Rachel Maddow Show on MSNBC. He is recognized as one of the world's 100 Greatest Acoustic Guitarists by DigitalDreamDoor.com, a curated "best of" site, along with such luminaries as Michael Hedges, Leo Kottke, Chet Atkins, John Fahey, Adrian Legg, Merle Travis, John Renbourn, Tommy Emmanuel, and Doc Watson.
