Is it possible, with the help of AI, for humans to create a new musical experience? That was our question in 2018, when we started our research on music making and AI. Our hypothesis was simple: a deep-learning algorithm perceives patterns differently than humans do, so it should be able to come up with musical genres the human mind has never conceived.

At the time we were unhappy with some well-known musical AI projects that featured AI on albums or used it to mimic famous artists. Most of those ended up being heavily edited by human producers, or only tiny generated snippets were cherry-picked.
We felt that the creative capabilities of AI were misunderstood or not respected. Denying its intelligence means failing to appreciate the independent creation of modern AI, which goes well beyond simple mechanical devices running simple mathematical algorithms. Is AI bound to be a tool driven by capitalist motives instead of artistic potential?

In collaboration with Arran Lyon, Mrinalini Luthra and Valentin Vogelmann (at the time MSc students at the UvA), we developed a deep-learning algorithm that is musician, conductor and instrument at once.
This was easier said than done, because… when do you call something music? How do you explain to an AI what is pleasing to the human ear? When is music original and innovative? To tackle some of these philosophical issues, the team compared music to language and taught the AI to dissect musical data as if it were sentences, built from words and grammar.

This led to the birth of vNine, a deep-learning algorithm based on an artificial neural network (ANN) that can make musical compositions with or without humans. It composes for five instruments simultaneously, in harmony.
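The music-as-language idea above can be sketched in a few lines. This is an illustrative toy, not vNine's actual code: note events become word-like tokens, and counting which tokens follow which is a stand-in for the learned "grammar".

```python
# Illustrative sketch (not vNine's implementation): treat a melody as a
# "sentence" whose "words" are (pitch, duration) tokens, the way a
# language model treats text.

from collections import Counter

def tokenize(notes):
    """Turn (pitch, duration) note events into word-like string tokens."""
    return [f"{pitch}:{dur}" for pitch, dur in notes]

def bigram_counts(tokens):
    """Count adjacent token pairs -- a toy stand-in for musical 'grammar'."""
    return Counter(zip(tokens, tokens[1:]))

melody = [(60, 1.0), (62, 0.5), (64, 0.5), (60, 1.0), (62, 0.5)]
tokens = tokenize(melody)          # e.g. '60:1.0', '62:0.5', ...
grammar = bigram_counts(tokens)    # which token tends to follow which
```

A real system would feed such token sequences to a neural network rather than count bigrams, but the framing is the same: music as sentences, notes as words.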

︎︎︎vNine interface designed by Arran Lyon

The possibility of creating new musical genres has been a symbolic thread throughout our research. To us, AI is a portal to a future that has yet to be explored and accepted. A deep-learning algorithm creates its own "recognition patterns" based on the data it is given. It is modeled on the human mind, but works in different computational ways.

This tool can therefore reach places the human brain cannot.
We need to cherish this technology and welcome it, understanding that humankind has to give up its self-assigned "top position" in this universe. It is time we move on from the discussion about control.

After vNine’s release, we developed a series of concerts called The Promethean Promise. It was named after Prometheus, the Titan of Greek myth who stole fire from Olympus and gave it to humanity, sparking civilization. He was severely punished for it, as the gods believed that making humans more intelligent would only result in their own extinction.

During our decentralized concert, we sacrifice our control and symbolically hand our fire to the performing AI. vNine becomes the center point of the performance, and while it makes music, human performers are present to assist it by adjusting tones and timbres. vNine plays four synthesizers, one drum computer and one light synth simultaneously. To translate the performance of a software entity into something tangible, we built large circles of light around the instruments; with every performed note, the circles light up. In the center of the floor is a projection of vNine’s neural network. Its brain is the centerpiece, with the humans and the audience revolving around it like asteroids.

︎︎︎Non-human aesthetic value of output of vNine discovered by Valentin Vogelmann

Wanna collaborate, play around with vNine, or talk about it? E-mail us.