A.I. bots just dropped a black metal album that will make your head explode

We tend to think of robots as suited to repetitive, mundane mechanical tasks, and of creative endeavors like art and music as things only humans can do. Well, think again: With advancements in artificial intelligence, computers are now capable of making music that is difficult to distinguish from music made by people. The latest example is Coditany of Timeness, a black metal album made entirely by an artificial neural network.

As first reported by The Outline, Coditany of Timeness was created using deep learning software that, over a short period, was trained to analyze and reproduce the style of the music the researchers fed it. In this case, it was an album by the black metal band Krallice. This isn't the first experiment in A.I. music creation, but the project is particularly noteworthy because, unlike previous experiments that replicated classical music, black metal is "characterized by its ultra-long progressive sections, textural rhythms, deep screams, and melodic weaving over a grid of steady, aggressive rhythmic attacks," and has "extreme characteristics [that] make it an outlier in human music," wrote the project's creators, Zack Zukowski and CJ Carr, who go by the name Dadabots.


In short, this is music that isn't easy to recreate, yet the computer was able to produce something that sounds like it came from the band. According to The Outline's author, Jon Christian, "If I didn't know it was generated by an algorithm, I'm not sure I'd be able to tell the difference."

Here's how the album was created: The Krallice album Diotima was first separated into 3,200 eight-second segments of raw audio data. Most of the segments were used to train the algorithm on what the music sounded like, while the rest were used to test the software by making it guess what came next. Successful guesses strengthened the A.I.'s neural network, which operates similarly to the human brain; if a training run merely produced unintelligible noise, it was restarted. After three days and millions of repetitions, Zukowski and Carr ended up with 20 sequences, each four minutes in length.
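The preprocessing step described above, chopping an album into fixed-length windows and holding some consecutive pairs out as "guess what comes next" tests, can be sketched roughly in Python. The sample rate, segment length, and holdout scheme here are illustrative assumptions, not Dadabots' actual code:

```python
import numpy as np

SAMPLE_RATE = 44_100                      # assumed CD-quality mono audio
SEGMENT_SECONDS = 8                       # eight-second windows, as reported
SEGMENT_LEN = SAMPLE_RATE * SEGMENT_SECONDS

def segment_audio(waveform, segment_len=SEGMENT_LEN):
    """Chop raw audio into fixed-length segments, dropping any remainder."""
    n_segments = len(waveform) // segment_len
    return waveform[: n_segments * segment_len].reshape(n_segments, segment_len)

def train_test_pairs(segments, holdout_every=10):
    """Pair each segment with its successor; hold out every Nth pair for testing.

    The model trains on (context, continuation) pairs and is scored on how
    well it guesses the continuation for the held-out pairs.
    """
    train, test = [], []
    for i in range(len(segments) - 1):
        pair = (segments[i], segments[i + 1])   # (context, what comes next)
        (test if i % holdout_every == 0 else train).append(pair)
    return train, test
```

A full album at this sample rate yields a few thousand such segments, which is consistent with the 3,200 figure reported for Diotima.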

“Early in its training, the kinds of sounds it produces are very noisy and grotesque and textural,” Carr told The Outline. “As it improves its training, you start hearing elements of the original music it was trained on come through more and more.”

As if that weren't enough, the names of the songs and the title of the album were generated by a probabilistic model known as a Markov chain. Even the album cover artwork was created by an A.I. program.
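A Markov chain generates text by repeatedly sampling the next character based only on the few characters that came before it, with probabilities learned from example text. Here is a minimal character-level sketch; the training titles are hypothetical stand-ins, not the actual corpus Dadabots used:

```python
import random

def build_markov_chain(titles, order=2):
    """Map each `order`-character prefix to the characters observed after it."""
    chain = {}
    for title in titles:
        padded = "^" * order + title + "$"   # ^ marks start, $ marks end
        for i in range(len(padded) - order):
            prefix = padded[i:i + order]
            chain.setdefault(prefix, []).append(padded[i + order])
    return chain

def generate_title(chain, order=2, max_len=40, seed=None):
    """Walk the chain from the start marker, sampling one character at a time."""
    rng = random.Random(seed)
    prefix = "^" * order
    out = []
    while len(out) < max_len:
        nxt = rng.choice(chain[prefix])      # sample a character seen after this prefix
        if nxt == "$":                       # end-of-title marker reached
            break
        out.append(nxt)
        prefix = prefix[1:] + nxt            # slide the window forward
    return "".join(out)
```

Because the chain only remembers a two-character window, the output locally resembles the training titles while frequently recombining them into new words, which is why such models produce plausible but uncanny names.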

This project is just the latest in A.I. research around music training. Researchers at Birmingham City University in the U.K. are developing a neural network project that could predict what a piece of music might sound like if it had been created by an earlier artist, say Pink Floyd covering a Jay-Z tune. Sony's Computer Science Laboratory division created two Beatles-esque pop songs after its A.I. project learned various musical styles from a massive database. And the A.I. from Google's Magenta team created a 90-second musical piece all by itself, thanks to machine learning.

Coditany of Timeness is Dadabots' first album (you can listen to it on Bandcamp), and the results will be included in their research paper, "Generating Black Metal and Math Rock: Beyond Bach, Beethoven, and Beatles," which will be presented at the Neural Information Processing Systems conference.
