Researchers Develop an AI Musician

Shimon is a robotic musician that uses deep learning to hit all the right notes. Learn about the research behind it!


A couple of years ago, researchers from the U.S. Defense Department developed a robotic jazz player. The machine was trained on vast scores of music to gain a sense of which notes typically follow others, so that it could create reasonable tunes of its own accord.

It’s a theme that has since been taken up by researchers from the Georgia Institute of Technology, who have built an AI capable of composing passable tunes. As before, the algorithm was trained on around 5,000 songs, which collectively contained around 2 million motifs, riffs, and licks of music.
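For readers curious about what "learning which notes follow others" can look like in practice, below is a minimal, hypothetical sketch of a next-note prediction model in PyTorch. It is not Shimon's actual architecture; the note vocabulary, layer sizes, and toy training data are all assumptions made purely for illustration.

```python
# A minimal sketch of the next-note prediction idea described above.
# This is NOT Shimon's real model; it is a hypothetical illustration
# using PyTorch, with toy data standing in for the 5,000-song corpus.
import torch
import torch.nn as nn

VOCAB_SIZE = 128   # MIDI pitch range used as a toy "note vocabulary"
EMBED_DIM = 64
HIDDEN_DIM = 128

class NextNoteModel(nn.Module):
    """Predicts a distribution over the next note given the notes so far."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.lstm = nn.LSTM(EMBED_DIM, HIDDEN_DIM, batch_first=True)
        self.head = nn.Linear(HIDDEN_DIM, VOCAB_SIZE)

    def forward(self, notes):
        # notes: (batch, seq_len) integer note tokens
        x = self.embed(notes)
        out, _ = self.lstm(x)
        return self.head(out)  # logits for the next note at each position

# Toy training loop: a random sequence of note tokens stands in for a real song.
model = NextNoteModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

fake_song = torch.randint(0, VOCAB_SIZE, (1, 32))      # placeholder melody
inputs, targets = fake_song[:, :-1], fake_song[:, 1:]  # shift by one note

for _ in range(10):
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, VOCAB_SIZE), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```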

The musician, called Shimon, created compositions roughly 30 seconds in length (which you can listen to at the bottom of this post).

“Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,” the researchers say. “Shimon’s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.”
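To give a rough sense of that "seed a few measures, then compose" workflow, the sketch below continues the hypothetical NextNoteModel from the earlier snippet: a short seed sequence stands in for the four provided measures, and the model samples one note at a time to extend it. The sampling strategy and temperature parameter are illustrative assumptions, not details of Shimon's system.

```python
# Continues the hypothetical NextNoteModel sketch above (not Shimon's real code).
# A short seed -- analogous to the "four measures" the researchers provide --
# is extended one sampled note at a time.
import torch

def compose(model, seed_notes, length=64, temperature=1.0):
    """Autoregressively sample `length` new notes after the seed."""
    model.eval()
    notes = list(seed_notes)
    with torch.no_grad():
        for _ in range(length):
            context = torch.tensor(notes).unsqueeze(0)    # (1, seq_len)
            logits = model(context)[0, -1] / temperature  # logits for the next note
            probs = torch.softmax(logits, dim=-1)
            next_note = torch.multinomial(probs, 1).item()
            notes.append(next_note)
    return notes

# Example: seed with a handful of MIDI pitches and generate a continuation.
seed = [60, 62, 64, 65]       # C, D, E, F as a stand-in for the seed measures
piece = compose(model, seed)  # `model` is the NextNoteModel trained above
```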

Hitting the Right Note

The team explains that Shimon can play chords and harmonies, and that it focuses less on the next individual note and more on the overall structure of the music, much as humans do when they listen to it.

“When we play or listen to music, we don’t think about the next note and only that next note,” they say. “An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.”

By using deep learning, the team believes Shimon can create higher-quality music with greater structure and more coherent composition. Suffice it to say, these are still early days, but they believe that Shimon, and machines like it, could eventually create music that is genuinely novel and innovative.

Shimon will give a live performance at the Aspen Ideas Festival and is the latest work to come out of the Georgia Tech laboratory, which has previously developed projects such as a robotic prosthesis for drummers that attaches to the drummer's shoulder and responds to the player's movements.

While it's perhaps a long way from fully automated composition, it is nonetheless a clear sign of the progress being made, and it will be fascinating to see what innovations emerge from the Georgia Tech laboratory and from others working in the field.

Check out Shimon in the video below.


Topics: AI, machine learning, algorithms, music, deep learning

Published at DZone with permission of Adi Gaskell, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
