Watch These Music Robots Play Miles Davis-Inspired Jazz

If you’ve ever wondered what robots playing (and dancing to) a Miles Davis-inspired composition would look and sound like, then you’re in for a treat.

Over the past few years there have been significant advances in AI (artificial intelligence) in the music technology world. Take a look at WaveDNA’s Liquid Rhythm or even Logic Pro X’s Drummer. But Georgia Tech Ph.D. student Mason Bretan has been dreaming up something altogether more tangible. There are two types of musically intelligent robots in this video: Shimon listens to and generates sounds, while three Shimis generate dance routines based on the music being played, and can play their own music too.

How technology like this will impact our music production software in the future is mind-blowing... imagine real-time accompaniment in [your DAW here] whilst jamming out ideas...

Here’s the video followed by the full description of the project from Georgia Tech College:


Jan 16, 2015 | Atlanta, GA–A Georgia Tech student has surrounded himself with a team of dancing robots and an improvising, marimba-playing bot to collaborate on an original, Miles Davis-inspired composition. Mason Bretan, a Ph.D. candidate in music technology, plays the drums, guitar and keyboard. A robot named Shimon listens to the sounds, then generates music on a marimba using its computational knowledge of jazz theory and improvisation.

At the same time, a trio of Shimi robots autonomously generates dance choreographies based on a joint analysis of the music and a self-awareness of their physical constraints and abilities. The Shimis also play their own complementing music, based on a combination of Bretan’s original compositions and improvisational algorithms.

The six-minute, high-energy funk piece is called “What You Say” and is based on Davis’ “What I Say.” It’s the latest project from the lab of Gil Weinberg, Bretan’s advisor and director of Georgia Tech’s Center for Music Technology.

Bretan created the composition after listening to Davis’ 1971 Live Evil album.

“The brilliance of the musicians on that album is an inspiration to me and my own musical and instrumental aspirations,” said Bretan. “They also set the standard for the level of musicianship that I hope machines will one day achieve. And through the power of artificial intelligence, signal processing and engineering, I firmly believe it is possible for machines to be artistic, creative and inspirational.”

The project was created over the span of several months at Georgia Tech. The Shimi robots analyze the music offline and generate a sequence of movements and musical phrases that can then be performed live. Shimon is given the chord progression prior to the performance, then figures out how to improvise with Bretan. The student will spend the next few months fine-tuning the process to allow real-time analysis and composing.

Source: Georgia Tech College

Rounik is the Executive Editor for Ask.Audio & the macProVideo Hub. As an Apple Certified Trainer for Logic (and a self-confessed Mac fanatic) he's taught teachers, professional musicians and hobbyists how to get the best out of Apple's creative software. He has been a visiting lecturer at Bath Spa University's Teacher training pro...
