Robots have typically struggled with the kinds of creative tasks that separate man from machine. There are signs that things are changing, however. Earlier this year, researchers at Imperial University showcased a robot capable of painting.
The device uses software to track the user's eye movements and move a robotic arm accordingly. Suffice it to say, the device is not capable of painting independently, and is therefore more an aid to artists than a replacement for them.
Robotic jazz man
The same may not be true of a robot being developed by the US Defense Department, which is working on a machine capable of performing a trumpet solo after picking up cues from fellow (human) musicians.
“The goal of our research is to build a computer system and then hook it up to robots that can play instruments, and can play with human musicians in ways that we recognize as improvisational and adaptive,” the researchers say.
Being able to improvise as jazz players do is incredibly difficult, and has thus far been beyond the capabilities of artificial intelligence systems.
To improvise effectively, the machine will have to be capable of synthesizing a huge amount of data from the people around it, and do so in near real-time.
“We’re getting lots of video of musicians playing in front of a green screen together,” the researchers say. “We’re going to build a database of musical transcription: every Miles Davis solo and every Louis Armstrong solo we’re going to hand-curate. We’re going to develop machine learning techniques to analyze these solos and find deeper relationships between the notes and the harmonies, and that will inform the system – that’ll be the knowledge base.”
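In very simplified form, the kind of analysis the researchers describe — finding relationships between the notes in hand-curated transcriptions — could start with something as basic as counting which notes tend to follow which across the solos in the database. A minimal sketch (the note sequences here are invented for illustration, not real transcriptions):

```python
from collections import Counter, defaultdict

# Toy stand-ins for transcribed solos: each is a sequence of note names.
# A real system would hand-curate actual Miles Davis and Louis Armstrong solos.
solos = [
    ["C4", "E4", "G4", "Bb4", "G4", "E4"],
    ["C4", "E4", "G4", "A4", "G4", "E4", "C4"],
]

# Build a first-order transition model: for each note, count what follows it.
transitions = defaultdict(Counter)
for solo in solos:
    for current, following in zip(solo, solo[1:]):
        transitions[current][following] += 1

# The most common continuation after G4 across both solos:
print(transitions["G4"].most_common(1))  # → [('E4', 2)]
```

A real knowledge base would of course model harmonies, rhythm, and longer-range structure rather than single-note transitions, but the principle — distilling a corpus of solos into learned statistical relationships — is the same.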
The system will also include a microphone that picks up the music being played around it. This input is compared against a huge catalog of jazz solos to anticipate what should happen next.
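One simple way such a comparison could yield a prediction — purely an illustrative sketch, not the project's actual method — is to find the longest recent fragment of the live input that also appears in a stored solo, and suggest whatever followed it there:

```python
# Invented toy catalog of transcribed solos (note names for illustration).
catalog = [
    ["C4", "E4", "G4", "Bb4", "G4", "E4"],
    ["D4", "F4", "A4", "C5", "A4", "F4"],
]

def suggest_next(live_notes, catalog):
    """Find the longest suffix of the live input that occurs inside a
    catalog solo, and return the note that followed it there."""
    for length in range(len(live_notes), 0, -1):
        fragment = live_notes[-length:]
        for solo in catalog:
            for i in range(len(solo) - length):
                if solo[i:i + length] == fragment:
                    return solo[i + length]
    return None  # no match found anywhere in the catalog

# The tail of this live phrase matches the first catalog solo,
# which continued Bb4 with G4:
print(suggest_next(["G4", "Bb4"], catalog))  # → G4
```

The real system would work from audio rather than symbolic notes, and would need to respond within the few hundred milliseconds a human player gets, which is where the near real-time requirement bites.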
It’s noticeable that many of the automated systems we currently use are hidden away from view, or else operate in industrial settings where looks don’t matter.
When the human-robot relationship is slightly more personal, however, these things matter. The researchers believe, for instance, that the sound is already good enough to fool most listeners, provided they don’t know it comes from a machine.
A team from the University of Lincoln suggested recently that robots would be more effective if they had some rather human flaws programmed into them, and maybe that is something to consider.
Suffice it to say, however, that while the approach still has some evident flaws and many areas to improve upon, those improvements are likely to be made fairly quickly.
“The way that musicians learn how to play jazz, they mimic the style and the impressions – learning the style is sort of the first way into the music – and they have a knowledge base of solos they can play backwards and forwards,” the team say.
“It’s a stored warehouse of information,” they continue. “What creative jazz players do over time is synthesize this database of everything they know from all of their heroes – Charlie Parker and John Coltrane and so on – and they’re playing something that comes from a tradition but synthesizes all these different expressions. There’s a process there that I think we could actually model. At what point does this thing stop sounding like Miles Davis and start producing something that sounds new?
“When we hear that something has emotion, there may be a sort of ineffable quality that a musician has on the stage. I think the more interesting question is what the gap looks like between a machine being creative and a human being creative.”