When we think of the arts, words like tradition, culture, inspiration, feeling and talent come to mind, all of them tied to the human condition. For Human-Computer Interaction (HCI) researchers, however, the performing arts are also a means to learn, evolve, express oneself and, not least, communicate with an artificially intelligent entity, whether a robot or a software agent.
Musical Improvising Collaborative Agent (MUSICA)
The Defense Advanced Research Projects Agency (DARPA), an American research body with over 50 years of investment in breakthrough technologies and a mission to anticipate and meet new challenges, has recently launched the Communicating with Computers (CwC) program, which takes the communication task to the next level: its goal is to share complex ideas in collaborative contexts. One of the most difficult use cases of the CwC program targets collaborative composition, aiming to have humans and machines work together toward a creative product.
The MUSICA project, part of the CwC program and carried out by the School of Art and Design at the University of Illinois, is dedicated to designing a computer system that communicates with humans through jazz improvisation. Ben Grosser, a professor in the School of Art and Design, said of the MUSICA project:
“It’s approaching this communication from a different perspective than linguistically, and maybe we can learn something new about computers and humans. We proposed that music improvisation is a novel space for investigating communication because it’s not just cognitive interaction. It’s also an embodied experience. You feel the music as much as hear it and think about it. You react.”
To build a software system that might be able to improvise music, Grosser and his team will create a knowledge database of canonical jazz solos. Their approach is to computationally analyze these solos using "image schemas," recurring spatial patterns through which people understand their world. For example, a musician might play "inside," meaning their notes fit within what the song's chord changes suggest, or "outside," meaning they are pushing the boundaries of what is appropriate. The team will then add a performance system that analyzes what a human performer is playing in real time, including the beat, pitch, harmony and rhythm, and draws on what it has learned about jazz solos to respond musically.
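The "inside"/"outside" distinction can be illustrated in code. The sketch below is purely hypothetical, not the MUSICA implementation: it labels a played note as inside or outside a chord by checking its pitch class (0 = C through 11 = B) against a small hand-written table of chord tones.

```python
# Hypothetical sketch, not the MUSICA system: classify a note as
# "inside" (a chord tone) or "outside" a given chord, using pitch
# classes where 0 = C, 1 = C#, ..., 11 = B.

CHORD_TONES = {
    "Cmaj7": {0, 4, 7, 11},  # C E G B
    "Dm7":   {2, 5, 9, 0},   # D F A C
    "G7":    {7, 11, 2, 5},  # G B D F
}

def classify_note(midi_pitch: int, chord: str) -> str:
    """Label a MIDI note number as inside or outside the chord."""
    pitch_class = midi_pitch % 12
    return "inside" if pitch_class in CHORD_TONES[chord] else "outside"

# Over a G7 chord: B (MIDI 71) is a chord tone, Ab (MIDI 68) is not.
print(classify_note(71, "G7"))  # inside
print(classify_note(68, "G7"))  # outside
```

A real system would of course need a far richer harmonic model (scales, tensions, voice leading) and would track the chord changes over time, but the same pitch-class comparison sits at the core of that analysis.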
What to Expect from the Jazz-Playing Robot in the Near Future
The MUSICA project builds on previous research by Grosser and his team, which includes an interactive robotic painting machine. The jazz robot offers an unconventional answer to the problem of symmetric communication between humans and computers, but the research hinges on several hard problems: understanding the music currently being played, planning and creating what should come next, and responding in time. Grosser hopes to have a system capable of a sophisticated "call and answer" musical exchange within a year, with the ultimate goal that the system's answer be perceived as musical communication by a human performer.
Although the project does not aim to produce a virtuoso jazz robot, it adds new dimensions both to how we think about the arts and to the design of the artificially intelligent systems of the future.
Top image: Teo, by Italian robotics company Teotronica, is a robot that plays the piano using two hands and 19 fingers. Source: Laughing Squid