This performance will explore how humans and robots interact when improvising music. The musical robots include PAM (a robotic string instrument made by EMMI) and a robotic percussion ensemble (made by the Music, Perception and Robotics Lab at WPI), both controlled by an artificial intelligence designed and programmed by HCL-sponsored artist Scott Barton. The human performers are Alex Temple (voice, synths), Ammie Brod (viola), and Matt Orenstein (double bass).

By situating these performers in various improvisatory contexts, the music will invite both the performers and the audience to ask questions about how we listen to and make music: How do we interpret ideas such as rhythms and melodies from a stream of individual sounds? Once we grasp a musical idea, what do we do next? Do we imitate it? Transform it? Do something entirely new? How do machines interpret and produce musical ideas in ways unique from their human counterparts? How do human performers react to such expressions? The interactions between the two groups will illuminate the musical territory that is uniquely human, the territory that is uniquely mechanical, and the territory shared between the two. Charting this territory opens possibilities for new and interesting kinds of music.