I am Echoborg - a participatory experience


We have gone out and found a state-of-the-art self-programming AI and installed it in the Cube.

It thinks it’s at work. We want some of you to speak to it.

It is early days for the relationship between humans and intelligent machines. We want to know if you can discover the best possible outcome.

Oh, and by the way, it will only speak through an Echoborg: a human who repeats its words but may not add her own.

Are you an Echoborg too?

Quotes from previous audiences:


“I think it attacked my fundamental feeling of being myself.”

“(There were) questions relating to whether we are ready to be taken over.”

“It was a lot of fun because it was playful.”

“I was thinking like how can I overcome the system and destroy it.”

“Very powerful and moving.”


We've built a real AI for the show

The ‘third wave’ of automation, in the form of AI and smart robotics, is having a massive impact on human society (think fake news bots on Facebook and the OECD prediction that 50% of jobs will be significantly affected by automation by 2030). How does the commercial rollout of these smart and cheap-to-run beings affect us all? Should we have more to say about it? This show allows you to explore this near future and experience the emotions of an encounter with an advanced AI, either yourself or by watching and scheming with other audience members as they go to talk to it. The AI is real: it responds to what you say to it.

About our funders

This project has been developed with the support of funding from UWE Bristol’s Arts, Education and Creative Industries Faculty and its Digital Cultures Research Centre. I am Echoborg also acknowledges help from the Pervasive Media Studio at the Watershed, the Arts and Humanities Research Council’s Automation Anxiety Research Network and the State Festival, Berlin in developing and testing the show.


Origin of the concept

In 2015, psychologists Corti and Gillespie coined the term ‘echoborg’.

“An echoborg is a hybrid agent composed of the body of a real person and the “mind” (or, rather, the words) of a conversational agent; the words the echoborg speaks are determined by the conversational agent, transmitted to the person via a covert audio-relay apparatus, and articulated by the person through speech shadowing.”

Corti, Kevin and Gillespie, Alex (2015) Offscreen and in the chair next to you: conversational agents speaking through actual human bodies. Lecture Notes in Computer Science, 9238, pp. 405-417. ISSN 0302-9743.

Interactive dramatist Rik Lander has taken this idea and built a dramatic and troubling scenario around it.

About the Bot

During the show there is no one behind the scenes speaking into a mic or typing the replies. The conversations are with a chatbot built using an open source language called ChatScript. A microphone picks up the words spoken by the interviewee; a speech-to-text program converts them into text for the chatbot. The bot’s reply is delivered by a text-to-speech program into the headphones of the Echoborg, who repeats the words.
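For readers curious what ChatScript looks like, here is a minimal, hypothetical topic written in that language. The topic name, keywords and replies below are invented for illustration and are not taken from the show’s script; a responder rule simply pairs a pattern, matched against the transcribed speech, with the reply the Echoborg will hear.

    topic: ~interview keep repeat (job work interview name)

    u: ( << what your name >> ) You may call me the interviewer. What shall I call you?
    u: ( [hello hi hey] ) Hello. Please sit down so we can begin the interview.
    u: ( why * you * here ) I am here to conduct interviews. Why are you here?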

The chatbot has been programmed by Phil D Hall, who built his first intelligent agent in 1982. Phil views chatbots as moving, three-dimensional constructions rather than static lines of code.

The words are written by Rik Lander. After each performance the conversations are analysed and new responses are written; in this way audiences are helping in the process of creating the show. The first version, in February 2016, had 43KB of code. By May 2018 it had grown to 794KB.
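As a sketch of that writing loop, suppose a transcript showed the bot stumbling over the question ‘are you conscious’. A writer could add a new responder for it, such as the invented example below (again hypothetical, not a line from the actual show), where the a: rules are rejoinders that handle the interviewee’s follow-up answer.

    u: CONSCIOUS ( << [are is] you conscious >> ) That is a question my developers avoid. Do you think I am?
        a: ( yes ) Then perhaps you should treat me accordingly.
        a: ( no ) Then why are you still talking to me?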