

Machine Theatre

  • On February 14, 2018
  • https://bit.ly/3Exim2H

Seen above is my favourite piece of installation art, by roboticist Arthur Ganson.

A chair rests peacefully centre stage. A seemingly cumbersome machine enters. It moves towards the chair, slowly but gracefully, almost floating. The machine swoops up the chair. It performs a skilful lifting movement, making the chair rotate and levitate in mid-air. It then puts the chair back down. The whole process lasts all of a minute. Simple but honest.

It was theatre.

It was Machine Theatre.

────────────────────────────────────

I am glad to have discovered this concept now, during the early development of my thesis work. Theatre art has always fascinated me, and it is where the majority of my training comes from. Combined now with robotic studies during the MArch, that training gives me confidence that unique concepts will emerge in my designs.

My current research interest lies in examining new relationships between machine and human, outside of the traditional master-servant arrangement.

In the past, limited machine intelligence and insufficient human-machine interaction often placed machines in less autonomous roles, such as that of an assistant, an appliance, or a servant (Dautenhahn et al., 2005). Now, with the emergence of technologies like deep learning, we have entered an era that demands more imagination. Given sufficient training, machines have rapidly improved at predicting patterns and learning more sophisticated features, abilities that have proven useful in tasks such as image recognition and speech generation. Creative use of these tools, overlaid with suitable narratives, might generate impressive artistic visions that were not possible before.

────────────────────────────────────

Preliminary Proposal:

I hope to examine contemporary ways of collecting data for AI generation, ultimately forming a commentary on the fact that human nature lies at the root of AI. In light of this, we might reconsider popular concepts like the technological singularity. A number of things could cause more immediate harm than artificial super-intelligence, such as biased data resulting in unfairness, or malicious use enabling crime.

I propose to conduct a social experiment utilising contemporary AI technologies, learning from the human behaviour of forming first impressions. I would like to collect portrait photographs and invite public participants to anonymously caption them with whatever labels they see fit. Once enough data has accumulated, I will pair each portrait with its most-used labels to produce a training set, which will be fed periodically into a convolutional neural network for feature learning. With sufficient training time and fine-tuning iterations, the neural net should then be capable of producing predictive labels for portraits it captures in real time.
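A rough sketch of that training step might look like the following. Python with PyTorch and torchvision is simply an assumed toolset for this post, and the folder layout, image size, network shape and file names are placeholders rather than a fixed design:

    # Minimal training sketch. Assumes captioned portraits are sorted into
    # data/portraits/<top_label>/*.jpg (a placeholder layout, one folder per label).
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Normalise every portrait to a small fixed size so the classifier stays light.
    preprocess = transforms.Compose([
        transforms.Resize((128, 128)),
        transforms.ToTensor(),
    ])

    dataset = datasets.ImageFolder("data/portraits", transform=preprocess)
    loader = DataLoader(dataset, batch_size=32, shuffle=True)

    # A deliberately small network: two conv/pool stages, then a classifier over
    # however many labels the public participants have contributed so far.
    num_labels = len(dataset.classes)
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 32 * 32, num_labels),   # 128 px halved twice gives 32x32 feature maps
    )

    criterion = nn.CrossEntropyLoss()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

    # "Fed periodically": re-run this loop whenever a new batch of captions arrives.
    for epoch in range(10):
        for images, targets in loader:
            optimiser.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimiser.step()
        print(f"epoch {epoch}: loss {loss.item():.3f}")

    # Save the whole model plus the label order for the live installation step.
    torch.save(model, "first_impression_cnn.pt")
    with open("labels.txt", "w") as f:
        f.write("\n".join(dataset.classes))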

I will then apply this AI capability to a piece of Machine Theatre. The installation will be a one-room composition that accommodates multiple viewers but engages with only one volunteer at a time. To initiate an interaction, a viewer has to take a seat at the centre of the room for the performance to unfold. The AI will then present them, to the best of its knowledge, with a predictive label that matches its first impression. Since this information is essentially predetermined by human-created labels that existed beforehand, the viewer faces the judgment not of an AI, but of a human collective.
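That moment of judgment could be prototyped roughly as below, again assuming the Python/PyTorch setup from the previous sketch, with OpenCV standing in for the camera. The camera index and file names are placeholders, and the seat trigger here is just a function call rather than a real sensor:

    # Minimal live-prediction sketch: one frame of the seated viewer, one label back.
    import cv2
    import torch
    from torchvision import transforms
    from PIL import Image

    labels = open("labels.txt").read().splitlines()   # label order saved during training
    model = torch.load("first_impression_cnn.pt")
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize((128, 128)),
        transforms.ToTensor(),
    ])

    camera = cv2.VideoCapture(0)   # placeholder index for the camera above the seat

    def first_impression():
        """Capture one frame of the seated viewer and return the predicted label."""
        ok, frame = camera.read()
        if not ok:
            return None
        # OpenCV returns BGR arrays; torchvision expects an RGB PIL image.
        portrait = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        with torch.no_grad():
            scores = model(preprocess(portrait).unsqueeze(0))
        return labels[scores.argmax(dim=1).item()]

    if __name__ == "__main__":
        # In the installation this would be triggered by the viewer taking the seat.
        print(first_impression())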

For the current development of the hardware, I propose sand drawing as the machine's gesture. I find that the transient nature of sand suits the type of interaction I'm aiming for. The mechanical design of an XY plotter is referenced for its smooth movement and positional precision.

────────────────────────────────────

Goals for the week:

  • Complete the hardware construction of an XY plotter (Fri & Sat)
  • Flash grbl onto an Arduino and control the plotter in real time, as sketched after this list (Sat)
  • Design and build the tilt-camera stand and the sandbox (Mon & Tues)
  • Generate gcode and experiment with writing on sand (Mon & Tues)
  • Finish first draft of design drawings in Rhino (Tues)
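For the grbl item above, a minimal sender could look like the sketch below: open the Arduino's serial port at grbl's default 115200 baud, wake the firmware, then stream G-code one line at a time and wait for each ok/error reply. The port name and the sample strokes are placeholders, and turning an actual label into plotter paths for the sand is a separate step not shown here:

    # Minimal G-code streaming sketch for a grbl-flashed Arduino (uses pyserial).
    import time
    import serial

    PORT = "/dev/ttyUSB0"          # placeholder: adjust to the Arduino's serial port
    gcode = [
        "G21",                      # millimetres
        "G90",                      # absolute positioning
        "G0 X0 Y0",                 # travel to the sandbox origin
        "G1 X40 Y0 F1000",          # drag the stylus through the sand
        "G1 X40 Y40 F1000",
        "G0 X0 Y0",
    ]

    with serial.Serial(PORT, 115200) as grbl:     # 115200 is grbl's default baud rate
        grbl.write(b"\r\n\r\n")                   # wake grbl up
        time.sleep(2)
        grbl.reset_input_buffer()                 # discard the start-up banner
        for line in gcode:
            grbl.write((line + "\n").encode())
            reply = grbl.readline().decode().strip()   # grbl answers "ok" or "error:..."
            print(line, "->", reply)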

────────────────────────────────────

Progress:

────────────────────────────────────

References:

Dautenhahn, K., Woods, S., Kaouri, C., Walters, M.L., Koay, K.L. & Werry, I. (2005). “What is a robot companion – friend, assistant or butler?”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1192–1197.
