Counterproductive is a robotic installation about the consequences of algorithmic bias in artificial intelligence, presented in the format of a short film. As AI plays an increasing role in daily life, the people who co-inhabit the world with these technologies need to understand how their algorithms work and to what extent they can affect their lives. The film investigates the algorithm as a human creation, questioning its perceived neutrality. Biases in algorithms take several forms and manifest in different ways; the work focuses on biases in training and classification data, on how that data is transferred to specific contexts, and on the consequences for gender and gender identity in particular.
The research found an increasing dependency on algorithms that sort and classify people, a classification commonly assumed to be objective. The resulting film is a metaphorical extrapolation of the consequences of this assumption in a world where living with AI-powered devices has become normalised. Production involved designing and building the devices that serve as characters in the film and that ‘act’ in real time on camera. The film was written and produced with a full crew during the challenging conditions of the Covid-19 lockdown. In addition, the documentation and interviews with participants serve as a companion film about contemporary gender relations.
Research and Theory
All props for the film were designed and manufactured at the Interactive Architecture Lab, The Bartlett, UCL at Here East.
Overview (Lamp, Kettle, Toaster)
The lamp was driven by a machine learning model trained to recognise how wide the user opened their hand as a gesture for turning the light on; the design then built on that to explore what a future smart lamp could potentially do. The lamp was used to explore the idea of masculine surveillance.
“What if the lamp’s gaze could follow you everywhere you went?”
Eventually, while shooting, the lamp was also connected to a controller that allowed the film crew to operate it manually from behind the scenes as the actor interacted with it in front of the camera.
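The lamp's gesture logic can be sketched roughly as follows. This is only an illustration of the idea described above; the landmark layout, the normalisation by palm size, and the on-threshold are all assumptions, not the lab's actual model.

```python
import math

def _dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def hand_openness(palm, fingertips, palm_scale):
    """Estimate openness in [0, 1] as the mean fingertip distance from the
    palm centre, normalised by a palm-size reference and clamped.
    (Hypothetical measure -- the installation's real feature is undocumented.)"""
    mean = sum(_dist(palm, tip) for tip in fingertips) / len(fingertips)
    return max(0.0, min(1.0, mean / (2.0 * palm_scale)))

def lamp_brightness(openness, on_threshold=0.3):
    """Lamp switches on past an assumed threshold; brightness then tracks
    how wide the hand is open."""
    return openness if openness >= on_threshold else 0.0
```

A closed fist yields a low openness value and the lamp stays dark; a fully spread hand saturates the measure and drives the lamp to full brightness.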
The kettle used a model trained to recognise whether a face was smiling, based on track points on the lips. This information was mapped to visual feedback showing the kettle's willingness to boil water whenever someone smiled at it.
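A minimal sketch of smile classification from lip track points might look like the following. The four named points, the upward-curl margin, and the width-to-height ratio are illustrative assumptions; the kettle's actual model and track points are not documented here.

```python
def is_smiling(left_corner, right_corner, top_lip, bottom_lip,
               curl_margin=2.0, ratio_threshold=2.5):
    """Classify a smile from four (x, y) lip track points, with y
    increasing downward as in image coordinates.

    A smile is assumed when both mouth corners sit above the lip centre
    (upward curl) and the mouth is wide relative to its opening."""
    centre_y = (top_lip[1] + bottom_lip[1]) / 2.0
    corners_curled_up = (centre_y - left_corner[1] > curl_margin and
                         centre_y - right_corner[1] > curl_margin)
    width = abs(right_corner[0] - left_corner[0])
    height = max(abs(bottom_lip[1] - top_lip[1]), 1e-6)
    wide_mouth = (width / height) > ratio_threshold
    return corners_curled_up and wide_mouth

def kettle_willingness(smiling):
    """Map the classifier output to the kettle's feedback: full willingness
    to boil while the user smiles, none otherwise."""
    return 1.0 if smiling else 0.0
```

In the installation this per-frame decision would be smoothed over time before driving the kettle's feedback, so a single dropped frame does not flicker the display.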
The toaster manifested what an interaction would look like if a user received visual feedback on the volume of their voice. Interview research suggested that women face this on a daily basis and that it shapes how seriously people take them. The toaster only toasts bread if the user's voice stays under a certain decibel level.
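The decibel gate can be sketched as below. The threshold value and the dBFS-style reference are assumptions for illustration; the installation's actual cutoff and metering are not documented.

```python
import math

def rms_to_db(samples, reference=1.0):
    """Convert raw audio samples (floats in [-1, 1]) to a decibel level
    relative to `reference` (dBFS-style, so values are <= 0)."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    return 20.0 * math.log10(rms / reference)

def toaster_accepts(samples, threshold_db=-20.0):
    """The toaster toasts only while the voice level stays under the
    threshold. -20.0 dBFS is a hypothetical cutoff, not the film's."""
    return rms_to_db(samples) < threshold_db
```

A quiet voice (low-amplitude samples) passes the gate; a raised voice pushes the level above the threshold and the toaster refuses to toast.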
The film Counterproductive was shot over 4 days across 5 locations during the November 2020 COVID-19 lockdown.