Making The Palimpsest
In the spring of 2016, we learned that our studio space and large portions of its neighborhood would be demolished for the construction of the High Speed Rail 2 (HS2) project. After hearing about the controversy surrounding the community engagement practices for the project, we researched how emerging technologies can improve participatory design.
The Palimpsest is our proposal to improve participatory urbanism. The project uses 3D scanning and virtual reality for two purposes. The first is to allow communities to capture 3D recordings of themselves, their community, and the architecture that is important to them in order to preserve it; these recordings also become a tool for sharing their stories during debates on the project. Second, The Palimpsest acts as a neutral platform where all of the parties involved in the project, from communities to government bodies to corporations, can put forward ideas and discuss them. In this sense, people can view government proposals in virtual reality, at 1:1 scale, and provide feedback. More details can be found on our project page, here.
This article discusses the making of The Palimpsest in detail. The following video gives an overview of the technology and processes we used to generate content and build the virtual reality experience. After the video, we present experiments, precursors, and further detail on our process.
In order to produce content for the VR experience, we deliberately used the most affordable and accessible technology available to us. This choice is rooted in our belief that, ultimately, technology should be used to increase access and understanding of planning processes, rather than widen the gulf between those able to produce persuasive media and those who are intended to merely receive it.
The least accessible element of our current proposal is the LiDAR scans of St James Gardens and Drummond Street. The equipment is prohibitively expensive, and we were privileged to be able to work with ScanLAB to produce our base scans. However, advances in Project Tango are making it an effective alternative to LiDAR scanning. As the software develops, we are confident that the project could be produced with scanners available on cell phones as early as this fall. The process of LiDAR scanning is shown in the video below. For a brief view of Tango scanning, please see the first video in this article.
A video showing the process of using a LiDAR scanner to capture St James Gardens.
In order to capture 3D videos, we used the Microsoft Kinect and software designed by Brekel (found here) to film people explaining how they or others will be impacted by the project. Each frame of the video is a 3D model. By playing these models in sequence, and pairing them with a spatialized sound engine, viewers can walk around the interviews as they play, viewing and listening to them from any angle they choose. We coupled each interview with the architectural or urban context it discusses. In some instances, we are inside an apartment that will be lost; in others, we see the trains arriving that someone believes will greatly improve their business. Please see the first video in this article for a demonstration of interview recordings.
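The playback mechanism described above can be sketched in a few lines. The code below is a conceptual Python illustration, not our Unity implementation, and the VolumetricClip name is our own, not Brekel’s: each captured Kinect frame is a mesh, and playback is simply indexing into that sequence by elapsed time.

```python
# A minimal sketch of frame-sequenced volumetric playback, assuming the
# engine supplies elapsed time and each frame is a pre-loaded 3D mesh.
# Class and method names are illustrative, not from the Brekel toolchain.

class VolumetricClip:
    """Plays a sequence of per-frame 3D meshes at a fixed frame rate."""

    def __init__(self, frames, fps=30.0, loop=True):
        self.frames = frames      # one 3D mesh per captured Kinect frame
        self.fps = fps
        self.loop = loop

    def frame_at(self, elapsed_seconds):
        """Return the mesh that should be visible at a given playback time."""
        index = int(elapsed_seconds * self.fps)
        if self.loop:
            index %= len(self.frames)
        else:
            index = min(index, len(self.frames) - 1)
        return self.frames[index]


# Example: a 90-frame (3-second) interview clip played at 30 fps.
clip = VolumetricClip(frames=list(range(90)), fps=30.0)
print(clip.frame_at(0.0))   # -> 0, the first frame
print(clip.frame_at(1.5))   # -> 45, halfway through
print(clip.frame_at(3.5))   # -> 15, looped past the end
```

Because every frame is a full 3D model rather than a flat image, the viewer’s position is free: the same sequencing logic works from any angle.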
Navigation is a difficult and important dimension of the project. From a practical standpoint, it is necessary to avoid creating feelings of nausea. In order to achieve this, we utilize an instant teleportation system for the Oculus Rift experience. This reduces the amount of time that the brain’s vestibular system is out of sync with visual movement in the experience.
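The distinction between conventional smooth locomotion and instant teleportation can be illustrated with a short sketch. This is a conceptual Python fragment, not code from our Oculus build, and positions are simplified to plain (x, y, z) tuples:

```python
# An illustrative contrast between smooth locomotion and instant teleport.
# Function names and the tuple representation are our own simplifications.

def smooth_step(position, target, speed, dt):
    """Conventional locomotion: move a little toward the target each frame.
    This sustained visual motion, with no matching signal from the inner
    ear, is what tends to produce nausea."""
    px, py, pz = position
    tx, ty, tz = target
    dx, dy, dz = tx - px, ty - py, tz - pz
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    step = speed * dt
    if dist <= step:
        return target          # close enough: snap to the target
    scale = step / dist
    return (px + dx * scale, py + dy * scale, pz + dz * scale)


def teleport(position, target):
    """Instant teleport: the viewpoint jumps in a single frame, so there is
    almost no interval in which vision reports motion the vestibular
    system cannot feel."""
    return target
```

Smooth locomotion keeps the mismatch alive for the whole journey; teleportation compresses it into one frame, which is why we chose it for the Rift experience.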
However, we are currently developing a version of the experience that will function on the Project Tango. This version will allow people to navigate the different interviews by walking around at a 1:1 scale. Ultimately, this is how we envision the project being experienced. People would be able to stumble across the virtual content in public space.
Our process began with a series of experiments exploring the potentials and limitations of emerging technologies. We were particularly interested in working with Google’s Project Tango, a device that enables two important shifts in virtual reality (VR) and augmented reality (AR) experiences. The first is that it is a 3D scanner, meaning that it can generate 3D models of objects and spaces in real time. The second is that it can track its own rotation and displacement, allowing it to register exactly where it is in space without any external sensors or computing.
Another technology that intrigued us was binaural audio. Simply put, binaural audio records sounds through two microphones that are the same distance apart as human ears. This results in recordings that sound so real that they can, at times, be indistinguishable from real sounds. It is also possible to replicate this effect by producing 3D models and digitally reproducing how sounds would fall across a pair of ears. Using this technique of spatialized sound, we explored how we could create immersive audio experiences.
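The core of the binaural effect can be shown with a small calculation: a sound to one side arrives at the far ear slightly later than the near ear, and that interaural time difference (ITD) is one of the main cues the brain uses to localize sound. The sketch below uses the classic Woodworth spherical-head approximation with typical textbook values for head radius and the speed of sound; it is an illustration of the principle, not code from our sound engine.

```python
import math

# A sketch of the interaural time difference (ITD) behind binaural audio,
# using the Woodworth spherical-head approximation. The head radius and
# speed of sound are typical textbook values, not measurements of our rig.

HEAD_RADIUS = 0.0875     # metres, average adult head
SPEED_OF_SOUND = 343.0   # metres per second in air

def itd_seconds(azimuth_degrees):
    """Arrival-time difference between the two ears for a distant source
    at the given azimuth (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_degrees)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def itd_samples(azimuth_degrees, sample_rate=44100):
    """The same delay expressed in audio samples: delaying one channel by
    this amount is the simplest way to spatialize a mono recording."""
    return round(itd_seconds(azimuth_degrees) * sample_rate)

print(itd_samples(0))    # -> 0 samples: a frontal source arrives together
print(itd_samples(90))   # -> 29 samples (~0.66 ms) for a source to the side
```

Binaural microphones capture this delay (and the head’s frequency shadowing) physically; spatialized sound engines reproduce the same cues digitally.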
The following experiments are some of the initial projects we completed to explore these technologies:
Using Google’s Project Tango, we scanned King’s College Chapel in Cambridge to produce an atmospheric model of the space. We then recorded ambient noises throughout the chapel and placed them within the digital model. Then, using Unity, we used the Tango to load a 1:1 model of the chapel into the Bartlett School of Architecture studio space. Participants could walk around the studio space and, using the tablet, peer into the chapel and hear its sounds.
Intrigued by the potential for participatory design, we developed an app which allowed participants to draw in 3D. After letting the users experiment with the tablets, we asked them to draw architectural improvements to a windowless space.
Using a pair of low-cost binaural microphones, we recorded a dull conversation within a lecture space. During the conversation, increasingly loud and abrupt sounds were produced in the space. We then invited people to listen to the recording. Because listeners were distracted by the conversation, the recorded sounds of moving chairs and tables seemed real to them, resulting in what some described as ‘the feeling of ghosts’.
By producing a pure sine wave in a space and recording it with binaural microphones, we are able to subtract the sound itself and are left with a digital model of the space’s reverb. Using this convolution reverb model, we can alter prerecorded sounds to make them sound as if they were being played in the space.
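The convolution step at the heart of this technique can be sketched directly: once a room’s impulse response is known, every sample of a dry recording triggers a scaled copy of the room’s echo pattern. The impulse response below is a toy three-reflection echo, not a measured room, and the code is a Python illustration rather than our production audio pipeline.

```python
# A minimal sketch of convolution reverb, assuming the room's impulse
# response has already been recovered. The impulse response here is a
# toy decaying echo, not a measurement of a real space.

def convolve(dry, impulse_response):
    """Convolve a dry signal with a room impulse response: each input
    sample emits a scaled, delayed copy of the room's echo pattern."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out


# Direct sound at full level, followed by two decaying reflections.
room_ir = [1.0, 0.0, 0.5, 0.0, 0.25]
dry_click = [1.0]  # a single impulse, like a hand clap
wet = convolve(dry_click, room_ir)
print(wet)  # the click now carries the room's full echo pattern
```

Real implementations use FFT-based convolution for speed, but the effect is the same: the dry sound inherits the acoustic signature of the measured space.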
Going 360 was our first experiment with 360 film. We recorded scenes throughout London, and in particular the Barbican, using both 360 film and binaural sound. The experience can be viewed in 3D with a cell phone using Google’s Cardboard headset.
Traces of Reality: Roundhouse
At the midpoint of our research process, we were invited to create a virtual reality installation for the We Are Now Festival at the Roundhouse, London.
Traces of Reality: Roundhouse is a surreal augmented reality installation merging past, present and future in a social virtual reality experience. Installed as part of a performance arts festival in the Roundhouse, London, viewers could explore an interactive virtual environment with another person. We designed a custom headset for Google’s experimental Project Tango that enabled participants to walk around the virtual space at a 1:1 scale.
While inside, viewers can see geometric abstractions of their surroundings generated in real-time. Fragmented memories from the building’s past are introduced, including 3D 1960s psychedelic art from its use as an arts venue, floating barrels of gin from its use as a warehouse, and a full-scale steam engine from its original purpose as a repair station. The environment is interactive and includes spatialized 3D sound.
From a design research standpoint, Traces of Reality had two primary objectives. The first was to explore strategies for merging real-time sensing of the participants’ surroundings with virtual content. The second was to enable participants to experience the space socially through two networked headsets. Details of the installation can be found in each of the articles linked in the following section about our research. The following video documents the installation:
A video showing the installation of Traces of Reality: Roundhouse.
A video showing the manipulation of spatialized and binaural sound in the Roundhouse installation in greater detail.
This project was featured by the Culture File podcast of RTE, Ireland’s National Broadcast Radio. Listen to the interview below:
In parallel with our explorations of the technology, our team conducted in-depth research on key concepts related to our proposal. There were three areas of focus: the social and philosophical justification for participatory design through emerging technologies; the importance of social virtual environments for generating ‘presence’, or the sensation of realness, in virtual reality; and the role of sound in connecting people to cities and to each other. To read the full articles, click on the title links above the summaries that follow:
A diagram showing types of burdens imposed by urban development schemes and the appropriate responses, based on writings by Wolff and De-Shalit (Wolff, Jonathan, and Avner De-Shalit. 2007. Disadvantage).
This paper examines how emerging technology in virtual reality and 3D scanning can be used to facilitate participatory design. Using the High Speed Rail 2 (HS2) project in the UK as a case study, the paper analyzes the need for more inclusive community engagement in urban development projects. Virtual reality is examined as a more inclusive medium for architectural representation, as research has shown that it is more readily comprehended by non-designers. Furthermore, the advent of low-cost virtual reality equipment like Google Cardboard and Project Tango has made it a viable technology for participatory design schemes. The paper uses a design research project based on the concept of a digital palimpsest as a tool to investigate how stakeholders in the HS2 project can communicate needs, propose solutions, and facilitate debate in a participatory design process. Our research suggests that virtual reality can become a tool for inclusive community engagement if the strategy is designed to be intuitive and accessible.
A diagram showing how networking software was used to enable multiple participants to view each other in the same virtual environment.
Presence is crucial for participants to feel a sense of being in a virtual environment. Enhancing presence as a psychological phenomenon can offer participants a distinctive subjective experience, rather than the purely observational experience of television or film. Presence is composed of three dimensions: (i) personal presence relates to the sense of “being there”; (ii) social presence relates to the feeling of “being there with others”; and (iii) environmental presence relates to the responsiveness of the virtual space itself. This project develops ‘presence’ by using multiplayer avatars and free walking in a 3D scanned environment, allowing participants to share virtual environments and experience virtual reality together. This report argues that multiplayer virtual experiences can provide participants with stronger feelings of being present in virtual space than single-player experiences, and that the existence of other players can emphasize one’s presence in virtual environments.
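The networking behind this shared presence reduces to a simple loop: each headset broadcasts its pose every frame, and renders an avatar at every pose it receives. The sketch below shows a minimal pose message in Python; the four-float wire format and function names are illustrative assumptions, not the format used in our Unity build.

```python
import struct

# A minimal sketch of pose synchronization for a shared virtual space:
# each headset packs its pose into a small message every frame, and the
# other headset unpacks it to place an avatar. The 4-float wire format
# (x, y, z position plus yaw) is an illustrative assumption.

POSE_FORMAT = "<4f"  # little-endian: x, y, z, yaw

def pack_pose(x, y, z, yaw):
    """Serialize a pose into 16 bytes, ready to send over the network."""
    return struct.pack(POSE_FORMAT, x, y, z, yaw)

def unpack_pose(payload):
    """Recover a pose tuple from a received 16-byte message."""
    return struct.unpack(POSE_FORMAT, payload)


# One headset broadcasting; the other places an avatar at the result.
message = pack_pose(1.5, 1.75, -0.25, 90.0)
x, y, z, yaw = unpack_pose(message)
```

Because each participant continuously sees the other’s avatar move through the same scanned space, the virtual environment reads as a shared place rather than a private rendering.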
A video showing how 3D reconstructions of the virtual space are used to create artificial reverb that matches the space being experienced by the participant.
Virtual and augmented reality have become increasingly accessible over the last couple of years with the rise of mobile virtual reality technology such as Google Cardboard and the Gear VR.
The rise of these new virtual reality platforms has also created new interest in spatial binaural sound. As David Beer points out, listening to virtual sound through headphones can often become an isolating experience (Beer, 2007), but can these experiences go from being isolating to being social through the use of virtual reality? As researchers Barry Blesser and Linda Salter point out, it is difficult to have long-term memories of soundscapes (Blesser and Salter, 2009). Can virtual reality become the collective memory of our cities, and can it, through this memory, help us decide what our future cities will sound like?
The Palimpsest shows that emerging digital technologies are well suited for participatory design. While the platform we propose could become an integrated part of urban development practices, what matters more is that designers continually examine how new tools can lead to more equitable and meaningful community engagement. We are open to ideas, feedback, and collaboration that move toward making planning processes more accessible. Technology can be an excellent tool for achieving this, but it has no inherent value that justifies its use; tools must be carefully and deliberately used in order to achieve these goals.