Dysphasia is a critical design project that reminds us of what is missing from computer-mediated remote interaction. The word “dysphasia” originally refers to a disorder of language caused by a brain lesion; here, we borrow it to describe the unfamiliar discomfort that arises when mediated communication becomes our sole means of staying connected during the lockdown. We lose track of the atmosphere, the movement and the shifts of focus that we took for granted in person-to-person interaction, and so the information can be hard to decipher.
Our project explores non-verbal bodily communication in a remote, mediated mixed-reality interaction. On one end is a web app that “teleports” you to another reality on the planet; on the other is a physical installation that makes your virtual visits perceivable to local people in a transient, ghostly manner. In this way, we build a bridge between strangers in private, isolated spaces and the public urban realm, and at the same time amplify the feeling of disconnectedness of these days.
Responsive and Interactive
At the early stage of this project, we researched live-streaming videos of public spaces. Although IP cameras have always been linked to surveillance, accessible live channels on YouTube, EarthCam and the like certainly comfort those who miss familiar places.
For example, the Shibuya Crossing live stream attracts hundreds of viewers, and its realtime chat remains heated. These scenes are inherently about reality rather than a virtual world; they offer us what is missing during the pandemic stay-in. However, people watching these realtime images are only observers. What if they could participate in the scenes?
Hugely inspired by Kyle McDonald’s project Exhausting a Crowd, in which web users can leave comments on realtime live videos of famous landmarks, we adopted the PoseNet pose-estimation model to capture your body joints using only the laptop’s built-in webcam. You can then see yourself as an abstract, ghostly shape “teleported” to a remote place in the real world. Feel free to engage in another reality here: https://dysphasia.appspot.com/shibuya
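As a rough illustration of how such a pipeline can work: PoseNet returns an array of keypoints of the form `{ part, score, position: { x, y } }`, which can be filtered by confidence and normalized before being drawn as an abstract silhouette. The helper name `toGhostPoints`, the confidence threshold and the normalization step below are our own illustrative assumptions, not the project’s actual code.

```javascript
// Sketch: turning PoseNet keypoints into points for a ghostly overlay.
// PoseNet keypoints look like { part, score, position: { x, y } }.
// The 0.5 threshold and [0, 1] normalization are illustrative choices.
function toGhostPoints(keypoints, width, height, minScore = 0.5) {
  return keypoints
    .filter((kp) => kp.score >= minScore) // drop low-confidence joints
    .map((kp) => ({
      part: kp.part,
      x: kp.position.x / width,  // normalize so the ghost shape
      y: kp.position.y / height, // scales to any target stream
    }));
}

// In the browser (not runnable here), the loop would look roughly like:
//   const net = await posenet.load();
//   const pose = await net.estimateSinglePose(videoElement);
//   drawGhost(toGhostPoints(pose.keypoints, video.width, video.height));
```

The normalization matters because the webcam resolution on the visitor’s laptop rarely matches the live stream the ghost is composited onto.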
After establishing this first step of the project, we started to question: is it possible that we are not only reacting to certain scenes, but also exerting real influence on the remote reality we have been responding to? Hoping to create a live feedback loop and to make web users not only virtual observers but visitors, we brainstormed a series of possibilities.
Two promising directions we identified were an Augmented Reality environment and an interactive on-site installation. The former envisions a mobile app in which the ghostly avatars representing web users at the other end of the connection appear overlaid on the street through an AR interface on the phone.
However, compared to the installation proposal, the AR one relies too heavily on technical realisation and fails to convey the poetic metaphor of contemporary disconnectedness that we aim for. We therefore decided to dive further into the installation idea.
So far, we have been testing the first iteration of the machine: a small misting unit. In the coming months, we will endeavour to amplify the impact of the feedback loop in our project.
Featured Lab Project