- Ero Papavassiliou
- On April 20, 2015
The Polymelia Project considers the human body as an assemblage; a collection of heterogeneous components, a material-informational entity whose boundaries undergo continuous construction and reconstruction. We think of the body as the original prosthesis we all learn to manipulate, so that any replacement or extension becomes part of a continuing process of upgrading the human entity. The Polymelia Suit (PolyEyes, PolyLimbs, Exoskeleton, Sensing Suit) suggests a new communication language for the future of prosthesis and of humanity.
“You are alone in the room, except for two Raspberry Pi Camera Modules spinning in the dim light. Using PolyEyes (aka the Hammerhead Vision System) and the Raspberry Pi Compute Module, you communicate with other entities in another room, whom you cannot see. Relying solely on the Exo-skeletical Suit Controller, you must decide whether to share or receive stimuli. One of the entities wants to share its own visual field. The other wants to send you signals from its sensing body: he/she/it will reproduce, through the PolyLimbs, the body movements of the other entity. Your task is to explore alternative ways of communicating that distinguish your current performance from an embodied augmented reality.”
Polymelia [from the Greek poly, “many”, and melos, “limb”] is a birth defect involving more than the usual number of limbs. The three of us are currently working on designing an upgraded version of the future human body: an exoskeleton, or prosthetic structure, that seeks to enhance every part of the body we consider open to improvement.
We have carried out a series of experiments and built small prototypes based on them, taking into account spatial perception, human behavior, emotion analytics, and body structure in general.
Starting with spatial perception, we treat body and space as equals, since we consider their relationship reciprocal. If we try to define a space through our bodily constitution, we realize that this space keeps changing according to our behavior and our bodily, mental, and physical characteristics. At the same time, every space that seems unfamiliar at first ends up becoming intimate in a completely personalized way.
This is because spatial perception is individual, grounded in our brain, our body structure, and our sensory experiences, or combinations of these through cognitive processes. The first bodily feature we decided to examine as part of our future exoskeletical construction is vision, so we searched for ways to extend our visual perception. The structures of the brain and the eye are responsible for the visual cognitive process. By analyzing vision, we also gained the opportunity to consider extended realities, i.e. augmented realities, as part of our project.
The first step was to place our prosthesis within the wider range of general prostheses. The following diagram shows the spectrum of prostheses, from the inside of the body, with our DNA as the initial form of prosthesis, to the outside of the body, with architectural space as the ultimate exemplar.
The meaning of prosthesis derives from the Greek word “prósthesis” [addition/application/attachment], and it is defined as a supplement or attachment to the human body. It does more than simply extend the body, because it is introduced into bodies that in Freud’s terms are somehow “deficient” or “defective”, or in Le Corbusier’s terms “insufficient”. Our prosthesis currently belongs to the portable and wearable categories.
We carried out a series of experiments with biometric sensors, first to become familiar with the functions and capabilities these sensors offer, and then to gain a better understanding of how the inside of the body responds to stimuli.
We combined an ear-clip heart rate sensor with a GSR sensor, an EMG detector, and flex sensors, and recorded the results over different periods of time and multiple activities.
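To make the multi-sensor recordings comparable across activities, the readings can be aggregated into simple per-sensor statistics. The sketch below is illustrative only: the sensor names, units, and the `summarize_session` helper are our assumptions here, not the Lab's actual logging code.

```python
import statistics

def summarize_session(samples):
    """Aggregate a list of {sensor: value} readings into per-sensor stats."""
    by_sensor = {}
    for sample in samples:
        for sensor, value in sample.items():
            by_sensor.setdefault(sensor, []).append(value)
    return {
        sensor: {"mean": statistics.mean(vals), "min": min(vals), "max": max(vals)}
        for sensor, vals in by_sensor.items()
    }

# Example readings: heart rate (BPM), GSR (µS), EMG (mV), flex bend (degrees)
session = [
    {"heart_bpm": 72, "gsr_us": 4.1, "emg_mv": 0.8, "flex_deg": 10},
    {"heart_bpm": 78, "gsr_us": 4.6, "emg_mv": 1.1, "flex_deg": 35},
    {"heart_bpm": 90, "gsr_us": 5.2, "emg_mv": 1.9, "flex_deg": 60},
]
stats = summarize_session(session)
print(stats["heart_bpm"]["mean"])  # 80
```

Comparing these summaries between a resting period and an active one is enough to see which sensors respond most strongly to each activity.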
The Hugging Jacket is an attempt to create a network between two people who are in different places at a given moment and feel the need to interact through a hug. The first person presses the touch sensor embedded in their jacket, which sends a signal to the second person’s jacket and turns on an LED. The second person can then recreate the hug movement, and through the flex sensors built into their gloves, the first jacket is activated. Both jackets incorporate soft robotics, so when activated their air muscles fill with air and give the wearer the feeling of an actual hug.
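The exchange between the two jackets can be sketched as a tiny message protocol: a touch press produces a request, and a detected hug gesture on the other side produces the inflation command. The message names and the flex threshold below are hypothetical, chosen only to illustrate the round trip described above.

```python
FLEX_HUG_THRESHOLD = 40  # degrees of bend that count as a hug gesture (assumed)

def jacket_a_press(touch_pressed):
    """Person A presses the touch sensor -> signal lights jacket B's LED."""
    return {"to": "B", "type": "hug_request"} if touch_pressed else None

def jacket_b_respond(flex_left, flex_right):
    """Person B recreates the hug; glove flex sensors detect the gesture."""
    if flex_left >= FLEX_HUG_THRESHOLD and flex_right >= FLEX_HUG_THRESHOLD:
        return {"to": "A", "type": "inflate_air_muscles"}
    return None

msg = jacket_a_press(True)
reply = jacket_b_respond(55, 48)
print(msg["type"], reply["type"])  # hug_request inflate_air_muscles
```

Requiring both gloves to bend past the threshold is what distinguishes a deliberate hug gesture from an incidental hand movement.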
“Polymelia” is the idea of extending the human body as a whole, looking forward to a future body that will constitute an improved version of our current one, with enhanced capabilities, greater resistance, and strength. Our brain normally recognizes the body schema we are given at birth as the standard body structure, and each of its functions follows that schema. There is a lot of research on amputation and phantom limbs, and on the neurological and sociological condition of patients’ lives after such incidents. However, we do not know much about the case of polymelia. How might our brain recognize extra limbs or body parts in general, how easily would it adapt to the new prosthetic exoskeletical body, and what would the impact be on a psychological and sociological level?
The Oculus glasses are an important part of our project. They are a means of connecting our reality with the visual realities we aim to add to our concept, creating a network of augmented realities. An initial attempt is to communicate emotions between people through an AR layer, mainly a representation of colorful lines that express specific emotions.
Our main concept is built on the idea that nowadays people have lost physical contact. So we designed an eye device that challenges people to come closer to each other. As soon as two people make physical contact and touch each other, their ear-clip heart sensors start recording values. These values are translated into colors, based on the emotion each represents, and create lines that form connections between people. The Oculus glasses allow you to see these lines.
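One simple way to turn heart-rate values into the colored lines described above is to map the BPM range onto a hue ramp. The BPM bounds and the blue-to-red ramp below are our own illustrative assumptions, not the project's actual mapping.

```python
import colorsys

BPM_MIN, BPM_MAX = 50, 120  # assumed calm/excited bounds

def bpm_to_rgb(bpm):
    """Map a heart rate to a line color: calm (blue) -> excited (red)."""
    t = (min(max(bpm, BPM_MIN), BPM_MAX) - BPM_MIN) / (BPM_MAX - BPM_MIN)
    hue = (1.0 - t) * 2 / 3  # 2/3 (blue) down to 0 (red)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return tuple(round(c * 255) for c in (r, g, b))

print(bpm_to_rgb(50))   # (0, 0, 255)  resting -> blue
print(bpm_to_rgb(85))   # (0, 255, 0)  moderate -> green
print(bpm_to_rgb(120))  # (255, 0, 0)  excited -> red
```

Clamping the input keeps out-of-range readings at the ends of the ramp instead of wrapping around the hue circle.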
Starting with the head, we focus on extending the sense of vision as part of the general development of our visual-spatial perception. Our initial concept is to provide 360 degrees of vision around the perimeter of the head. We built a very simple device consisting of three layers and two structures, the esoskeletical and the exoskeletical one. The first layer is the Oculus glasses; essentially, these glasses operate as a case for our mobile device. The second and third layers are combined as part of the exoskeletical structure and comprise two cameras (representing our eyes) and a surface that covers the face and the glasses (representing our facial skin).
The whole device is connected to the esoskeletical structure through a stepper motor that allows the device to rotate; its direction of rotation is controlled by two flex sensors placed on the wearer’s two index fingers.
The operation of the device is also simple. By bending the two fingers, the wearer controls the direction of rotation and chooses what to see. The exoskeletical structure rotates, and the wearer sees exactly what the two cameras are capturing. The cameras send two separate video streams to the mobile device’s screen, and the Oculus glasses transform these videos into a unified 3D view for the viewer.
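The finger-to-rotation control can be sketched as a small decision rule evaluated every control cycle: each flex sensor selects one direction, and an ambiguous reading (both or neither finger bent) holds the current heading. The threshold and step size are hypothetical values, not taken from the actual device.

```python
BEND_THRESHOLD = 30  # degrees; above this a finger counts as "bent" (assumed)
STEP_DEGREES = 5     # rotation applied per control cycle (assumed)

def rotation_step(left_bend, right_bend):
    """Signed rotation for one cycle: negative = left, positive = right."""
    left = left_bend >= BEND_THRESHOLD
    right = right_bend >= BEND_THRESHOLD
    if left and not right:
        return -STEP_DEGREES   # rotate the camera head to the left
    if right and not left:
        return STEP_DEGREES    # rotate the camera head to the right
    return 0                   # both or neither bent: hold position

heading = 0
for left, right in [(45, 5), (45, 5), (5, 50)]:  # bend left twice, right once
    heading = (heading + rotation_step(left, right)) % 360
print(heading)  # 355
```

In the physical device the returned value would be translated into stepper pulses; here the modulo keeps the heading within a full 360-degree turn, matching the all-around field of view the device aims for.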
Our aim is to keep upgrading every part of the body that is in need of improvement. That might include factors related not only to the physical development of the body, but even more to the process of experiencing the world.
As part of the Polymelia Project, we moved on to developing the PolyLimbs, which focus on enhancing the capabilities of the limbs in particular. We examined the anatomy of the human body and how each part [muscles, joints, limbs, etc.] reacts and changes during motion. The general aim was to combine gestures with emotions and to investigate how changing one side [emotions or gestures] affects the other.
First, we 3D-scanned our bodies and created our ‘doubles’ to gain a better understanding of our body measurements and how they change under a specific movement. We then conducted a Kinect experiment on gait, capturing multiple people’s skeletons and analyzing differences in body structure during motion. For that purpose we attached obstacles of multiple shapes and sizes to the body, in order to disrupt each person’s typical gait. One idea for applying these results was to create a wearable limb prosthesis that protects our privacy against future media methods of tracking an individual through their gait. We achieved this by combining soft robotics with electrical nerve stimulation to produce an involuntary spasmodic movement while one is entering the tracking area.
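The gait-masking idea can be sketched as a trigger: while the wearer's position falls inside a known tracking area, stimulation pulses fire at randomized intervals so that no repeatable gait signature emerges. The zone bounds, pulse timings, and the `spasm_schedule` helper below are all hypothetical, added only to make the trigger logic concrete.

```python
import random

TRACKING_ZONE = ((0.0, 0.0), (10.0, 10.0))  # (x_min, y_min), (x_max, y_max), assumed

def in_tracking_zone(x, y):
    """True when the wearer's position lies inside the surveilled area."""
    (x0, y0), (x1, y1) = TRACKING_ZONE
    return x0 <= x <= x1 and y0 <= y <= y1

def spasm_schedule(steps, rng):
    """Randomized per-step pulse delay (ms) inside the zone, None outside."""
    return [rng.randint(80, 400) if in_tracking_zone(x, y) else None
            for x, y in steps]

# A walk that enters and leaves the zone: pulses fire only in the middle.
path = [(-2.0, 5.0), (1.0, 5.0), (5.0, 5.0), (12.0, 5.0)]
delays = spasm_schedule(path, random.Random(7))
print([d is not None for d in delays])  # [False, True, True, False]
```

Randomizing the inter-pulse delay is the key point: a fixed stimulation rhythm would itself become a trackable signature.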
So far we have designed wearable prostheses, taking into account spatial perception and the way it affects human behavior, emotion analytics, our senses, and body structure and movement, and we have arrived at three different approaches: sensing network, extended vision, and control of motion/gait. The Polymelia Suit aims to combine these three elements [emotion/sensing, vision, motion/gait] in a single wearable suit, one that can also be separated into its individual pieces, each of which can operate independently.
This suit supports an alternative way for the future of communication, and it can be adapted to multiple uses. It can deliver sensory stimuli to the body through soft robotics, its limbs can mimic specific movements, and the additional element is the audio-visual device, which offers, besides real-time extended vision, the possibility of sharing or receiving your hearing and vision. Beyond interactive enhancement, Polymelia can also be used for entertainment through AR, in gaming, and in medical applications: remotely increasing the safety of people with disabilities, or helping them improve their living standards through the mimicry of simple everyday activities.
Maurice Merleau-Ponty, Phenomenology of Perception, Routledge, London, 1992
Sigmund Freud, Beyond the Pleasure Principle, Penguin UK, 2003
Sigmund Freud, The Unconscious, 1915
Sigmund Freud, An Outline of Psycho-Analysis, 1938
Anthony Vidler, The Architectural Uncanny: Essays in the Modern Unhomely, The MIT Press, Cambridge, Massachusetts
Juhani Pallasmaa, The Eyes of the Skin: Architecture and the Senses, Wiley-Academy, 2005
Jacques-Alain Miller, Culture/Clinic 1: Applied Lacanian Psychoanalysis, University of Minnesota Press, 2013
Irvin D. Yalom, Existential Psychotherapy, 1980
Mark Wigley, “Prosthetic Theory: The Disciplining of Architecture”, 1991
Sungmee Park, Sundaresan Jayaraman, “Enhancing the Quality of Life Through Wearable Technology”, 2003
Ronald T. Azuma, “A Survey of Augmented Reality”, 1997
Andy Clark, David J. Chalmers, “The Extended Mind”, 1998
Henry Head, Studies in Neurology, 1920
Giovanni Berlucchi, Salvatore Aglioti, “The Body in the Brain: Neural Bases of Corporeal Awareness”
Edward T. Hall, The Hidden Dimension, Anchor Books, 1990
Russell A. Ligon, Kevin J. McGraw, “Chameleons Communicate with Complex Colour Changes during Contests: Different Body Regions Convey Different Information”, 2013
D. Stuart-Fox, A. Moussalli, “Camouflage, Communication and Thermoregulation: Lessons from Colour Changing Organisms”, 2008
Marius V. Peelen, Anthony P. Atkinson, Patrik Vuilleumier, “Supramodal Representations of Perceived Emotions in the Human Brain”
Jasmien Herssens, Ann Heylighen, “Haptics and Vision in Architecture: Designing for More Senses”
Dennis R. Proffitt, “Embodied Perception and the Economy of Action”, University of Virginia
R. R. Behrens, “The Role of Artists in Ship Camouflage during World War I”, Leonardo, Vol. 32, pp. 53–59, 1999
Roger Caillois, “Mimicry and Legendary Psychasthenia”, October, Vol. 31, pp. 17–32, 1984
M. de Montalembert, P. Mamassian, “The Vertical–Horizontal Illusion in Hemi-Spatial Neglect”, Neuropsychologia, Vol. 48(11), pp. 3245–3251, 2010
R. L. Gregory, Eye and Brain: The Psychology of Seeing, Princeton University Press, 1997
R. L. Gregory, Seeing Through Illusions, Oxford University Press, 2009
A. Higashiyama, “The Perception of Size and Distance under Monocular Observation”, Perception & Psychophysics, Vol. 26(3), pp. 230–234, 1979
M. F. Land, B. W. Tatler, Looking and Acting: Vision and Eye Movements in Natural Behaviour, Oxford University Press, 2009
Neil Leach, Camouflage, The MIT Press, 2006
E. Martin, F. Hine, A Dictionary of Biology (6th ed.), Oxford University Press, 2014
A. Noë, E. Thompson, Vision and Mind: Selected Readings in the Philosophy of Perception, The MIT Press, 2002
G. Pasteur, “A Classification Review of Mimicry Systems”, Annual Review of Ecology and Systematics, Vol. 13, pp. 169–199, 1982
Jean Claude de Mauroy, Jean François Salmochi, “Tensegrity and Spine: A New Biomechanical Model”
Angelo Maravita, Charles Spence, Jon Driver, “Multisensory Integration and the Body Schema: Close to Hand and Within Reach”, 2003