The installation ‘12m4s’ by lab-au is an architectural intervention that uses visitors’ movements, taking their position, orientation and speed as its main parameters, to generate a visual (3D particles) and sonic (granular synthesis) scape.
The installation is based on two tracking techniques, image recognition and ultrasound sensors, and computes a space of sound and movement from the tracked data in real time. The spatial, sonic and visual data fuse into one common construct: on the one hand, image recognition is used to create spatial sounds; on the other hand, the sonic sensors are used to create visual echograms of the space. These sonic and visual representations of the space display visitor movements and also generate a common vector field according to which all of these elements evolve.
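The text describes tracked visitor data driving a vector field that governs how the particles evolve. As a minimal sketch of that idea, the following Python fragment treats each visitor as a source that pushes particles through a simple distance-weighted field; the function names, the inverse-square falloff and the parameters are illustrative assumptions, not lab-au's actual implementation.

```python
import math

# Hypothetical sketch: a visitor's tracked position shapes a vector
# field that advects particles, loosely mirroring the description in
# the text. All names and formulas here are assumptions.

def visitor_field(x, y, vx, vy, vw=1.0):
    """Field contribution of a visitor at (vx, vy): particles are
    pushed radially away, with strength falling off with distance
    (vw is an assumed per-visitor weight)."""
    dx, dy = x - vx, y - vy
    d = math.hypot(dx, dy) + 1e-6   # avoid division by zero
    push = vw / (d * d)             # inverse-square falloff (assumed)
    return dx / d * push, dy / d * push

def step_particles(particles, visitors, dt=0.1):
    """Advect each particle one time step along the summed field of
    all tracked visitors."""
    out = []
    for (x, y) in particles:
        fx = sum(visitor_field(x, y, vx, vy, w)[0] for vx, vy, w in visitors)
        fy = sum(visitor_field(x, y, vx, vy, w)[1] for vx, vy, w in visitors)
        out.append((x + fx * dt, y + fy * dt))
    return out

# A single visitor at the origin pushes a nearby particle outward.
particles = step_particles([(1.0, 0.0)], [(0.0, 0.0, 1.0)])
```

In the installation the same field would presumably also modulate the granular synthesis, so that sound and image evolve from one shared construct.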
The result, this ‘particle synthesis’, is projected onto a mylar screen that fuses projection and reflection, building a common space between the digital space and the body space.