The installation '12m4s' by lab-au is an architectural intervention that uses visitors' movements, taking position, orientation and speed as its main parameters, to generate a visual (3D particles) and sonic (granular synthesis) scape.
The installation is based on two tracking techniques, image recognition and ultrasound sensors, and computes from the tracked data a space of sound and movement in real time. The spatial, sonic and visual data fuse into one common construct: on the one hand, image recognition is used to create spatial sounds; on the other, the sonic sensors are used to create visual echograms of the space. These sonic and visual representations of the space display visitor movements and also generate a common vector field according to which all of these elements evolve.
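The coupling described above, tracked visitor data driving both a particle-advecting vector field and granular-synthesis parameters, can be sketched as follows. This is a minimal illustrative sketch, not lab-au's actual implementation: the `Visitor` class, the inverse-square falloff, and the speed-to-grain mapping are all assumptions made for demonstration.

```python
import math

class Visitor:
    """Hypothetical container for one visitor's tracked data."""
    def __init__(self, x, y, heading, speed):
        self.x, self.y = x, y
        self.heading = heading  # orientation in radians
        self.speed = speed      # movement speed

def field_at(px, py, visitors):
    """Common vector field: sum each visitor's influence, a vector along
    their heading, scaled by speed and attenuated with distance
    (inverse-square falloff, an assumed choice)."""
    vx = vy = 0.0
    for v in visitors:
        d2 = (px - v.x) ** 2 + (py - v.y) ** 2 + 1.0  # +1 avoids blow-up at zero distance
        vx += v.speed * math.cos(v.heading) / d2
        vy += v.speed * math.sin(v.heading) / d2
    return vx, vy

def step_particles(particles, visitors, dt=0.05):
    """Advect particle positions [x, y] through the visitor-driven field."""
    for p in particles:
        fx, fy = field_at(p[0], p[1], visitors)
        p[0] += fx * dt
        p[1] += fy * dt

def grain_params(visitor):
    """Map the same tracking data onto granular-synthesis controls:
    faster movement yields shorter, denser grains (assumed mapping)."""
    s = min(visitor.speed, 1.0)
    grain_ms = max(10.0, 120.0 - 100.0 * s)   # grain duration in ms
    density_hz = 5.0 + 45.0 * s               # grains triggered per second
    return {"grain_ms": grain_ms, "density_hz": density_hz}
```

Both the visual and the sonic layer read from the same tracked state, which is what lets the two representations stay coherent as one construct.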
The result, this 'particle synthesis', is projected onto a mylar screen that fuses projection and reflection, building a common space between the digital space and the body space.