IS – Personalised Immersive Audio Visual Experience – Work in Progress
  • On July 2, 2018
  • https://luyangzou.com/

IS is an installation that tries to answer a simple question: how do we inhabit sound?

Most of the time, when we perceive sound or music, the architectural environment defines how we perform in the space and affects our perception of the music.

Our perception of music draws on our auditory senses, our personal experiences, and the architectural environment we are in. Hearing the same piece of music in different environments changes how we perform.


For instance, if the same piece of music is played in a dance studio and in a lecture hall, the audience in the dance studio will dance to it, but they would not do the same thing in the lecture room. This is because we perceive the space before the music, and we know where we are. Moreover, as members of society, people are trained to be ‘human’ according to rules that frame what they can and cannot do depending on where they are.

So we want to create a spatial experience in which you no longer feel the physical space: everything you see and hear is about sound. Within this experience, the sound is composed entirely by the participant, and there is nobody else inside, just the participant and his or her music.

The user’s position data is translated into sound: movement triggers notes, and walking through the space changes their frequency and duration. The sound is mapped onto the physical space using ambisonics, so it follows the user’s position as they walk along. The sound generated by the user is then analysed for visualisation.
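As a rough illustration of this mapping, here is a minimal Python sketch. The room dimensions, pitch range, note lengths, and the first-order ambisonic encoding below are assumptions made for the example; the post does not specify the actual values, tracking system, or audio tools used in the installation.

```python
import math

# All of the constants below are assumptions for illustration; the post does
# not give the installation's actual room size, pitch range, or note lengths.
ROOM_W, ROOM_D = 6.0, 6.0          # assumed room size in metres
FREQ_LO, FREQ_HI = 110.0, 880.0    # assumed pitch range (A2 to A5)
DUR_LO, DUR_HI = 0.1, 1.5          # assumed note durations in seconds

def position_to_note(x, y):
    """Map a tracked position (in metres) to a note frequency and duration."""
    fx = min(max(x / ROOM_W, 0.0), 1.0)
    fy = min(max(y / ROOM_D, 0.0), 1.0)
    # Exponential interpolation keeps pitch steps perceptually even.
    freq = FREQ_LO * (FREQ_HI / FREQ_LO) ** fx
    dur = DUR_LO + (DUR_HI - DUR_LO) * fy
    return freq, dur

def position_to_bformat_gains(x, y, cx=ROOM_W / 2, cy=ROOM_D / 2):
    """First-order ambisonic (B-format) gains so the sound follows the walker."""
    azimuth = math.atan2(y - cy, x - cx)
    return {
        "W": 1.0 / math.sqrt(2.0),   # omnidirectional component
        "X": math.cos(azimuth),      # front-back
        "Y": math.sin(azimuth),      # left-right
        "Z": 0.0,                    # no elevation in this sketch
    }

print(position_to_note(1.5, 4.0))            # about 185 Hz, about 1.0 s
print(position_to_bformat_gains(1.5, 4.0))
```

In a typical ambisonic pipeline, gains like these are applied when encoding each note and are later decoded to the loudspeaker array, which is what lets the sound appear to travel with the walker.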

Visuals act as a cue that helps the user understand their control over the system. When the user stands still, the sound produced through movement is broken into grains, stretched, and played back, while the visual trace of that previous sound is frozen and particles appear. The user becomes the force driving these particles, and the location of the sound changes as they are moved along. To conclude: motion drives the sound, the sound is translated into visuals, and the visuals represent the sound; these visual cues stimulate further interaction, closing the loop of the system.
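To make the ‘standing still’ behaviour more concrete, below is a minimal granular time-stretch sketch in Python with NumPy. The grain size, overlap, stretch factor, and jitter are illustrative assumptions, not the values used in IS.

```python
import numpy as np

def granular_stretch(buffer, sr=44100, grain_ms=80, stretch=4.0, rng=None):
    """Break a recorded sound into grains and play it back stretched in time."""
    if rng is None:
        rng = np.random.default_rng()
    grain = int(sr * grain_ms / 1000)          # grain length in samples
    hop = grain // 2                           # 50% overlap between output grains
    window = np.hanning(grain)                 # fade each grain in and out
    out = np.zeros(int(len(buffer) * stretch))
    for start in range(0, len(out) - grain, hop):
        # Read grains from a position that advances 'stretch' times slower than
        # the output, so the sound is stretched without changing its pitch.
        src = int(start / stretch) + int(rng.integers(-grain // 4, grain // 4))
        src = max(0, min(src, len(buffer) - grain))
        out[start:start + grain] += buffer[src:src + grain] * window
    return out

# Example: one second of noise stands in for the sound the user produced while moving.
recorded = np.random.randn(44100)
stretched = granular_stretch(recorded)         # roughly four seconds of grains
```

In a setup like the one described, the grain positions could also seed the on-screen particles, which would keep the frozen visual trace and the granular sound in step.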


By using projection mapping together with ambisonic audio mapping, both the audio and the visuals can be heard and seen spatially. The participant can truly immerse themselves in the music and visuals they have created themselves. Playing with sound, showing real personality, without any intervention: this is IS.
