
Bartlett School of Architecture, UCL




Tangible Music Visualiser

  • On April 23, 2018

The video above by Vincent Houze – created for a musical composition by Max Cooper – is one of the most successful examples I have seen of visualising music digitally. The organism in the video breathes along with the beat and vocal changes as if it had come alive. Our enjoyment of the music instantly escalates to another level because we see new life born in front of our eyes.

This is an effect I have long imagined creating. I often wondered about the original iTunes music visualiser and why it never stuck around; I assumed people got bored quickly because the visual effects were predictable. Gordon Pask famously pointed out in his essay “A Comment, A Case History and A Plan” that:

[An aesthetically potent environment should] offer sufficient variety to provide the potentially controllable novelty required by a man (however, it must not swamp him with variety – if it did, the environment would merely be unintelligible).

Pask, accordingly, created a cybernetic machine called Musicolour that explored this relationship between a performer and a lighting effect, eventually forming a feedback loop that encouraged the performer to diversify their creation.

At this point, I am still debating how much interactivity to incorporate into my design – it could easily become the topic of a few more posts – but what I have managed to experiment with is creating music visualisation through a physical, tangible interface. Here is one version that uses delicate, thin strands of optical fibre:

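The core mapping behind a visualiser like this is simple: chop the audio into short frames and translate each frame's loudness into a light level. Below is a minimal sketch of that idea, not the actual installation code – it assumes the audio arrives as a mono NumPy array normalised to [-1, 1] and that each fibre bundle is lit by an LED on an 8-bit PWM channel (both hypothetical for illustration):

```python
import numpy as np

def amplitude_to_brightness(samples, frame_size=1024):
    """Map each frame's RMS amplitude to an 8-bit brightness value.

    `samples` is a mono signal in [-1.0, 1.0]; each returned value
    could set one PWM duty cycle for the LEDs feeding the fibres.
    """
    n_frames = len(samples) // frame_size
    levels = []
    for i in range(n_frames):
        frame = samples[i * frame_size:(i + 1) * frame_size]
        rms = np.sqrt(np.mean(frame ** 2))       # loudness proxy for this frame
        levels.append(min(255, int(rms * 255)))  # clamp into the PWM range
    return levels

# Synthetic example: a quiet frame followed by a loud frame
t = np.linspace(0, 40 * np.pi, 1024)
levels = amplitude_to_brightness(np.concatenate([0.1 * np.sin(t), 0.9 * np.sin(t)]))
```

In practice the interesting design work lies in the mapping itself – splitting the spectrum into bands so bass and vocals drive different fibre bundles, for instance – but the frame-by-frame loudness envelope is the starting point.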

Going forward, I am also interested in experimenting further with the vibration motors operating beneath the sandbox. I hope to create a new form of tangible music visualisation that responds to rhythm while displaying lifelike behaviours.
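For the motors to respond to rhythm, the system needs some notion of where the beats fall. One crude but common approach is energy-based onset detection: flag any frame whose energy jumps well above the average. The sketch below is an illustrative assumption rather than my actual approach, using a global mean for simplicity (a real-time version would compare against a running average instead):

```python
import numpy as np

def beat_frames(samples, frame_size=512, threshold=1.5):
    """Return indices of frames whose energy exceeds `threshold`
    times the mean frame energy - a crude, offline beat detector.
    """
    n = len(samples) // frame_size
    energies = np.array([
        np.sum(samples[i * frame_size:(i + 1) * frame_size] ** 2)
        for i in range(n)
    ])
    avg = energies.mean()
    return [i for i in range(n) if energies[i] > threshold * avg]

# Each flagged frame would trigger a short pulse on a vibration motor.
t = np.linspace(0, 20 * np.pi, 512)
quiet, loud = 0.01 * np.sin(t), np.sin(t)
hits = beat_frames(np.concatenate([quiet] * 8 + [loud] + [quiet] * 7))
```

Each detected beat would then fire a brief motor pulse under the sand, while a slower, continuous signal (such as the loudness envelope) could drive the "breathing" quality I admired in Houze's work.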
