Tangible Music Interfaces pt 2: Tables

Tangible User Interfaces (TUIs) combine control and representation within a physical artefact [1]. These interactive interfaces are typically built from tables, fiducial markers, tokens, computer vision, custom hardware and other components.

Audiopad is an early musical table interface, reportedly the first of its kind. It uses proximity between markers to control various actions, so similar markers can have multiple use cases. This is best understood by watching the video.

Their blurb summarises it like so: 

"Audiopad is a composition and performance instrument for electronic music which tracks the positions of objects on a tabletop surface and converts their motion into music. One can pull sounds from a giant set of samples, juxtapose archived recordings against warm synthetic melodies, cut between drum loops to create new beats, and apply digital processing all at the same time on the same table. Audiopad not only allows for spontaneous reinterpretation of musical compositions, but also creates a visual and tactile dialogue between itself, the performer, and the audience."

Since then, the creators have gone on to establish a creative technology studio, doing a lot of very interesting work.

Though somewhat stating the obvious, tangible interaction in music is a refreshing addition to the spectrum of sonic control possibilities. Just the act of not sitting in front of a traditional screen can allow you to sway and react to the music more freely. This increases your engagement while also focusing you on the task of fresh manipulations. A key feature of this abstracted, objectified interface is the process of engaged learning. As objects have intrinsic physical affordances, the control space can be explored through natural trial and error. While engaged activity is important for learning, so too are periods of disengaged reflection. A table allows this, as you can simply step back and observe.

One of the most widely known tangible music tables is the Reactable. The Reactable is built upon a tabletop interface, which is controlled by manipulating tangible acrylic pucks on its surface. By putting these pucks on the Reactable’s translucent and luminous round surface, rotating them and connecting them to each other, performers can combine different elements like synthesizers, effects, sample loops or control elements in order to create a unique and flexible composition [2].

An interesting revelation came from the creators of the Reactable: music performance and control can constitute an ideal source of inspiration and test bed for exploring novel ways of interaction, especially in highly complex, multidimensional and continuous interaction spaces such as those present when browsing huge multimedia databases. This type of interaction involves searching in complex, hyperpopulated and multi-dimensional spaces, often looking for unknown and probably not single targets. In these types of “fuzzy interaction” environments, exploration can follow infinite paths, results can hardly be totally right or wrong, and the evaluation metrics can become so unclear that joyful and creative use may become one of the essential assets [3].
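To make the connection mechanic concrete, here is a minimal sketch of the idea of pucks linking up by proximity to form a signal chain. The puck names, types and connection radius are illustrative assumptions, not the actual Reactable implementation.

```python
import math

CONNECT_RADIUS = 0.3  # metres; assumed connection threshold

class Puck:
    """A tangible object on the table, with a role and a 2D position."""
    def __init__(self, name, kind, x, y):
        self.name, self.kind = name, kind
        self.x, self.y = x, y

def distance(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

def connections(pucks):
    """Return pairs of pucks close enough to form a signal link."""
    links = []
    for i, a in enumerate(pucks):
        for b in pucks[i + 1:]:
            if distance(a, b) <= CONNECT_RADIUS:
                links.append((a.name, b.name))
    return links

pucks = [
    Puck("osc", "synthesizer", 0.0, 0.0),
    Puck("filter", "effect", 0.2, 0.1),
    Puck("loop", "sample loop", 0.9, 0.9),  # too far away to connect
]
print(connections(pucks))  # [('osc', 'filter')]
```

Moving a puck across the table would simply re-run the proximity check, so patches form and break continuously as the performer plays.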

Key design criteria are essential to stop these interfaces turning into pretty messes. Sheridan et al. offer the following advice on the design of performative tangible interaction [4]: systems should be:

1 Intuitive – allow people to quickly grasp an understanding of the basic elements of the interaction, rather than being aimed at expert performers.
2 Unobtrusive – allow the public to carry on their normal activities if they choose to.
3 Enticing – encourage spontaneous interaction by passers-by without any, or very little, instruction.
4 Portable – are lightweight and low power, and easily transported, set up and taken down.
5 Robust – can withstand, and recover from, a range of environmental conditions such as adverse weather and changeable lighting, and different forms of interaction.
6 Flexible – can be dynamically tailored to the environment in which they are deployed.

Key design criteria include visibility, controllability, robustness and responsiveness. It is theorised that if these guidelines are followed and systems are engineered correctly, such interfaces allow novice users to quickly learn the performance frame and enjoy creative experiences with the device. Which is nice. Posing design questions in terms of the performance frame (technical skills, wittingness, interpretive abilities) and witting transitions can drive design decisions and maintain a balance between the technology and performance.

Though the current project works in pure VR, such design studies and artefacts inform how enjoyable, ‘fun’ interaction could be shaped. Within a VR environment almost any possibility is conceivable, but when interacting with music, many of these possibilities must be culled. The ability to create new custom environments based around tangible interaction, but free from certain physical restraints (just annoying ones like gravity), allows for the creation of seemingly tangible interactions: tangible in the sense of interacting with objects, rather than a literal physical interaction. A major problem still exists, though: the physical link present in tangible interaction creates a sensory flow of information that can guide decisions for the user. Without a haptic feedback channel, will VR ‘tangible’ music interfaces just fall down? I hope not.


[1] B. Ullmer and H. Ishii, “Emerging frameworks for tangible user interfaces,” IBM Syst. J., vol. 39, no. 3.4, pp. 915–931, 2000.

[2] S. Jordà, G. Geiger, M. Alonso, and M. Kaltenbrunner, “The reacTable,” Proc. 1st Int. Conf. Tangible Embed. Interact. – TEI ’07, p. 139, 2007.

[3] S. Jordà and G. Geiger, “The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces,” … Embed. Interact., 2007.

[4] J. Sheridan and N. Bryan-Kinns, “Designing for Performative Tangible Interaction,” vol. 1, pp. 288–308, 2008.

Sound Ideas

Virtual reality and interactive media applications are a great testing ground for novel audio applications. Bizarre control dynamics and immersive feedback stimuli make for an engaging method of challenge and reward. But the question of which areas of sound are to be manipulated remains open: conventional synthesis algorithms are built on the morphological relationships of their early construction… they were made for keyboards and circuits. With natural user interfaces it is possible to explore more physical relationships to sound control. Along this line I found some nice visuals and sounds of physical acoustic relationships. Whether these can be implemented in a controllable and rewarding fashion is another question…
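As an example of a synthesis algorithm built on a physical relationship rather than a keyboard-and-circuit lineage, here is a minimal Karplus-Strong plucked-string sketch: a burst of noise circulates through a delay line whose length sets the pitch, while an averaging filter drains energy like a real string. Parameter values here are assumptions for illustration.

```python
import random

def karplus_strong(freq, sample_rate=44100, duration=1.0, decay=0.996):
    """Plucked-string sketch: noise burst fed through a delay line
    with a two-point averaging lowpass filter."""
    n = int(sample_rate / freq)                       # delay length sets pitch
    buf = [random.uniform(-1, 1) for _ in range(n)]   # initial "pluck" (noise)
    out = []
    for _ in range(int(sample_rate * duration)):
        out.append(buf[0])
        # average adjacent samples: models energy loss in the string
        new = decay * 0.5 * (buf[0] + buf[1])
        buf = buf[1:] + [new]
    return out

samples = karplus_strong(440.0, duration=0.1)  # a short A4 pluck
```

The appeal for tangible or VR control is that the parameters (delay length, decay) map onto physical intuitions like string length and damping, rather than abstract oscillator settings.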

Tangible Music Interface pt 1: Cubes!

Nice cubes, eh? A Menger sponge.

I really like cubes, no really. This first post is just a statement of intent; future posts in this thread will cover a variety of tangible musical interfaces, such as the Reactable. Though the current project works in VR, the user research and design considerations in tangible interfaces share a lot of crossover. In this series of posts I hope to determine how these areas can mutually inform each other. But for now I want to look at some cubes.

Cube Illusions

First up is a Rubik’s cube puzzle sequencer that uses computer-vision-based markers as a means of controlling the underlying musical structures. I really like this playful and mysterious approach to discovering musical elements. Check out his other work and talks; really interesting chap.
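One way such a system could work, sketched very loosely: a camera reads the face colours, and each face becomes a short note pattern. The colour-to-note mapping and face layout below are my own assumptions for illustration, not taken from the actual project.

```python
# Hypothetical mapping from a scanned Rubik's cube face to a 9-step
# note sequence; MIDI note numbers are assumed, None means a rest.
COLOUR_TO_NOTE = {"white": None, "red": 60, "green": 62,
                  "blue": 64, "orange": 65, "yellow": 67}

def face_to_steps(face):
    """Read a 3x3 face row by row into a 9-step sequence."""
    return [COLOUR_TO_NOTE[cell] for row in face for cell in row]

face = [
    ["red", "white", "green"],
    ["white", "blue", "white"],
    ["yellow", "white", "red"],
]
print(face_to_steps(face))
# [60, None, 62, None, 64, None, 67, None, 60]
```

Twisting the cube rearranges the stickers, so every physical move rewrites the pattern: the puzzle state is the score.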

This is pretty fly

Some WebGL graphics stuff

AUTOMAT: I particularly like the exposing of the mesh underneath in this piece; use your gyroscope or mouse to interact with it. It’s giving me ideas for objects that spontaneously appear in a VR experience, representing a metaverse of acoustic objects as they become used in a context: making a phone call, streaming a tune, playing with something.

Waves: a wave simulator that made me think again about using basic acoustic formulae to control 3D graphics for reactive audio. I fished out my Kuttruff Room Acoustics book and subsequently melted my brain.
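One of the simpler formulae from room acoustics (as covered in Kuttruff) that could drive reactive graphics is the eigenfrequency of a rigid-walled rectangular room; the example room dimensions below are my own assumption.

```python
import math

def room_mode(nx, ny, nz, Lx, Ly, Lz, c=343.0):
    """Eigenfrequency of a rigid-walled rectangular room:
    f = (c/2) * sqrt((nx/Lx)^2 + (ny/Ly)^2 + (nz/Lz)^2)
    with c the speed of sound in m/s and L the room dimensions in m."""
    return (c / 2) * math.sqrt((nx / Lx) ** 2 + (ny / Ly) ** 2 + (nz / Lz) ** 2)

# lowest axial mode of an assumed 5 m x 4 m x 3 m room
print(round(room_mode(1, 0, 0, 5.0, 4.0, 3.0), 1))  # 34.3 Hz
```

A visualiser could, for instance, scale or pulse geometry at the mode frequencies of a virtual room, tying the graphics to something acoustically meaningful rather than arbitrary animation curves.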

JavaScript: if you fancy playing with this sort of stuff, go to PlayGnd (based on three.js), or just look through the archives with the slinky code overlay. If that’s a bit softcore for you, head over to WebGL Fundamentals and get all detailed. A fully specced WebGL and JS library is Babylon.js, which supports super easy Oculus Rift integration; check out their playground for examples and a browser-based IDE.