All currently available multitouch technology relies on hand-eye coordination for interacting with objects on the screen. For this reason, real-world applications are largely limited to gadget-style interactions, e.g. on the iPhone, where the user generally wants to see what is happening graphically on the screen.
In other applications, the setup looks quite different. In a music performance, staring at the screen while manipulating parameters is not how a performer wants to appear in front of an audience. Here, tactile control has to work without the eyes constantly fixed on the on-screen objects, because the performer's gaze is needed for the crowd.
For this reason, the good old-fashioned machine-control paradigm still dominates the interface market for music applications: knobs, buttons, and faders remain the mainstay on today’s stages for electronic music. But the versatility of multitouch screens is no less desirable in music equipment design, and attempts to combine the benefits of both approaches are scarce. Just recently, though, a new and promising approach has surfaced, as published in MIT’s Technology Review.
This technological concept uses a latex screen with multitouch capability and a set of pneumatic pumps that create small air pockets underneath the screen. These dynamic buttons can be either positive or negative (raised above or recessed below the screen plane), and the pressure applied when pushing them can be sensed as well. This looks very interesting for dynamically changing music composition and improvisation systems, although the proof-of-concept implementations focus mostly on telecommunication.
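To make the idea concrete for a music context, here is a minimal sketch of how such a dynamic button might be modeled in software: a polarity state (raised or recessed, driven by the pumps) plus a continuous pressure reading mapped onto a control parameter. All names here (`DynamicButton`, `map_pressure`) are hypothetical illustrations, not part of any real API from the published work.

```python
from dataclasses import dataclass

RAISED = +1    # air pocket inflated above the screen plane ("positive" button)
RECESSED = -1  # air pocket evacuated below the plane ("negative" button)

@dataclass
class DynamicButton:
    polarity: int = RAISED    # current pneumatic state
    pressure: float = 0.0     # normalized touch pressure, 0.0 .. 1.0

    def toggle(self) -> None:
        """Flip the pneumatic state, e.g. to switch between control layouts."""
        self.polarity = -self.polarity

    def map_pressure(self, lo: float, hi: float) -> float:
        """Map the sensed pressure linearly onto a parameter range,
        e.g. a filter cutoff or an effect send level."""
        return lo + self.pressure * (hi - lo)

btn = DynamicButton()
btn.pressure = 0.5
print(btn.map_pressure(0.0, 127.0))  # mid-range, MIDI-style value: 63.5
btn.toggle()
print(btn.polarity)  # -1 (button now recessed)
```

In a performance setting, the appeal is that `toggle` could be driven by the software to physically re-shape the control surface between songs or sections, while `map_pressure` gives the continuous, eyes-free expressivity of a fader or aftertouch.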