Tangible interfaces

Over the summer, I wrote about a framework for reality-based interaction, for which I wanted to give a few examples. Although the term reality-based interaction is not mentioned in them, I think the design ideas presented in two recent publications by the MIT Media Lab team around Hiroshi Ishii fit that category very well.

The first paper is Simplicity in Interaction Design by Chang et al., which reports on a design exercise conducted at the Media Lab to encourage students to design expressive yet simple means of representing the information of common household devices, using only a very limited set of interface components. The exercise grew out of the observation that many interfaces are overloaded with “buttons and blinking lights (B.A.B.L.)”, which represent information about complex state machines, such as those found in modern answering machines, in a manner that is too complex for users to understand. The authors argue that this is a result of designers choosing features over usability, which in turn requires ever more complex user interfaces. The primary constraint for redesigning such a device was therefore a maximum of one input and one output mechanism, e.g. one button and one LED. With this, the authors state, they hoped to force students to prioritize features and leave out those that do not represent core functionality. Before conducting the exercise in a classroom setting, the authors completed it once themselves.
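To make that constraint more concrete, here is a minimal sketch of what an answering machine collapsed onto a single button and a single LED could look like. This is purely my own illustration rather than a design from the paper, and the chosen states and LED patterns are assumptions:

```python
from enum import Enum, auto

# Minimal sketch (my own, not from the paper): an answering machine whose
# entire interface is one button and one LED.

class State(Enum):
    IDLE = auto()          # no unheard messages
    NEW_MESSAGES = auto()  # at least one unheard message
    PLAYING = auto()       # playing messages back

class OneButtonAnsweringMachine:
    def __init__(self):
        self.state = State.IDLE
        self.unheard = 0

    def on_incoming_message(self):
        # A new recording arrives; the LED will start blinking.
        self.unheard += 1
        if self.state is State.IDLE:
            self.state = State.NEW_MESSAGES

    def on_button_press(self):
        # The single button only ever means "deal with my messages".
        if self.state is State.NEW_MESSAGES:
            self.state = State.PLAYING
        elif self.state is State.PLAYING:
            # Pressing again stops playback and marks everything as heard.
            self.unheard = 0
            self.state = State.IDLE

    def led_pattern(self):
        # The single LED encodes the entire device state.
        return {State.IDLE: "off",
                State.NEW_MESSAGES: "blinking",
                State.PLAYING: "solid"}[self.state]
```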

Of the results described in the paper, I found the design of a simple answering machine particularly inspiring: it consists of a bowl-shaped base covered by a stretched membrane. With every incoming call recorded on the device, the volume under the membrane expands slightly, causing the membrane to bulge further outwards. To play back the recorded calls, a user simply pushes the membrane back down into the bowl. To rewind a recording by a few seconds, it is sufficient to briefly pull the membrane back out. In essence, Chang et al. argue, these restrictions helped them focus largely on using mechanical change to modify and visualize a device’s state, and overall they conclude that such forced simplicity can in fact encourage novel interaction techniques and foster innovation. Personally, I see significant market potential in such simple devices and would be happy to see them show up in stores. It is the instant comprehensibility of the interactions that, in my opinion, would win over a large audience of buyers.

Different states of a tangible answering machine. From left to right: no messages, a few messages, many messages, playback of messages.
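Translated into code, the appeal of this design is that the whole interaction reduces to one signed displacement reading of the membrane. The following sketch is my own rough interpretation, not the authors’ implementation; the sensor reading, the thresholds and the playback backend are all assumptions:

```python
# Hypothetical sketch of the membrane answering machine's control logic:
# recorded calls inflate the membrane, pushing it in plays back, and a
# brief pull rewinds. All constants and the player backend are assumptions.

INFLATION_PER_MESSAGE = 0.1   # how far the membrane bulges per recorded call
PUSH_THRESHOLD = -0.05        # pressed noticeably below its resting level
PULL_THRESHOLD = 0.05         # pulled noticeably above its resting level
REWIND_SECONDS = 5

class MembraneAnsweringMachine:
    def __init__(self, player):
        self.player = player  # assumed audio backend with play_all()/rewind()
        self.messages = 0

    def rest_level(self):
        # The membrane's resting bulge grows with every stored message.
        return self.messages * INFLATION_PER_MESSAGE

    def on_incoming_call_recorded(self):
        self.messages += 1    # the membrane inflates a little more

    def on_membrane_reading(self, level):
        # level: membrane height reported by some displacement sensor
        delta = level - self.rest_level()
        if delta < PUSH_THRESHOLD:
            self.player.play_all()              # pushing down plays back
        elif delta > PULL_THRESHOLD:
            self.player.rewind(REWIND_SECONDS)  # a brief pull rewinds
```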

The second paper, Pragmatic haptics, by the same authors follows a similar lead as the first but focuses more specifically on how haptic device input and output can simplify interactions. The authors’ motivation was to close the interaction loop for the haptic modality (i.e. to allow for both haptic input and output). They argue that this would allow suitable aspects of interactions, which are currently realized through the often overloaded auditory and visual senses, to be moved to the haptic channel. The ultimate goal is to reduce a user’s cognitive load during the execution of a task and to simplify the required interactions.

The example I found most convincing in the paper is that of a handheld power tool, in this case a jigsaw used for cutting wooden boards. Here, the task of sawing along a curved line could be simplified through haptic interaction: the tool would be modified to rock towards the line to follow whenever the user starts to deviate from the drawn course. This, the authors point out, would allow the eyes to focus much more on the overall shape to follow rather than having to stay fixed almost exclusively on the immediate changes in the path. Finally, the recommendation for designers they give that I think corresponds best to this example is to aim at freeing the eyes from intensive visual tasks, even if that means introducing an unwieldy haptic task. The implicit message I read out of this is that, overall, such a change will still represent an improvement for the user.
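To make the idea of such a closed haptic loop a bit more tangible, here is a rough sketch of what the guidance logic could look like: the signed deviation of the blade from the drawn line is fed into a simple proportional controller that produces a corrective rocking force. This is entirely my own illustration and not taken from the paper; the sensor inputs, the gain and the force limit are assumptions:

```python
# Rough sketch (not from the paper) of haptic guidance for a jigsaw:
# the signed lateral deviation from the drawn line is mapped onto a
# corrective rocking force, nudging the hand back onto the course.

GAIN = 2.0       # assumed: how strongly deviation translates into force
MAX_FORCE = 1.0  # assumed actuator limit, normalised

def signed_deviation(blade_pos, nearest_on_line, line_direction):
    """Perpendicular distance of the blade from the line, signed by side.

    All arguments are (x, y) tuples; line_direction is a unit vector along
    the drawn curve at the nearest point (assumed to come from a tracker).
    """
    dx = blade_pos[0] - nearest_on_line[0]
    dy = blade_pos[1] - nearest_on_line[1]
    # z-component of the 2D cross product gives the signed offset
    return line_direction[0] * dy - line_direction[1] * dx

def corrective_force(deviation):
    """Proportional controller: rock back towards the line, clamped."""
    force = -GAIN * deviation
    return max(-MAX_FORCE, min(MAX_FORCE, force))
```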

Other examples of reality-based interaction

In addition to the two examples of recent research above, I want to link to a few more practical approaches to providing reality-based interaction. To recap, tangible interfaces allow for direct interaction with a system through the manipulation of physical objects. The examples here are digital music sequencers that are controlled through the placement of physical objects on a grid. Simple image-capturing equipment such as a webcam provides the computer with the input needed to determine the objects’ locations, and based on this information the computer generates specific sounds at specific points in time. In a music sequencer, one grid dimension generally represents time, while the second can be used to layer multiple sounds or, depending on the implementation, to denote individual instruments. Complex arrangements of objects can thus result in equally complex and rich music.
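The core logic of such a sequencer is surprisingly small. Here is a rough sketch, assuming an external blob detector and one playable sound object per track (both hypothetical), that quantizes detected object positions onto a step/track grid and triggers the matching sounds as the playhead sweeps across:

```python
import time

# Sketch of a camera-based step sequencer (detector and sounds are assumed):
# objects detected on the tabletop are quantized onto a grid whose columns
# are time steps and whose rows are sounds; the playhead sweeps over the
# columns and triggers every sound that has an object placed in its row.

STEPS = 16            # columns: positions in time
TRACKS = 4            # rows: e.g. kick, snare, hi-hat, clap
STEP_SECONDS = 0.125  # 16th notes at 120 bpm

def quantize(detections, width, height):
    """Map (x, y) pixel positions from the camera onto (step, track) cells."""
    grid = set()
    for x, y in detections:
        step = min(int(x / width * STEPS), STEPS - 1)
        track = min(int(y / height * TRACKS), TRACKS - 1)
        grid.add((step, track))
    return grid

def run(detector, sounds):
    """detector: callable returning (detections, width, height) per frame;
    sounds: list of TRACKS objects, each with a play() method."""
    step = 0
    while True:
        detections, width, height = detector()
        grid = quantize(detections, width, height)
        for track in range(TRACKS):
            if (step, track) in grid:
                sounds[track].play()
        step = (step + 1) % STEPS
        time.sleep(STEP_SECONDS)
```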

One of the older presentations of camera-based sequencers I could find is Audio d-touch, specifically the Tangible Virtual Drum Machine Interface of this project, which was presented by Enrico Costanza et al. at the Intl. Conference on Digital Audio Effects (DAFX) in 2003.

The author’s website has several other video examples.

In 2007, students at UC Berkeley developed a drum sequencer called the Bubblegum Sequencer, which follows the same basic premise as above but utilizes the colors of the individual gumballs to trigger different types of drums.

For those who enjoy drinking beer, the Crown cap beat machine (2008) might be another alternative. Fun aside, I think this video is a good example that, today, even rudimentary hardware is capable of enabling such applications.

Finally, many additional examples can be found at the MIT Media Lab’s Tangible Media Group. If you have found further creative designs of tangible or reality-based interfaces, I’d be happy to hear about them in the comments below.

Update:

I just discovered that the paper on reality-based interaction has been put online by its authors.


