This page uses content from Wikipedia and is licensed under CC BY-SA.
Sonic interaction design
Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts. It lies at the intersection of interaction design and sound and music computing. Whereas interaction design concerns objects that people interact with through computational means, in sonic interaction design sound mediates the interaction, either as a display of ongoing processes or as an input medium.
Perceptual, cognitive, and emotional study of sonic interactions
Research in this area focuses on experimental scientific findings about human sound reception in interactive contexts.
During closed-loop interactions, the users manipulate an interface that produces sound, and the sonic feedback affects in turn the users’ manipulation. In other words, there is a tight coupling between auditory perception and action. Listening to sounds might not only activate a representation of how the sound was made: it might also prepare the listener to react to the sound. Cognitive representations of sounds might be associated with action-planning schemas, and sounds can also unconsciously cue a further reaction on the part of the listener.
Sonic interactions have the potential to influence the users’ emotions: the quality of the sounds affects the pleasantness of the interaction, and the difficulty of the manipulation influences whether the user feels in control or not.
Product sound design
Product design in the context of sonic interaction design deals with methods and experiences for designing interactive products with a salient sonic behaviour. Products, in this context, are either tangible, functional objects designed to be manipulated, or usable simulations of such objects, as in virtual prototyping. Research and development in this area relies on studies from other disciplines, such as:
sound culture, i.e. the study of how the production and consumption of sound have changed throughout history and within different societies.
In design research for sonic products, a set of practices has been inherited from a variety of fields. These practices have been tested in contexts where research and pedagogy naturally intermix. They include:
bodystorming, especially when combined with vocal sketching, where participants produce vocal imitations to mimic the sonic behavior of objects while they are being interacted with;
theatrical practices, such as theatrical metaphors and dramatic performance;
Interactive art and music
In the context of sonic interaction design, interactive art and music projects design and research aesthetic experiences in which sonic interaction is the focus. The creative and expressive aspects – the aesthetics – are more important than conveying information through sound. Practices include installations, performances, public art, and interactions between humans through digitally augmented objects and environments. These often integrate elements such as embedded technology, gesture-sensitive devices, speakers, or context-aware systems.
The experience itself is the focus: how humans are affected by the sound, and vice versa. Interactive art and music allow us to question existing paradigms and models of how we interact with technology and sound, going beyond paradigms of control (a human controlling a machine). Users become part of a loop that includes both action and perception.
Interactive art and music projects invite explorative actions and playful engagement. There is also a multi-sensory aspect: haptic-audio and audio-visual projects are especially popular. Among many other influences, this field is informed by the merging of the roles of instrument-maker, composer, and performer.
Sonification
Sonification is the data-dependent generation of sound, provided the transformation is systematic, objective, and reproducible, so that it can serve as a scientific method.
For sonic interaction design, sonification provides a set of methods to create interaction sounds that encode relevant data, so that the user can perceive or interpret the conveyed information. Sonification need not represent huge amounts of data in sound; it may convey only one or a few data values. For example, imagine a light switch that, on activation, produces a short sound depending on the electric power drawn through the cable: more energy-wasting lamps would systematically result in more annoying switch sounds. This example shows that sonification aims to provide information through a systematic transformation of data into sound.
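The light-switch example can be sketched as a parameter-mapping sonification. The sketch below is a hypothetical illustration, not an implementation from the literature: the mapping constants, power range, and the choice of "brightness" (pitch) and "roughness" (amplitude modulation) as carriers of annoyance are all assumptions made for the example.

```python
import math

# Hypothetical parameter-mapping sonification for the light-switch example:
# the electric power drawn through the switch is mapped to the pitch and
# roughness of a short click, so that more wasteful lamps produce a harsher,
# more annoying sound. All mapping constants are illustrative assumptions.

SAMPLE_RATE = 44100  # samples per second

def map_power_to_params(watts, w_min=5.0, w_max=200.0):
    """Systematically map power draw to sound parameters (the sonification)."""
    # Clamp and normalise to [0, 1] so the mapping is reproducible.
    x = min(max(watts, w_min), w_max)
    norm = (x - w_min) / (w_max - w_min)
    freq_hz = 400.0 + 1600.0 * norm   # higher power -> higher, sharper pitch
    mod_depth = 0.1 + 0.8 * norm      # higher power -> rougher sound
    return freq_hz, mod_depth

def switch_click(watts, duration_s=0.05):
    """Render the click as raw samples (amplitude-modulated decaying sine)."""
    freq_hz, mod_depth = map_power_to_params(watts)
    n = int(SAMPLE_RATE * duration_s)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        env = math.exp(-t * 60.0)  # fast exponential decay: a "click"
        rough = 1.0 - mod_depth * 0.5 * (1 - math.cos(2 * math.pi * 70 * t))
        samples.append(env * rough * math.sin(2 * math.pi * freq_hz * t))
    return samples

# A 10 W LED lamp yields a duller, smoother click than a 150 W incandescent.
led_freq, led_rough = map_power_to_params(10)
bulb_freq, bulb_rough = map_power_to_params(150)
```

Because the mapping is a fixed, deterministic function of the data, the same power draw always yields the same sound, which is what makes the transformation systematic and reproducible in the sense above.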
The integration of data-driven elements in interaction sound may serve different purposes:
to allow users to refine their actions via auditory feedback: for example, a sonification-enhanced drilling machine that indicates by sound when the desired orientation relative to the wall is reached.
to create a sonic gestalt for the interaction, allowing users to compare their detailed performance across repeated interactions: for instance, rowing strokes may be sonified so that athletes can better synchronize their actions.
to enable novel functions that would otherwise be unavailable (e.g. a bottle that displays by sound how much fluid is being poured, so that users can more easily pour equal amounts of liquid into different glasses).
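The first purpose above, refining an action via auditory feedback, can be sketched as a closed perception–action loop. The sketch below is a hypothetical take on the drilling example: the reference pitch, the mapping gain in cents per degree, and the tolerance are illustrative assumptions, not values from the cited work.

```python
# Hypothetical closed-loop feedback sketch for the drilling example:
# the angular error between the drill and the desired orientation is mapped
# to a pitch offset from a reference tone, so the error becomes audible and
# the user can steer toward the reference pitch. Constants are illustrative.

REFERENCE_HZ = 880.0    # pitch heard when the drill is exactly on target
CENTS_PER_DEGREE = 25.0 # assumed mapping gain: detuning per degree of error

def feedback_pitch(error_deg):
    """Map angular error (degrees) to a feedback pitch in Hz.

    Zero error sounds the reference tone; any deviation detunes it upward,
    symmetrically for positive and negative errors (100 cents = 1 semitone).
    """
    cents = CENTS_PER_DEGREE * abs(error_deg)
    return REFERENCE_HZ * 2 ** (cents / 1200.0)

def on_target(error_deg, tolerance_cents=5.0):
    """True when the feedback tone is within `tolerance_cents` of the reference."""
    return CENTS_PER_DEGREE * abs(error_deg) <= tolerance_cents
```

In a real device this mapping would run continuously while the user drills, closing the loop: the user hears the detuning, corrects the drill angle, and the feedback tone converges on the reference pitch.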
Within the field of sonification, sonic interaction design emphasizes the importance of human interaction for understanding and using auditory feedback. Conversely, sonification offers solutions, methods, and techniques to inspire and guide the design of products and interactive systems.