Musical sonification

Visual representations can pose several challenges for perception, and musical sonification can support the perception of intensity levels. Here, a participant in a research study is introduced to the musical elements that correspond to intensity levels in the visual representation on the screen.

Sound can be used to make a visual representation of research data easier to understand, or to improve the outcome of an interaction between a human and a machine. Using sound in this way is called sonification.

We do research in musical sonification. Sonification means that sound is added as a complement to a visual representation to make the representation easier to comprehend, to show new relationships in research data, or to improve the outcome of an interaction. The focus of our research is to explore the use of deliberately designed and composed musical sounds in sonification. Sonification can be applied to measurements of climate change, biosensor data in the medical sciences, process control and transport management, and much more. Close collaboration with domain experts is needed for a sonification to be useful, both in its application and in its evaluation. We welcome collaborations and suggestions for application areas for sonification.

Sonification and information visualization

Research produces data in many different areas, from medicine via decision support for air traffic control to climate change. The results are usually presented to other researchers, as well as to the public, through scientific texts and visual representations. However, these visual representations are often hard to comprehend and place a heavy load on the visual cognitive system. Using sound as a complementary modality should make the visual information easier to interpret. For example, sonification can reduce visual misinterpretations caused by simultaneous brightness contrast, make density in a visualization easier to grasp, or help distinguish between different datasets shown in the same visual representation.
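As a rough illustration of the idea, the minimal Python sketch below maps two hypothetical data series, shown in the same visual representation, to sine tones in different registers and writes them to a short audio file. The series values, pitch range, register separation, and tone length are assumptions made for this example, not mappings taken from our studies.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100
TONE_SECONDS = 0.25

def tone(frequency, amplitude, seconds=TONE_SECONDS, rate=SAMPLE_RATE):
    """Render one sine tone as a list of 16-bit sample values."""
    n = int(seconds * rate)
    return [int(amplitude * 32767 * math.sin(2 * math.pi * frequency * t / rate))
            for t in range(n)]

def value_to_frequency(value, low=220.0, high=880.0):
    """Map a normalised data value (0..1) to a frequency (Hz), exponentially
    so that equal data steps are heard as roughly equal pitch steps."""
    return low * (high / low) ** value

# Two hypothetical data series shown in the same visual representation.
series_a = [0.1, 0.3, 0.5, 0.7, 0.9]
series_b = [0.8, 0.6, 0.5, 0.4, 0.2]

samples = []
for a, b in zip(series_a, series_b):
    # Series A stays in a lower register, series B is shifted up one octave,
    # so the two datasets can be told apart by ear.
    mixed = [x + y for x, y in zip(tone(value_to_frequency(a), 0.4),
                                   tone(2 * value_to_frequency(b), 0.4))]
    samples.extend(mixed)

with wave.open("sonification_example.wav", "w") as out:
    out.setnchannels(1)
    out.setsampwidth(2)
    out.setframerate(SAMPLE_RATE)
    out.writeframes(struct.pack("<" + "h" * len(samples), *samples))
```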

Sonification and decision support

Sonification can also be used in relation to automation and decision support. In air traffic control, for example, sonification can provide peripheral information that the air traffic controller does not have available on the visual display. Sonification could also provide auditory information about, for example, the status of different machines. Technicians and operators used to get this auditory information directly on the machine shop floor, but it is not available in today's quiet control rooms. In these examples, sonification could create a soundscape that keeps status information constantly available in the background, without forcing the operator to shift focus away from ongoing work tasks.
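A sketch of what such a background soundscape mapping could look like, assuming hypothetical machine status values normalised to 0..1; the parameter ranges and mapping directions are illustrative choices, not a design from our projects.

```python
def status_to_drone(load, temperature):
    """Map hypothetical machine status values (both 0..1) to the parameters of
    a continuous background drone: higher load gives a denser texture (more
    overtones), higher temperature raises the pitch slightly, and the level
    always stays low enough to remain peripheral."""
    base_hz = 110.0 * (1.0 + 0.2 * temperature)   # gentle pitch shift
    overtones = 2 + round(4 * load)               # 2..6 partials
    amplitude = 0.15 + 0.1 * load                 # quiet, background level
    return {"base_hz": base_hz, "overtones": overtones, "amplitude": amplitude}

# Example: a machine at 75 % load, moderately warm.
print(status_to_drone(0.75, 0.4))
```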

Sonification and interaction

When a user interacts with a machine, sonification can be used to provide information to the user. This information can be conveyed through short auditory feedback, so-called interface sounds, or through changes in a more holistic soundscape. In this way, sonification can either directly affect the behavior of the user or create an immersive environment that reinforces the user's experience of the interaction.
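As an illustration, the sketch below defines a few hypothetical interface sounds ("earcons") as short note motifs and converts them to frequencies that a synthesiser could play when the corresponding interaction event occurs. The events and motifs are invented for the example.

```python
# Hypothetical interaction events mapped to short motifs, written as
# (MIDI note number, duration in seconds) pairs.
EARCONS = {
    "confirm": [(72, 0.08), (76, 0.12)],   # rising major third: action accepted
    "error":   [(64, 0.08), (59, 0.20)],   # falling interval: something went wrong
    "notify":  [(79, 0.05)],               # single short blip: background notification
}

def midi_to_hz(note):
    """Convert a MIDI note number to a frequency in Hz (A4 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def earcon(event):
    """Return the (frequency, duration) pairs to hand to a synthesiser."""
    return [(midi_to_hz(note), duration) for note, duration in EARCONS[event]]

print(earcon("confirm"))
```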

Musical elements in sonification

Basic research in sonification includes exploring and evaluating the use of musical elements and structures, such as harmonics, pitch, amplitude, and tempo. We investigate which musical elements are applicable in sonification and how they are experienced by users. Furthermore, we study which visual or interactive elements are suitable for sonification, and which mappings between these elements and changes in the musical elements give the best results and user experience.
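A minimal sketch of one such mapping, under assumed parameter ranges: one data dimension is quantised to a pentatonic scale so that the result stays musical rather than becoming a continuous pitch sweep, while two further dimensions drive loudness and tempo. The scale, ranges, and mapping directions are illustrative assumptions, not results from our evaluations.

```python
PENTATONIC = [0, 2, 4, 7, 9]   # scale steps within one octave, in semitones

def to_scale_note(value, root_midi=60, octaves=2):
    """Quantise a normalised value (0..1) to a note in a pentatonic scale."""
    steps = len(PENTATONIC) * octaves
    index = min(int(value * steps), steps - 1)
    octave, degree = divmod(index, len(PENTATONIC))
    return root_midi + 12 * octave + PENTATONIC[degree]

def map_data_point(x, y, z):
    """Map one normalised (x, y, z) data point to pitch, amplitude, and tempo."""
    return {
        "midi_note": to_scale_note(x),
        "amplitude": 0.1 + 0.8 * y,        # quiet .. loud
        "beats_per_minute": 60 + 90 * z,   # slow .. fast
    }

print(map_data_point(0.5, 0.3, 0.8))
```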

Publications

2024

Niklas Rönnberg, Ahmet Börütecene (2024) Use of Generative AI for Fictional Field Studies in Design Courses. Adjunct Proceedings of the 2024 Nordic Conference on Human-Computer Interaction, Article 23 (Conference paper)
Elias Elmquist, Malin Ejdbo, Alexander Bock, David S. Thaler, Anders Ynnerman, Niklas Rönnberg (2024) Birdsongification: Contextual and Complementary Sonification for Biology Visualization. Proceedings of the International Conference on Auditory Display, p. 34-41 (Conference paper)
Kajetan Enge, Elias Elmquist, Valentina Caiola, Niklas Rönnberg, Alexander Rind, Michael Iber, Sara Lenzi, Fangfei Lan, Robert Höldrich, W. Aigner (2024) Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization. Computer Graphics Forum, Vol. 43, Article e15114 (Article in journal)
Niklas Rönnberg (2024) Where Visualization Fails, Sonification Speaks
Ivar Gorenko, Lonni Besançon, Camilla Forsell, Niklas Rönnberg (2024) Supporting Astrophysical Visualization with Sonification