Thursday, February 14, 2013

Project Idea

Combining the knowledge and insights gained from the research conducted so far, a first concrete project idea has been formed.

At this point, the plan is for the final project to be a virtual reality installation for viewing and experiencing, potentially, the entire World Bank database, especially by means of data sonification. Data sets from the World Bank's database will be imported into the game engine Unity to create stereoscopic 3D visuals and graphs. Through the World Bank's Developer API, potentially the entire database can be integrated into the project. Furthermore, sonification and sound design for the data will be created and implemented in Unity using the FMOD sound engine. When the installation runs at the 3D Lab at Hochschule Darmstadt, head tracking combined with a Logitech 7.1 surround headset will be used to create an even more immersive binaural sound experience. The interface between a Vicon tracking system and Unity, focused on creating an interactive head-related transfer function, has already been built and is ready to use. However, the installation will not depend on the Vicon tracking system and will work with any surround or stereo sound system.
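To give an impression of the data-import side, here is a minimal Python sketch of fetching an indicator through the public World Bank API. The endpoint and response shape follow the API's documented JSON format (a two-element array of metadata plus observations); the helper names are my own, and the parsed pairs would then be handed to Unity, e.g. one particle per year.

```python
import json
from urllib.parse import urlencode

# Base endpoint of the public World Bank API (v2).
WORLDBANK_API = "https://api.worldbank.org/v2"

def build_indicator_url(country: str, indicator: str, per_page: int = 1000) -> str:
    """Build a request URL for one indicator, e.g. total population."""
    query = urlencode({"format": "json", "per_page": per_page})
    return f"{WORLDBANK_API}/country/{country}/indicator/{indicator}?{query}"

def parse_observations(payload: str) -> list[tuple[str, float]]:
    """Extract (year, value) pairs, skipping years with no reported data."""
    _meta, rows = json.loads(payload)
    return [(row["date"], float(row["value"]))
            for row in rows if row["value"] is not None]

# A miniature example payload in the API's documented shape:
sample = ('[{"page": 1}, [{"date": "2011", "value": 6964618142},'
          ' {"date": "2010", "value": null}]]')
print(parse_observations(sample))  # → [('2011', 6964618142.0)]
```

The null-skipping matters in practice: many World Bank indicators have gaps, and a sonification must not invent values for missing years.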



The visual representation of the data will be relatively straightforward, using standard data visualization methods. A major focus will lie on the data sonification in combination with the visuals, aiming not only to increase the immersion of the virtual reality installation, but most importantly to improve the workflow and usability of the installation, as well as the quality of the data representation. To combine classic data work with the new audio-visual VR installation, the installation will be controlled through a tablet device, so users can still view and compare hard numbers and spreadsheets. While fully immersed in the abstract data, users can thus always fall back on classic representations and numbers.




Sunday, February 10, 2013

Unity, XML and Particles

To continue the technical research on how to visualize big data sets in Unity (see the previous blog post), different visualization styles were examined.

My project's supervisor Thorsten Fröhlich pointed me towards a brilliant tutorial on creating graphs in Unity with particles on 'Catlike Coding'. This tutorial is extremely detailed, really clear and absolutely stunning. It can be found here. Many thanks to the website's host Jasper Flick!

By combining the ideas gained from experimenting with this tutorial with my latest Unity project, which imported and parsed a large *.XML file from the World Bank's database, a beautiful explorable particle cloud emerged:




Though this visual representation is mere eye candy and not really useful or readable in any way, it is a good starting point for developing visual representations and graphs from the World Bank's XML files in Unity.
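The XML-to-particles step can be sketched outside Unity as well. The snippet below is an illustrative Python sketch, not the actual Unity project code: it assumes a simplified record layout in the World Bank's "wb" XML namespace and maps each record to a 2D point (x = year, y = scaled value), i.e. the data that would drive one particle each.

```python
import xml.etree.ElementTree as ET

# The World Bank's XML responses use a "wb" namespace; the element names
# below follow that convention but are simplified assumptions for this sketch.
NS = {"wb": "http://www.worldbank.org"}

def xml_to_points(xml_text: str, scale: float = 1e-9) -> list[tuple[float, float]]:
    """Map each <wb:data> record to a point: x = year, y = scaled value."""
    root = ET.fromstring(xml_text)
    points = []
    for record in root.findall("wb:data", NS):
        year = record.findtext("wb:date", namespaces=NS)
        value = record.findtext("wb:value", namespaces=NS)
        if year and value:  # skip records with missing or empty values
            points.append((float(year), float(value) * scale))
    return points

sample = """<wb:data xmlns:wb="http://www.worldbank.org">
  <wb:data><wb:date>2010</wb:date><wb:value>6916183482</wb:value></wb:data>
  <wb:data><wb:date>2011</wb:date><wb:value></wb:value></wb:data>
</wb:data>"""
for x, y in xml_to_points(sample):
    print(x, y)  # each point would drive one particle's position in Unity
```

In the Unity version the same loop runs in C# and writes the points into a particle array; the scaling factor is needed because raw indicator values (e.g. world population) are far outside a usable scene scale.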


Since the FMOD sound engine is integrated into Unity and documented in detail, technical difficulties in creating the sound design and sonification design will probably be minor. In terms of sonification methods and design, the focus can therefore remain on the conceptual layer for now.



Time Schedule


To organize the tasks and activities, the execution phase of the master's work has been divided into five months. Two main activities have to be completed: the technical implementation of the data and the mobile app remote control in Unity, and the sonification design for the data. The technical implementation will start with a single data set. In the end, possibly the entire database of the World Bank Institute will be implemented; the fallback, however, is the implementation of several important data sets. Data visuals for Unity and the tablet device have to be developed, as well as the graphical interaction interface for the Unity application.
A version for a single machine with its own interaction GUI is also planned, but its implementation depends on the progress of the project during the working phase. From the third month on, at least two usability tests will be held to keep the work user-oriented. Depending on the input and the time needed to implement the results, the second usability test will be held at the beginning of the fourth or the fifth month.
Sonification and sound design will be developed simultaneously throughout the entire five months and evaluated in both usability tests, as this part has the highest priority besides the technical implementation.
A Gantt Chart has been created to clearly set the tasks and deadlines for each month.


Wednesday, February 6, 2013

Sonification: A Definition

The term 'sonification' is quite abstract and not widely known, even to most auto-correct software :-) It is often misunderstood or used in the wrong context. To get a clear understanding of what sonification is and (even more importantly) what it is not, several definitions and articles discussing sonification in general have been studied.
First of all, the unscientific but commonly used and beloved source Wikipedia describes sonification as "the use of non-speech audio to convey information or perceptualize data", referencing one of the most famous books in the field, Gregory Kramer's "Auditory Display: Sonification, Audification, and Auditory Interfaces". This definition is fairly similar to the definition of auditory display itself, which is essentially the transmission of information through sound.
The paper "Taxonomy and Definitions for Sonification and Auditory Display" by Thomas Hermann goes deeper into the term's meaning and tries to define 'sonification' in more detail:
"A technique that uses data input, and generates sound signals (eventually in response to additional excitation or triggering) may be called sonification, if and only if
  • The Sound reflects objective properties or relations in the input data
  • The transformation is systematic. This means that there is a precise definition provided of how the data (and optional interactions) cause the sound to change
  • The sonification is reproducible: given the same data and identical interactions (or triggers) the resulting sound has to be structurally identical
  • The system can intentionally be used with different data, and also used in repetition with the same data."
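These four conditions can be made concrete with a minimal parameter-mapping sketch. The linear value-to-pitch rule below is my own illustrative choice, not taken from Hermann's paper, but it satisfies all four points: the sound reflects the data, the transformation is precisely defined, it is reproducible, and it works with any data set.

```python
def sonify(data: list[float], f_min: float = 220.0, f_max: float = 880.0) -> list[float]:
    """Map each datum linearly onto a frequency range (in Hz).

    The mapping is systematic (a precise, documented rule), reflects an
    objective property of the input (larger value -> higher pitch), is
    reproducible (same data -> identical frequencies), and can be used
    with arbitrary data sets -- Hermann's four conditions.
    """
    lo, hi = min(data), max(data)
    span = hi - lo or 1.0  # avoid division by zero for constant data
    return [f_min + (x - lo) / span * (f_max - f_min) for x in data]

print(sonify([0.0, 0.5, 1.0]))  # → [220.0, 550.0, 880.0]
```

An arbitrary, mood-driven choice of pitches for the same data would fail the "systematic" and "reproducible" conditions and would fall on the music side of the divide discussed below.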

This clearly underlines the mathematical aspect of sonification, stressing the fact that the term describes a scientific method and is indeed not an artistic approach towards data:
"Being a scientific method, a prefix like in 'scientific sonification' is not necessary. Same as some data visualizations may be 'viewed' as art, sonifications may be heard as 'music', yet this use differs from the original intent."
This is a definitive statement and clearly separates data sonification from many musical projects working with scientific data, though the line between artistic music projects and actual data sonification projects can be blurry. The difference between a visual piece of art related to or created from scientific data and an actual visual graph that clearly represents numbers is similar to the difference between music and sonification, and makes it easier to differentiate the two:
"This automatically distinguishes sonification from music, where the purpose is not in the precise perception of what interactions are done with an instrument or what data caused the sound, but on an underlying artistic level that operates on a different level."
So though both music and sonification belong to the family of 'organized sounds', they have different purposes and functions, which clearly separates them from each other. This actually excludes the musical data projects mentioned in a previous post from data sonification.

It is highly important to keep these definitions and differences in mind when creating and designing the sounds and interactions for a sonification project. During the creative process, the danger of drifting off into a more musical and artistic approach, rather than a scientific sonification, is quite high and has to be recognized sooner rather than later.


References 

Kramer, G. (1994) Auditory Display: Sonification, Audification, and Auditory Interfaces. Reading, Mass.: Addison-Wesley.


Hermann, T. (2008) Taxonomy and Definitions for Sonification and Auditory Display. Paris: International Conference for Auditory Display.

Monday, February 4, 2013

Thomas Hermann

A well-known researcher and developer in the area of auditory display and data sonification is Thomas Hermann. Much of his research has been conducted at Bielefeld University, covering the topics of sonification, human-computer interaction and cognitive interaction technology; he also holds a Ph.D. in computer science. He is furthermore a member of the ICAD Board of Directors.



Many of his papers are highly relevant to this project's research and have already been studied, such as "Taxonomy and Definitions for Sonification and Auditory Display" or "Listen to your Data: Model-Based Sonification for Data Analysis" (see here).

Probably his most important written work on sonification is the book "The Sonification Handbook", published together with Andy Hunt and John G. Neuhoff, which gives "a comprehensive introductory presentation of the key research areas in the interdisciplinary fields of sonification and auditory display".




The book can be purchased on Amazon, and a free PDF version is also available for download. The content of this book is vital to successfully creating any sonification project and will play a major part in both the research and the practical phase of the project.

Furthermore, Thomas Hermann hosts the website Sonification.de, which gives a great overview of the topic of sonification and auditory display, as well as his personal details and research.



References

Amazon.de (n.d.) The Sonification Handbook. [online] Available at: http://www.amazon.de/The-Sonification-Handbook-Thomas-Hermann/dp/3832528199 [Accessed: 3 Feb 2013].
Sonification.de (2010) sonification.de. [online] Available at: http://sonification.de/ [Accessed: 3 Feb 2013].

Tuesday, January 29, 2013

Sonification Research

Theoretical research on sonification has been conducted during the last weeks. Papers from various ICAD conferences, as well as other scientific papers on related topics, have been examined.

The papers examined for this research include, among others:

Taxonomy and Definitions for Sonification and Auditory Display by Thomas Hermann [PDF]

ABSTRACT
Sonification is still a relatively young research field and many terms such as sonification, auditory display, auralization, audification have been used without a precise definition. Recent developments such as the introduction of Model-Based Sonification, the establishment of interactive sonification and the increased interest in sonification from arts have raised the need to revisit the definitions in order to move towards a clearer terminology. This paper introduces a new definition for sonification and auditory display that emphasizes the necessary and sufficient conditions for organized sound to be called sonification. It furthermore suggests a taxonomy, and discusses the relation between visualization and sonification. A hierarchy of closed-loop interactions is furthermore introduced. This paper aims to initiate vivid discussion towards the establishment of a deeper theory of sonification and auditory display.


An Interface and Framework Design for Interactive Aesthetic Sonification by Beilharz Kirsty and Ferguson Samuel [PDF]

ABSTRACT
This paper describes the interface design of our AeSon (Aesthetic Sonification) Toolkit motivated by user-centred customisation of the aesthetic representation and scope of the data. The interface design is developed from 3 premises that distinguish our approach from more ubiquitous sonification methodologies. Firstly, we prioritise interaction both from the perspective of changing scale, scope and presentation of the data and the user's ability to reconfigure spatial panning, modality, pitch distribution, critical thresholds and granularity of data examined. The user, for the majority of parameters, determines their own listening experience for real-time data sonification, even to the extent that the interface can be used for live data-driven performance, as well as traditional information analysis and examination. Secondly, we have explored the theories of Tufte, Fry and other visualization and information design experts to find ways in which principles that are successful in the field of information visualization may be translated to the domain of sonification. Thirdly, we prioritise aesthetic variables and controls in the interface, derived from musical practice, aesthetics in information design and responses to experimental user evaluations to inform the design of the sounds and display. In addition to using notions of meter, beat, key or modality and emphasis drawn from music, we draw on our experiments that evaluated the effects of spatial separation in multivariate data presentations.


Towards an Auditory Representation of Complexity by Joachim Goßmann [PDF]

ABSTRACT 
In applications of sonification, the information inferred by the sonification strategy applied often supersedes the amount of information which can be retrieved by the ear about the object of sonification. This paper focuses on the representation of complex geometric formation through sound, drawing on the development of an interactive installation sonifying escape time fractals as an example. The terms “auditory emergence and formation” are introduced and an attempt is made to interpret them for music composition, data display and information theory. The example application, “Audio Fraktal”, is a public installation in the permanent exhibition of the Museum for Media Art at ZKM, Karlsruhe. The design of the audiovisual display system that allows the shared experience of interactive spatial auditory formation is described. The work was produced by the author at the Institute for Music and Acoustics at ZKM, Karlsruhe.



Methods for Visual Mining of Data in Virtual Reality by Henrik R. Nagel, Erik Granum, and Peter Musaeus [PDF]

ABSTRACT
Recent advances in technology have made it possible to use 3-D Virtual Reality for Visual Data Mining. This paper presents a modular system architecture with a series of tools for explorative analysis of large data sets in Virtual Reality. A 3-D Scatter Plot tool is extended to become an "Object Property Space", where data records are visualized as objects with as many statistical variables as possible represented as object properties like shape, color, etc. A working hypothesis is that the free and real-time navigation of the observer in the immersive virtual space will support the chances of finding interesting data structures and relationships. The system is now ready to be used for experiments to validate the hypothesis.



Listen to your Data: Model-Based Sonification for Data Analysis by T. Hermann and H. Ritter [PDF]

ABSTRACT
Sonification is the use of non-speech audio to convey information. We are developing tools for interactive data exploration, which make use of sonification for data presentation. In this paper, model-based sonification is presented as a concept to design auditory displays. Two designs are described: (1) particle trajectories in a “data potential” is a sonification model to reveal information about the clustering of vectorial data and (2) “data-sonograms” is a sonification for data from a classification problem to reveal information about the mixing of distinct classes.

Resulting insights and ideas will be summarized during the following days.




References


Beilharz, K. and Ferguson, S. (2009) An Interface and Framework Design for Interactive Aesthetic Sonification. Copenhagen: International Conference for Auditory Display.
Goßmann, J. (2005) Towards an Auditory Representation of Complexity. Limerick, Ireland: International Conference for Auditory Display.
Hermann, T. (2008) Taxonomy and Definitions for Sonification and Auditory Display. Paris: International Conference for Auditory Display.
Hermann, T. and Ritter, H. (n.d.) Listen to your Data: Model-Based Sonification for Data Analysis. Bielefeld: Department of Computer Science University of Bielefeld.
Nagel, H. et al. (n.d.) Methods for Visual Mining of Data in Virtual Reality. Aalborg, Denmark: Lab. of Computer Vision and Media Technology, Aalborg University, Denmark.