Thursday, February 14, 2013

Project Idea

Combining all the knowledge and findings from the research conducted so far, a first concrete project idea has taken shape.

At this point, the final project will be a virtual reality installation for viewing and experiencing possibly the entire World Bank database, especially by means of data sonification. This will be achieved by importing data sets from the World Bank's database into the game engine Unity to create stereoscopic 3D visuals and graphs. Through the World Bank's Developer API, possibly the entire database can be made available to the project. Furthermore, sonification and sound design for the data will be created and implemented in Unity using the FMOD sound engine. When the installation is used at the 3D Lab at Hochschule Darmstadt, head tracking in combination with a Logitech 7.1 surround headset will be employed to create an even more immersive binaural sound experience. The interface between a Vicon tracking system and Unity, built with a focus on creating an interactive head-related transfer function, has already been completed and is ready to use. However, the installation will not depend on the Vicon tracking system and will work with any surround or stereo sound system.
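As a first technical sketch of the data import, the following Python example parses the XML response format of the World Bank API. This is purely illustrative and not the project's actual code (which lives in Unity/C#); the namespace URI and the sample values are assumptions modelled on the public API documentation:

```python
import xml.etree.ElementTree as ET

# The World Bank API serves XML in the "wb" namespace (assumption
# based on the public API docs, not verified against the project).
WB_NS = {"wb": "http://www.worldbank.org"}

# Hypothetical sample of an indicator response (population totals).
SAMPLE = """<wb:data xmlns:wb="http://www.worldbank.org">
  <wb:data>
    <wb:country id="DE">Germany</wb:country>
    <wb:date>2010</wb:date>
    <wb:value>81776930</wb:value>
  </wb:data>
  <wb:data>
    <wb:country id="DE">Germany</wb:country>
    <wb:date>2011</wb:date>
    <wb:value>80274983</wb:value>
  </wb:data>
</wb:data>"""

def parse_indicator(xml_text):
    """Extract (country, year, value) triples from a World Bank XML response."""
    root = ET.fromstring(xml_text)
    rows = []
    for entry in root.findall("wb:data", WB_NS):
        country = entry.find("wb:country", WB_NS).text
        year = int(entry.find("wb:date", WB_NS).text)
        raw = entry.find("wb:value", WB_NS).text
        value = float(raw) if raw else None  # missing values come back empty
        rows.append((country, year, value))
    return rows

print(parse_indicator(SAMPLE))
```

The same element-by-element traversal carries over to Unity's C# XML readers; the point is only that each data point reduces to a small, typed tuple before visualization or sonification.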



The visual representation of the data will be relatively straightforward, using standard data visualization methods. A major focus will lie on the data sonification in combination with the visuals, aiming not only to increase the immersion of the virtual reality installation, but most importantly to improve the workflow and usability of the installation, as well as the quality of the data representation. To combine classic data work with the new audio-visual VR data installation, the installation will be controlled through a tablet device, so users are still able to view and compare hard numbers and sheets. While entirely immersed in the abstract data, users can thus always fall back on classic representations and numbers.




Sunday, February 10, 2013

Unity, XML and Particles

To continue the technical research on how to visualize big data sets in Unity (see the previous blog post), different visualization styles were examined.

My project's supervisor Thorsten Fröhlich pointed me towards a brilliant tutorial on creating graphs in Unity with particles on 'Catlike Coding'. This tutorial is extremely detailed, really clear and absolutely stunning. It can be found here. Many thanks to the website's host, Jasper Flick!

By combining the ideas gained from experimenting with this tutorial with my latest Unity project, which imported and read a large *.XML file from the World Bank's database, a beautiful particle cloud emerged that can be explored:




Though this visual representation is mere eye candy and not really useful or readable in any way, it is a good starting point for developing visual representations and graphs from the World Bank's XML files in Unity.
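To hint at how such a cloud could become a readable graph, here is a minimal, hypothetical sketch in Python (the project itself uses Unity's particle system) that maps (country, year, value) rows onto normalized positions in a unit cube. The axis assignment and linear scaling are assumptions chosen for illustration, not the project's actual mapping:

```python
def normalize(values):
    """Scale a list of numbers linearly into the range [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [(v - lo) / span for v in values]

def to_positions(rows):
    """Map (country, year, value) rows to (x, y, z) particle positions:
    year -> x, value -> y, country index -> z."""
    countries = sorted({c for c, _, _ in rows})
    xs = normalize([year for _, year, _ in rows])
    ys = normalize([value for _, _, value in rows])
    denom = max(len(countries) - 1, 1)
    return [(x, y, countries.index(c) / denom)
            for (c, _, _), x, y in zip(rows, xs, ys)]

positions = to_positions([("Germany", 2010, 81.8), ("France", 2010, 65.0),
                          ("Germany", 2011, 80.3)])
print(positions)
```

Each tuple can then be assigned directly to a particle's position, which turns the arbitrary cloud into a comparable time-by-value-by-country layout.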


Since the FMOD sound engine is integrated into Unity and documented in detail, technical difficulties in creating the sound design and sonification design will probably be minor, so in terms of sonification methods and design the focus can remain on the conceptual layer for now.



Time Schedule


To organize the tasks and activities, the execution phase of the master's work has been divided into five months. Two main activities have to be completed: the technical implementation of the data and the mobile remote-control app in Unity, and the sonification design for the data. The technical implementation will start with a single data set. In the end, possibly the entire database of the World Bank Institute will be implemented; a fallback, however, is the implementation of several important data sets. Data visuals for Unity and the tablet device have to be developed, as well as the graphical interaction interface for the Unity application.
A version for a single machine with its own interaction GUI is also planned, but its implementation depends on the progress of the project during the working phase. From the third month on, at least two usability tests will be held to keep the work user-oriented. Depending on the input and the time needed to implement the results, the second usability test will be held at the beginning of the fourth or the fifth month.
Sonification and sound design will be developed simultaneously during the entire five months and evaluated in both usability tests, as this part has the main priority besides the technical implementation.
A Gantt chart has been created to clearly define the tasks and deadlines for each month.


Wednesday, February 6, 2013

Sonification: A Definition

The term 'sonification' is quite abstract and not really known to a broad majority, even including most auto-correct software :-) It is often misunderstood or used in the wrong context. To get a clear understanding of what sonification is and (even more importantly) what it is not, several definitions and articles discussing sonification in general have been studied.
First of all, the not scientific but quite commonly used and beloved source Wikipedia describes sonification as "the use of non-speech audio to convey information or perceptualize data", referencing one of the most famous books in the field, Gregory Kramer's "Auditory Display: Sonification, Audification, and Auditory Interfaces". This definition is fairly similar to the definition of auditory display itself, which essentially is the transport of information through sound.
Thomas Hermann's paper "Taxonomy and Definitions for Sonification and Auditory Display" goes deeper into the term's meaning and tries to define 'sonification' in more detail:
"A technique that uses data input, and generates sound signals (eventually in response to additional excitation or triggering) may be called sonification, if and only if
  • The Sound reflects objective properties or relations in the input data
  • The transformation is systematic. This means that there is a precise definition provided of how the data (and optional interactions) cause the sound to change
  • The sonification is reproducible: given the same data and identical interactions (or triggers) the resulting sound has to be structurally identical
  • The system can intentionally be used with different data, and also used in repetition with the same data."
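These four criteria can be made concrete with a tiny parameter-mapping sketch in Python. This is a deliberately minimal illustration, not part of Hermann's definition: the linear pitch mapping and the frequency range are arbitrary assumptions, but the mapping itself is objective, systematic, reproducible, and reusable with any numeric data:

```python
def sonify(values, f_min=220.0, f_max=880.0):
    """Map data values linearly to frequencies in hertz.

    - objective: each pitch reflects the value's position in the data range
    - systematic: the mapping formula is precisely defined
    - reproducible: the same input always yields the same frequencies
    - reusable: any numeric data set can be passed in
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # constant data maps everything to f_min
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

print(sonify([0, 5, 10]))  # three evenly spaced pitches, one octave apart at the ends
```

An artistic remix of the same data would fail the reproducibility criterion the moment the composer adjusts a note by ear, which is exactly the boundary Hermann draws.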

This clearly underlines the mathematical aspect of sonification, stressing the fact that the term describes a scientific method and is indeed not an artistic approach towards data:
"Being a scientific method, a prefix like in 'scientific sonification' is not necessary. Same as some data visualizations my be 'viewed' as art, sonifications may be heard as 'music', yet this use differs from the original intent."
This is a definite statement and clearly separates data sonification from many musical projects working with scientific data, though the line between artistic music projects and actual data sonification projects might be a bit blurry. The difference between a visual piece of art related to or created from scientific data and an actual visual graph that clearly represents numbers is similar to the difference between music and sonification, and makes it easier to differentiate:
"This automatically distinguishes signification from music, where the purpose is not in the precise perception of what interactions are done with an instrument or what data caused the sound, but on an underlying artistic level that operates on a different level."
So though both music and sonification belong to the family of 'organized sounds', they have different purposes and functions which clearly separate them from each other. This actually excludes the musical data projects mentioned in a previous post from data sonification.

It is highly important to keep these definitions and differences in mind when creating and designing the sounds and interactions for a sonification project, as the danger of drifting off into a more musical and artistic approach, rather than a scientific sonification, during the creative process is quite high and has to be recognized sooner rather than later.


References 

Kramer, G. (1994) Auditory Display: Sonification, Audification, and Auditory Interfaces. Reading, Mass.: Addison-Wesley.


Hermann, T. (2008) Taxonomy and Definitions for Sonification and Auditory Display. Paris: International Conference for Auditory Display.

Monday, February 4, 2013

Thomas Hermann

A well-known researcher and developer in the area of auditory display and data sonification is Thomas Hermann. Much of his research has been conducted at Bielefeld University, where he also earned a Ph.D. in computer science, and covers the topics of sonification, human-computer interaction and cognitive interaction technology. He is furthermore a member of the ICAD Board of Directors.



Many of his papers are highly relevant for this project's research and have already been studied, such as "Taxonomy and Definitions for Sonification and Auditory Display" or "Listen to Your Data: Model-Based Sonification for Data Analysis" (see here).

Probably his most important written work on sonification is the book "The Sonification Handbook", published together with Andy Hunt and John G. Neuhoff, which gives "a comprehensive introductory presentation of the key research areas in the interdisciplinary fields of sonification and auditory display".




The book can be purchased on Amazon, and a free download version is also available as a PDF. The content of this book is vital to successfully creating any sonification project and will play a major part in both the research and the practical phase of the project.

Furthermore, Thomas Hermann hosts the website Sonification.de, which gives a great overview of the topic of sonification and auditory display, as well as presenting his personal details and research.



References

Amazon.de (n.d.) The Sonification Handbook. [online] Available at: http://www.amazon.de/The-Sonification-Handbook-Thomas-Hermann/dp/3832528199 [Accessed: 3 Feb 2013].
Sonification.de (2010) sonification.de. [online] Available at: http://sonification.de/ [Accessed: 3 Feb 2013].