
Auralising data - listening to the universe

The challenge

Finding patterns in large datasets is difficult, and data visualisations are commonly used to assist interpretation. Typically, we rely on sight to make sense of data. What new insights can sound offer? We worked on a sonification of the universe to help us find out.

In this article

  • Sonifying galaxies merging for the British Science Festival
  • The process of data sonification, and what it can teach us about data interpretation
  • Listen to our sonification of stars appearing

A toolkit designed to help us ‘listen’ to data is being developed for Arup SoundLabs around the world. According to Sydney-based sound designer Mitchell Allen, the process of ‘sonification’ offers a new way to interact with data and experience its potential range of meanings.

 

Unlike sight, which tends to dominate our attention, hearing carries on in the background twenty-four hours a day, seven days a week. The quiet strength of audio offers interesting possibilities for analysing large or continuous datasets through data sonification: turning data into sound as a way to identify and understand patterns.

Listening allows us to pick up rhythmic elements, to interrogate and diagnose data in ways that can lead to new conclusions, and it allows us to peripherally monitor data more easily and efficiently than sight. - Mitch Allen

 

One example is the condensing of massive amounts of data into short, immersive experiences, such as some of the content used in Beyond Perception. Another is peripheral monitoring, like the ‘beep’ of a heartbeat monitor that allows a surgeon to track the wellbeing of the patient whilst they concentrate intently on the separate task of the operation.

 

Data sonification also makes data accessible to a wider cohort, particularly the estimated 575,000 Australians and 250 million people globally with moderate to severe vision impairment.

 

But easily accessible tools for data sonification are relatively rare. So, when he was asked to work on ‘A dark tour of the universe’ for the British Science Festival, Mitch jumped at the chance to use the project as an opportunity to create a toolkit for sonification.

 

“We wanted to move to a point where our digital teams, or anyone else, could interrogate data sets, to be able to plug in and get a different perspective on their data. We developed the toolkit using the British Science Festival as a proof of concept.”

 

Working with astrophysicist Chris Harrison and astronomer Nicolas Bonne, who is visually impaired himself, the team set about turning astrophysics datasets into sound, with a specific focus on communicating the concepts effectively for people who are blind or visually impaired. This focus meant paying close attention to making the sounds they created fit for purpose.

 

Mitch worked closely with Kim Jones and vacation student Kashlin McCutcheon to sonify the data. “It helped that Kit [Kashlin] has a background in astrophysics and music production, as he understood both the data and how to implement it aurally.”

3D models created by Nicolas Bonne. Photo © ESO/ M. Zamani

The team sonified nine concepts, including stars appearing in the night sky, a solar system orbit, galaxies merging and dark matter. For the 'stars appearing' element, they used a glockenspiel to convey stars appearing in the night sky as they would at the ESO observatory on the night of the British Science Festival, using time to describe relative brightness: the brighter the star, the sooner it appears in the night sky, and vice versa.

 

They sent versions to Chris and Nicolas, who provided feedback on where tweaks were needed. For example, at one point Chris said the variable star brightness was not being conveyed clearly enough, so they incorporated white noise and an additional filter to make the sound better fit the concept.
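To make that tweak concrete, here is a minimal sketch in Python of filtered white noise whose level follows a fluctuating brightness curve. The filter cutoff, modulation rate and sample values are illustrative assumptions, not the team's actual settings.

```python
# Illustrative sketch: band-limited noise that gets louder as a variable star brightens.
import numpy as np
from scipy.signal import butter, lfilter

SAMPLE_RATE = 44100
DURATION_S = 5.0
t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE

noise = np.random.randn(t.size)                          # white noise source
b, a = butter(4, 2000 / (SAMPLE_RATE / 2), btype="low")  # low-pass filter at 2 kHz
shaped = lfilter(b, a, noise)

brightness = 0.5 + 0.5 * np.sin(2 * np.pi * 0.7 * t)     # slowly varying brightness, 0..1
signal = shaped * brightness                             # louder when the star is brighter
```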

The merging galaxies, stars appearing and Trappist sounded particularly amazing inside the circle of speakers... The highlight of the day was towards the end when 14 people from Coventry Resource Centre for the Blind came all together in a bus to hear and touch (not ‘see’) the show. - Chris Harrison

If you would like to listen to the stars, click here. The sounds represent each star, with the brightest stars appearing first and the pitch of each sound representing the colour of the star. The position of each sound also corresponds to the star's real position in the night sky. Time has been sped up.
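As a purely illustrative sketch of that mapping (not the team's actual toolkit), the Python snippet below renders a handful of made-up stars as short tones: apparent magnitude sets when each tone starts, colour index sets its pitch, and the result is written to a mono WAV file. The catalogue values, frequency range and envelope are assumptions for demonstration; the spatial positioning the team also used would need a multichannel or binaural renderer rather than the mono file written here.

```python
# Illustrative 'stars appearing' sketch: brighter stars sound sooner, bluer stars sound higher.
import numpy as np
import wave

SAMPLE_RATE = 44100
DURATION_S = 10.0        # total length of the sped-up 'night'
TONE_S = 0.5             # length of each star's tone

# Hypothetical catalogue: (apparent magnitude, colour index B-V)
stars = [(0.5, 0.0), (1.8, 0.65), (3.2, 1.2), (2.4, -0.1), (4.0, 0.4)]
mags = np.array([m for m, _ in stars])
colours = np.array([c for _, c in stars])

# Brightness -> onset time: lower magnitude (brighter star) plays earlier.
onsets = (mags - mags.min()) / (mags.max() - mags.min()) * (DURATION_S - TONE_S)

# Colour -> pitch: bluer stars (smaller B-V) map to higher frequencies.
freqs = np.interp(colours, [colours.min(), colours.max()], [880.0, 220.0])

t = np.arange(int(SAMPLE_RATE * TONE_S)) / SAMPLE_RATE
envelope = np.exp(-6.0 * t / TONE_S)   # percussive decay, loosely glockenspiel-like
out = np.zeros(int(SAMPLE_RATE * DURATION_S))

for onset, freq in zip(onsets, freqs):
    start = int(onset * SAMPLE_RATE)
    out[start:start + t.size] += np.sin(2 * np.pi * freq * t) * envelope

out = (out / np.abs(out).max() * 32767).astype(np.int16)
with wave.open("stars_appearing.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)             # 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(out.tobytes())
```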

Binaural recording session at the Sydney office SoundLab.

The team expanded their collaboration by checking in with Arup colleagues from our various offices to make sure they were building on the existing global knowledge base on the subject. Luckily, there was plenty to draw on: Shane Myrbeck worked on the NASA Orbit Pavilion and the Light House for the Blind and Visually Impaired, while Simon Jackson helped to develop COSMOS, and Terence Caulkins worked on the Hubble Cantata.

A toolkit for data sonification

To develop the data sonification toolkit, the team reviewed existing digital tools and selected a way forward based on technology used in our SoundLabs.

 

The SoundLabs are dedicated listening spaces: acoustically isolated, treated, and fitted with spatial sound systems so auralisations can be created as part of project delivery. Listening to designs communicates far more accessibly than technical reports for things like concert halls or noise mitigation options. - Mitch Allen

 

Leveraging existing infrastructure, the team built a series of bespoke scripts, or patches, to plug the data into the SoundLab and sonify it. The proof of concept was a success, working well on the complex astrophysics datasets. The next steps are further testing and refinement to see if the concept can be extended into a standardised data sonification toolkit.
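What such a script looks like depends on the SoundLab's own interface, but as a hedged illustration, the snippet below streams two columns of a dataset to a hypothetical spatial audio endpoint as OSC control messages. The host, port, OSC addresses and CSV column names are assumptions for the sketch, not Arup's actual setup.

```python
# Illustrative sketch: stream a dataset into a spatial audio system as OSC messages.
import csv
import time
from pythonosc import udp_client  # pip install python-osc

client = udp_client.SimpleUDPClient("127.0.0.1", 9000)  # hypothetical SoundLab endpoint

with open("dataset.csv", newline="") as f:
    for row in csv.DictReader(f):
        value = float(row["value"])          # the quantity being sonified (e.g. brightness)
        azimuth = float(row["azimuth_deg"])  # where to place it in the speaker array
        client.send_message("/sonify/level", value)
        client.send_message("/sonify/azimuth", azimuth)
        time.sleep(0.05)                     # playback rate: 20 data points per second
```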

The team's data visualisation for stars appearing in the sky.

“The idea is that anyone from any discipline at Arup that uses data analysis in their work, could walk into a SoundLab, plug in a set of data, press play and 'listen' to the data, rather than look at it. Hopefully it will provide an alternative perspective on data analysis and problem solving.”

In conjunction with the toolkit work, the British Science Festival collaboration with Chris and Nicolas may be extended to provide soundscapes for planetariums around the world. And ultimately, other market applications may emerge, such as ongoing monitoring systems for traffic operations centres and healthcare environments. Once we open our ears to the data, who knows what we will hear?

Findings

  • Complex datasets can be sonified through the SoundLab, eliciting new insights and understanding
  • Data sonification has potential applications in the transport planning, health and education sectors

Lead Arup Researcher

Mitchell Allen
Mitch is a sound specialist in our Acoustics team.

Ask Mitch about:

  • Soundscape Design
  • Arts and Engineering
  • Auralisation
  • Environmental / Transport Infrastructure Noise

Lead Partner Researcher

Chris Harrison
Chris is a research fellow at the European Southern Observatory in Garching, near Munich. His research areas are observational astronomy, galaxy dynamics, and supermassive black holes.

Research Team

Simon Bone
Senior Developer, Digital, Sydney office

Kim Jones
Consultant, Acoustics, Sydney office
