Intersensory data representations of a black hole

Dr. Kyle Keane from the MIT Quest for Intelligence helps scientists tune into new ways of perceiving and communicating about blackholes through intersensory data representations.
  • Black background with irregular white circle, centered on graph lines, surrounding an image of a black hole.
    Simulations of the light echoes off the accretion disk around a maximally spinning (“Kerr”) black hole. The white circle indicates the location of the black hole event horizon, and the echoes of light are color-coded by their observed frequency, which can be distorted by Doppler shifts and by the strong gravity of the black hole. The simulation has been sonified such that lower-frequency light corresponds to a lower-pitch sound.
    Animation computed by Michal Dovciak, ASU CAS.

Dr. Keane recently made some noise to raise awareness about black holes with Prof. Erin Kara (MIT Physics), Prof. Ian Condry (CMS/W, Spatial Sound Lab at MIT), Michal Dovciak (ASU CAS), and Rook Murao (audio-visual artist and researcher in the Spatial Sound Lab at MIT). Prof. Kara is a black hole physicist who recently discovered eight new sources of black hole echoes using a light-based reverberation (echo-mapping) technique. Typically, the data from this type of research is explored and communicated through visual representations that scientists are trained to interpret. The team decided to push the limits of interdisciplinary research and explore what might be possible if their highly diverse skills in music, physics, and learning came together to examine the phenomena from a new perspective.

Condry made the connection between Keane and Kara, who then set off on a year-long exploration of how to computationally convert Kara's light-based echo data into human-audible sound frequencies, producing a sonification. They iterated for months, tweaking the algorithms and parameters and going through hundreds of sound files, until Kara said, “I just love that we can ‘hear’ the general relativity in these simulations!” (as quoted by The New York Times).
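The article does not describe the team's actual algorithm, but the core idea of this kind of sonification is to map each observed light frequency into the audible range so that lower-frequency light becomes a lower pitch, as the caption above notes. Here is a minimal sketch of one such mapping, assuming a logarithmic (octave-preserving) scaling; all function names, the pitch range, and the example frequency band are illustrative, not the collaboration's parameters:

```python
import math

def map_to_pitch(light_freq_hz, light_lo, light_hi,
                 pitch_lo=220.0, pitch_hi=880.0):
    """Logarithmically map a light frequency into an audible pitch range.

    A frequency at light_lo maps to pitch_lo, light_hi maps to pitch_hi,
    and equal ratios in light frequency become equal musical intervals.
    """
    # Fraction of the way through the light band, measured in log space.
    t = (math.log(light_freq_hz) - math.log(light_lo)) / \
        (math.log(light_hi) - math.log(light_lo))
    return pitch_lo * (pitch_hi / pitch_lo) ** t

def sonify(light_freqs, light_lo, light_hi,
           sample_rate=44100, note_sec=0.25):
    """Render each light frequency as a short sine tone at its mapped pitch."""
    samples = []
    for f in light_freqs:
        pitch = map_to_pitch(f, light_lo, light_hi)
        for n in range(int(sample_rate * note_sec)):
            samples.append(0.5 * math.sin(2 * math.pi * pitch * n / sample_rate))
    return samples  # raw float samples in [-0.5, 0.5], ready to write to a WAV file

# Illustrative usage: a hypothetical band of observed frequencies.
tones = sonify([1e17, 1e18, 1e19], light_lo=1e17, light_hi=1e19)
```

The logarithmic mapping matters because both light frequency and perceived pitch span ratios, not differences: a Doppler shift that doubles the observed light frequency would be heard as a jump of exactly one octave.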

Keane says, "It is hard to know how to tell the story of how this all came together. It was just one of those magic moments that happens at the Spatial Sound Lab when you are sitting with a group of diverse, curious people, listening to experimental music, and talking about what might be possible; sometimes you just realize you have more in common with others than you might have ever imagined. As we talked more deeply, it became clear that every member of the collaboration had a specialization, but each also had a diverse set of interests developed through previous experiences." Condry had been searching around MIT for novel sounds to support Murao's experimental live-remixing AV performances when they came across Keane, who told them about his research into intersensory scientific representations through data sonification and tactification.

Keane is an intersensory perception scientist who teaches Principles and Practices of Assistive Technology at MIT. Keane has Retinitis Pigmentosa, a degenerative genetic eye condition. Knowing that his eyesight would eventually become unreliable inspired him to study data sonification, accessibility, and disability inclusion from the beginning of his career. Keane says, "I am not sure I would have ever been willing to pursue a PhD in Physics if it were not for role models like Prof. Bruce Walker, who runs the Sonification Lab at Georgia Tech, and forward-thinking companies like Wolfram Research, which opted to support extensive low-vision accessibility in its programming environment."

Keane hopes his work at the intersection of intelligence, disability advocacy, science, art, and design might help reveal new ways for systems with natural and machine intelligence to learn from each other. Work in disability advocacy and assistive technology has broad impact: it helps find effective ways to communicate across modalities and to learn from the unique experiences and stories we all have to tell, if someone can find a way to listen.