Multisensory data experiences
Big data leads quickly to information overload.
"For handle big data, solution is very simple: buy bigger monitor and use smaller font in the terminal."
— MySQL Borat (@mysqlborat), February 25, 2013
How can we deal with this?
I’ve recently been seeing a trend towards animated visual/audio data presentation experiences. I started toying with this a few years ago around Christmas time.
They’ve recently become more popular. My band made a music video of treasury data.
Küechenstudio made a podcast about some of the other recent advances in data music.
Combining data with music, specifically, may also appeal to a younger audience, because both data and music are "in".
[S]tatisticians are the new sexy vampires, only even more pasty. (Emma Gertlowitz)
Recognizing this, the White House released a video to advertise the State of the Union Address. It uses pie charts and dubstep to appeal to a younger audience.
Data animations are also becoming popular in experimental physics, for both the ATLAS and CMS experiments.
Dynamic, multisensory data experiences will help us make sense of big data. In the long term, we really need to gastronomify data in order to experience them with all of the senses, but that isn't feasible right now because of the high cost of food printers (three-dimensional printers that print food).
Until we develop cheaper taste and smell APIs, we are stuck with the senses our smartphones, laptops, &c. can already address: vision, hearing, and touch. We need to make data music videos in order to make the most of these tools.
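To make the sonification idea concrete, here is a minimal sketch of one common approach: map each value in a data series onto a pitch and render one short tone per data point to a WAV file. This uses only the Python standard library; the data values, the 220&ndash;880 Hz range, and the `sonify` function name are all illustrative assumptions, not anything from the talk or the packages linked below.

```python
import math
import struct
import wave

RATE = 44100          # samples per second
TONE_SECONDS = 0.25   # duration of each note

def sonify(values, path="sonified.wav"):
    """Map each value linearly onto a 220-880 Hz range and render tones.

    Illustrative sketch: low data values become low pitches, high values
    become high pitches, one quarter-second sine tone per data point.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0   # avoid dividing by zero for constant data
    frames = bytearray()
    for v in values:
        freq = 220.0 + 660.0 * (v - lo) / span
        for i in range(int(RATE * TONE_SECONDS)):
            sample = int(32000 * math.sin(2 * math.pi * freq * i / RATE))
            frames += struct.pack("<h", sample)   # 16-bit little-endian
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(bytes(frames))
    return path

sonify([3, 1, 4, 1, 5, 9, 2, 6])  # writes an eight-note melody
```

Real data-music projects layer rhythm, timbre, and mixing on top of a mapping like this, but the core move is the same: a deterministic function from data values to sound parameters.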
Brian Abelson and I gave a talk on this at the New York Open Statistical Programming Meetup. Here are my slides, my code demos, Brian’s slides and Brian’s data-driven rhythms package. A video of the talk is below.