In the world of Big Data, visualization rules. Condensing a large amount of information into a visually appealing and simple product is often the most effective way to communicate the conclusions that scientists, journalists and businesses draw from ever more unwieldy data sets.
But the ever-growing array of devices that engage our other senses creates an opportunity to think about data in different ways. In an article over at Design Mind, author Eric Boam explores some of the projects being developed by artists and scientists that use smell, hearing and taste to communicate complicated ideas.
This diversity of sensory experience raises the question: why not tap into secondary senses to present data more effectively? Adding sound, smell, taste, or touch to sight would intensify the experience of data and likely add nuance, with more impact than any single mode of representation could achieve.
Just as sight gives us color, shape, size, brightness, and space to work with, our other senses offer their own arrays of variables with which to represent different aspects of data. With sound, there is pitch, tone, volume, frequency, and rhythm. With touch, there is texture, weight, pressure, temperature, and materiality. Our senses of smell and taste are closely linked, but we can still use flavor and scent both independently and together.
The projects include recordings of the sounds volcanoes make before eruption, and a wearable device that pairs the smell of certain foods with a tasteless, odorless food analog, an experiment that probes the connection between the senses.
As we move into the extra-visual era of data representation, it is important to remember that the goal is not simply to find the best alternative or complement to visualization. Rather, the ideal is to experience the data more richly. This means that anyone can take a data set and begin to map the parameters to different sensory modes, exploring the data and uncovering new insights.
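The mapping step described above can be sketched in a few lines of code. The example below is a minimal, hypothetical illustration of sonification: it linearly maps data values onto audible pitches, the same way a chart maps values onto heights. The `map_to_pitch` function and the sample temperature data are assumptions for illustration, not part of any project the article describes.

```python
def map_to_pitch(values, f_min=220.0, f_max=880.0):
    """Linearly map each data value to a frequency in [f_min, f_max] Hz,
    so the smallest value sounds the lowest pitch and the largest the highest."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# Hypothetical data set: monthly average temperatures (deg C)
temps = [3, 5, 9, 14, 18, 21, 23, 22, 19, 13, 8, 4]
pitches = map_to_pitch(temps)
```

The resulting frequencies could then be played in sequence, turning a year of temperatures into a rising-and-falling melody; other parameters of the same data set could be mapped to volume or rhythm in the same way.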
You can read the full article here.