Your company sells meat and cold cuts. It’s done pretty well in its 125-year history, but lately its share of the market is slipping. You need to think outside the box. Way outside the box.
What do you do? Find a way to wake people up with the smell of bacon so they’ll spend time thinking about your company’s products. That’s what Oscar Mayer did. The food giant created a small gadget that plugs into your iPhone and syncs with an app. The gadget makes sure that when your smartphone’s alarm goes off, the sound and smell of frying bacon fill the room. And your iPhone displays an image of steaming, crispy breakfast meat.
Here’s an ad Oscar Mayer made for the device:
http://www.youtube.com/watch?v=PiWdF3u9C0w
The bacon alarm shows one way our relationship with data is changing. It’s no surprise that we “spot” trends in data and spend so much of our time “visualizing” it; for now, we rely on our eyes to tell us what’s important in a spreadsheet. Depending on just one of our senses is limiting; it increases the chances that we’ll miss something.
Sound, smell, touch and taste could all be deployed to understand data better. Here are some of the ways people are using each:
Sound: Hatnote lets you hear a sound each time an article on Wikipedia is created or changed. The Singing Bank goes one step further, changing a sound’s pitch according to how financial assets have changed (there’s a rough sketch of that pitch-mapping idea after this list).
Smell: There’s a watch that emits “a fragrance of coffee in the morning, the smell of money in the afternoon, a relaxing whisky scent in the evening, and a soothing chamomile fragrance at night.” It’s not hard to imagine it changing those fragrances in response to changes in a wearer’s heart rate or the number of steps she has walked in a day.
Touch: 12.5 million tweets about the 2012 Summer Olympics were analyzed and turned into sculptures that enable people to feel the frequency of those tweets.

Taste: I thought this one would be hard to find, but I and J ideations proved me wrong. Its two Ph.D. scientists teamed up with a computer scientist to analyze tweets about food. Then, in real time, they coded those tweets and gave them a flavor profile, which was transmitted to a drinks machine. People going to BevLab can literally taste the data.
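To make the sonification idea concrete, here is a minimal sketch of how data can be turned into pitch: it maps the daily changes of an imaginary asset to musical notes, rising when the asset gains and falling when it loses. The price figures, the one-semitone-per-percentage-point mapping, and the output file name are all illustrative assumptions; this is not how The Singing Bank actually works.

```python
# A minimal sketch of data sonification: map changes in a value to pitch.
# The asset prices below are made up, and the mapping (one semitone of
# pitch per percentage point of change) is an assumption for illustration.
import math
import struct
import wave

SAMPLE_RATE = 44100  # audio samples per second
BASE_FREQ = 440.0    # A4; the pitch when nothing has changed

def change_to_freq(pct_change):
    """Shift the base pitch by one semitone per percentage point of change."""
    return BASE_FREQ * (2 ** (pct_change / 12.0))

def tone(freq, seconds=0.4):
    """Generate 16-bit sine-wave samples at the given frequency."""
    n = int(SAMPLE_RATE * seconds)
    return [int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
            for i in range(n)]

# Made-up daily closing prices for an imaginary asset.
closes = [100.0, 101.5, 99.8, 103.2, 102.0]
changes = [100 * (b - a) / a for a, b in zip(closes, closes[1:])]

# One short tone per day: higher when the asset rose, lower when it fell.
samples = []
for pct in changes:
    samples.extend(tone(change_to_freq(pct)))

with wave.open("singing_asset.wav", "wb") as wav:
    wav.setnchannels(1)           # mono
    wav.setsampwidth(2)           # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack("<" + "h" * len(samples), *samples))

print("Wrote singing_asset.wav; daily changes:", [round(c, 2) for c in changes])
```

Run it and you get a short WAV file with one tone per day, so you can literally hear whether the asset drifted up or down without looking at a single chart.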