Data is dumb, it’s what you do with it that’s smart (Or: Data is dead! Long live data!)

In 2006, marketing commentator Michael Palmer wrote a post called ‘Data is the new oil’ on his blog. It read: “Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc., to create a valuable entity that drives profitable activity; so must data be broken down, analysed for it to have value.”

And thus one of the great clichés of post-bubble Silicon Valley was born. The more it was repeated, the more it was simplified beyond recognition. “Data is the new oil” was taken to mean that data (the ‘bigger’ the better) was inherently valuable, no matter what was (or wasn’t) done with it.

But earlier this month, Gartner threw a little water on big data’s fire at its annual Gartner Symposium, a gathering of thousands of CIOs and IT executives run by Gartner Inc, an IT research and advisory company known for its grand pronouncements on the latest buzzworthy trends in big tech.

“Data is inherently dumb,” proclaimed Peter Sondergaard, head of research at Gartner. “It doesn’t actually do anything unless you know how to use it; how to act with it. Algorithms are where the real value lies. Algorithms define action. Dynamic algorithms are the core of new customer interaction.”

Sondergaard said that Amazon’s purchase recommendation algorithm, Netflix’s viewer algorithm (which not only keeps people watching by recommending TV shows, but gives the company a template for the creation of its original content), and Google’s algorithm which will help drive its driverless cars were examples of big data fading into the “algorithm economy”.

“The algorithmic economy will power the next great leap in machine-to-machine evolution in the Internet of Things,” he said. “Products and services will be defined by the sophistication of their algorithms and services. Organisations will be valued, not just on their big data, but the algorithms that turn that data into actions, and ultimately impact customers.”

But is this really new or just one buzzword replacing another?

Warren Ross, director of data analytics at the Office of the Auditor-General, who presented a discussion on ‘After Big Data - What’s Next?’ at this year’s Big Data and Analytics conference in Auckland this August, says that although there have been major advancements in how we analyse data, the analysis has always been where the value of data lies.

"Traditional sciences made use of empirical data to make observations about the world,” he says. “What we're doing now is not anything drastically new, other than we've got a lot more data than we're used to. Which means that the models we employ, and the ways we go about formulating those hypothesise, some of those processes have been influenced by having such large sets of data, and having data more freely available than has previously been possible."

He says people have always been trying to figure out what to do with a set of information, whether it’s first-hand empirical evidence or a whole warehouse's worth of data from the social media platform of your choice. The main difference, he says, is "big data sets are just too big and complex for people to look at and see anything useful. You have to use machines to do it."

But while the analytics work may be done by machines, you still need humans to create value out of whatever comes out of those machines.

“Having the oil in the ground is useless if you can't get it out and turn it into products,” he says, mirroring Palmer’s pronouncement from nearly a decade ago. “And who's going to do that? People are still doing that."

"The Obama campaign is always used an as example of the use of big data and analytics to tailor make custom messages for an audience, to try and sway swing voters one way or another,” he says. “But that's an example where conceivably, both campaigns had the same access to the same open data sets. They weren't generating any advantage in terms of the types of data or content of the data, but [Obama's campaign] certainly seemed to have an advantage in being able to interpret and understand that data and make some targeting decisions about it. But that's not anything particularly novel."

On his blog The Ad Contrarian, notorious ad man Bob Hoffman agrees, writing that “data is a frame, not a picture”:

"Just about the same data is available to just about everyone who wants it. Yahoo and Twitter and dozens of other online media companies have reams of data about you and know everything there is to know about you. And they’re still stuck in the mud. It’s not the data that makes the difference, it’s what you do with it.”