McKinsey and Gartner: Big data means high value, not just volume

The McKinsey Global Institute, Gartner and Teradata agree that big data’s value is more about variety than sheer volume.

The value of “big data” lies less in its volume than in its variety. This is the gist of a recent report from the McKinsey Global Institute, Big data: The next frontier for innovation, competition, and productivity*.

The strategy firm contends that analysing large data sets -- so-called big data -- will become a “key basis of competition, underpinning new waves of productivity growth, innovation and consumer surplus as long as the right policies and enablers are in place.”

Big data research to date has over-concentrated on volume, according to McKinsey: “Our study makes the case that the business and economic possibilities of big data and its wider implications are important issues.”

Business technology analyst firm Gartner makes a similar case, based on recent research. Yvonne Genovese, research vice president and distinguished analyst, said: “Big data is just the start. In the future, the full range of extreme information management issues -- of which volume is just one aspect -- will pose even greater challenges, but also enable the emergence of even more significant business opportunities.”

The McKinsey Global Institute study describes data as “an important factor of production, alongside labour and capital. We estimate that, by 2009, nearly all sectors in the US economy had at least an average of 200 terabytes of stored data per company with more than 1,000 employees” -- twice the size of US retailer Wal-Mart’s data warehouse in 1999.

The institute says value can be created from big data by “making information transparent and usable at much higher frequency; using sophisticated analytics to improve substantially decision making; and innovating the next generation of new business models, products and services.”

Open data from governments
One area the research focused on was the European public sector. Governments in 2011 face the challenge of maintaining a high level of public services while reducing large budget deficits. This is against a deeper background, says McKinsey, of declining public sector productivity that runs counter to growth in the private sector. MGI has found that so-called big data levers, such as increasing transparency and applying advanced analytics, can boost productivity: “Our research shows Europe’s public sector could potentially reduce the costs of administrative activities by 15% to 20%, creating the equivalent of €150 billion to €300 billion ($223 billion to $446 billion) -- or even higher -- in new value.”

McKinsey sees time-saving efficiencies in the pre-filling of forms for citizens and businesses from data already stored in government databases, and vaunts the adoption of “open data” principles by which raw government databases are made available to the public. Initiatives in the UK and the Aporta Web portal in Spain are examples.

Gartner’s most recent big data analysis is consonant with the McKinsey viewpoint. “By sweeping away current limitations derived from data constraints and exploiting a growing universe of existing enterprise data and publicly available data from external sources, a whole new era of digitally accelerated business models will emerge that has the potential for substantive new revenue and competitive advantage,” Genovese said.

On the vendor side, Stephen Brobst, Teradata’s chief technology officer, told us: “It’s not actually the size of the data that matters as much as the diversity. One important factor is the transition from transactions to interactions. This creates big data.”

* James Manyika and Michael Chui led the MGI research project, while academic advisers were Martin Baily, a McKinsey senior adviser and a senior fellow at the Brookings Institution, and Hal Varian, emeritus professor in the School of Information at the University of California at Berkeley.