Lost in the global cesspit of Big Data

IT events calendars are awash with sessions on the wonders of “Big Data”, but an article by Craig Stedman headed “IT teams take steps to simplify Big Data analytics process” caused me to wonder how far the world has moved on since I spent a year as a financial modelling consultant in the “Management Science” team of ICL in the mid-1970s. Most of those who then wanted big, complex computer models to crunch their growing files of data were unaware that what they were looking for could be done quickly, simply and more usefully with pencil, paper and a slide rule.

I recently attended a meeting on the wonders of using “big data” to help serve “smart cities”. The “impressive” example from New York was something that could have been done using 1930s punch-card technology. The breakthrough had been to get departments to share sensitive information on problems they were shy of discussing with others.

That raises the core question of why departments are so resistant to data sharing – commonly using “data protection” as a mantra. Meanwhile, for example, patients are dying in their hundreds, possibly thousands, across our fragmented “National” Health Services because of the errors, delays and conflicts that occur when data, like care, is parcelled out among specialists.

So why is there so much resistance to sharing when it is in the interest of the customer, patient or resident (but not, it seems, when the data is being quietly sold to an advertiser on the other side of the world)?

In rough order, the top three reasons appear to be:

  1. Fear that others will learn just how poor our own data is (in quality, accuracy and so on)
  2. The inability to agree common terminology
  3. Fear that others will abuse “our” data

All three are commonly valid and need to be addressed sensitively, intelligently and constructively, because “force” tends to produce unfortunate results – e.g. common terminologies that are vague, ambiguous or misleading when clarity and precision matter most.
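By way of illustration only (the field names, sample values and “agreed terms” below are invented for this sketch, not drawn from any organisation mentioned in this piece), surfacing the first two concerns needs nothing more exotic than a short profiling script that each department can run on its own extract before sharing is even discussed: report how complete each field is and which values fall outside the vocabulary the parties think they have agreed.

# A minimal sketch, assuming a hypothetical extract of one department's records.
from collections import Counter

records = [
    {"patient_id": "A001", "ward": "Cardiology", "discharge_code": "D1"},
    {"patient_id": "A002", "ward": "cardio", "discharge_code": ""},
    {"patient_id": "A003", "ward": None, "discharge_code": "D9"},
]

# An illustrative "agreed vocabulary" for one field.
AGREED_WARD_TERMS = {"Cardiology", "Oncology", "Orthopaedics"}

def profile(records, field, vocabulary=None):
    """Report completeness and, optionally, values outside an agreed vocabulary."""
    values = [r.get(field) for r in records]
    missing = sum(1 for v in values if v in (None, ""))
    report = {
        "field": field,
        "records": len(values),
        "missing": missing,
        "completeness": 1 - missing / len(values),
    }
    if vocabulary is not None:
        # Count non-missing values that do not match the agreed terminology.
        off_vocabulary = Counter(
            v for v in values if v not in (None, "") and v not in vocabulary
        )
        report["outside_agreed_terms"] = dict(off_vocabulary)
    return report

if __name__ == "__main__":
    print(profile(records, "ward", AGREED_WARD_TERMS))
    print(profile(records, "discharge_code"))

Run against the toy records above, the sketch reports a ward field that is only two-thirds complete, with “cardio” sitting outside the agreed terms, and a discharge code field with one value missing: exactly the kind of evidence that lets the quality and terminology questions be discussed constructively rather than hidden.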

That set me to wonder just how much (or how little) the world has moved on since Philip Dunne MP (then Parliamentary Chairman of the EURIM Information Governance Group) hosted a Directors' Round Table on the problems that arise across public and private sectors from the failure to put accuracy and availability ahead of protection. Almost all the material tabled for that event (and archived on the website) has, unfortunately, stood the test of time. One of the unfortunate side effects of the end of the Audit Commission was the end of its work on data quality across the public sector. A snapshot of the subsequent EURIM work programme can be found on the website, archived just after I retired. My personal favourite was the study for which the one-page summary was headed “From Toxic Liability to Strategic Asset: unlocking the value of information”.

Today we see the opposite approach: calls from every corner to collect as much as possible, to be mashed up in a Big Data slurry pit using ever more expensive technology. Meanwhile, one of the world's most respected analytics companies puts the skills of its people, and their ability to identify accurate and relevant sources, first.
