Big data has hit the headlines in a big way over the past couple of years, and it has arguably become the biggest buzzword in supplier marketing-speak in that time.
But big data has come to particular prominence recently as a result of the activities of web giants such as Facebook, Amazon and Apple.
As a result, big data has come to mean the deployment of near-real-time or real-time analytics on large volumes of data. All of which has implications for storage, which must be able to handle large capacities and provide rapid throughput.
This ComputerWeekly.com guide has articles that define big data, explain the key elements of big data storage environments, the storage implications of a shift to big data platforms such as Hadoop, the use of hyperscale compute and storage platforms for big data, and why big data isn’t just about storage.
Table of contents:
Defining big data storage
Big data storage: Defining big data and the type of storage it needs
Big datasets are not new, and neither is running analytics on large volumes of data. So is there anything novel or different about big data as it has emerged in the past couple of years? In this podcast, ComputerWeekly.com storage editor Antony Adshead defines what we mean by big data and the key storage infrastructure requirements it brings.
Big data storage news and analysis
Big data impact on storage infrastructure
What storage do you need for big data: SAN, scale-out NAS, object storage or dedicated big data appliances? And what are the implications of each for your IT environment? In this video, Ovum senior analyst Tim Stammers discusses what to consider when assessing storage infrastructure for big data.
CW500: Thinking outside the big storage box
Hear how members of Computer Weekly’s 500 Club of CIO-level IT leaders have got to grips with big data. Robert White of investment bank Morgan Stanley and Sean Sadler of King’s College London offer practical tips on managing, processing and gaining value from big data.
Big data projects require big changes in hardware and software
Big data often means a shift to analytics platforms such as Hadoop and NoSQL databases such as Cassandra, a world very different from existing enterprise storage systems. Find out the implications of the shift to big data for your compute and storage infrastructure in this article from SearchDataCenter.com.
Understanding stripped-down hyperscale storage for big data use cases
The best-known contemporary pioneers of big data are the likes of Facebook and Amazon, which analyse masses of customer data on the fly. But they don't do it using conventional enterprise servers and storage. Instead, they use what has become known as hyperscale compute and storage infrastructure, which comprises stripped-down processing and storage capability and a completely different way of providing IT. Find out more in this podcast interview with Wikibon analyst Dave Vellante.
Ordnance Survey gets to grips with geospatial big data
How the UK’s mapping agency, Ordnance Survey, built an infrastructure to get the most out of the vast amount of geospatial data it holds. Mark Hobbell, head of public sector, energy and infrastructure, talks about the challenges faced, the environment the agency built and how it gains business value by linking vast amounts of previously siloed data using its big data infrastructure.
King.com gaming site unlocks big data with Hadoop
Sweden-based games provider King.com delivers titles such as Bubble Witch Saga and Candy Crush Saga to tens of millions of Facebook users. In this article, read how its big data analytics needs outgrew its existing MySQL-based systems, how it moved to a Hadoop infrastructure and which tools it uses for real-time analytics.
Government risks missing benefits of big data with focus on storage
Government and the public sector need to focus on what they can gain from big data analytics rather than get bogged down in the technology. That’s the conclusion of this report from analyst firm IDC.