The technology industry spin machine has convened, and the committee agrees: scaremongering around whether Hadoop can translate to mainstream adoption is worth a punt on a story or two.
It is a bit of a recurring trend.
So how much reality is there in stories suggesting that many organisations have made significant Hadoop investments, but most projects are still experimental?
We have asked before: is Hadoop hard?
You don’t have to look far to find a piece of industry comment with the keywords:
- big data
- Internet of Things
Or, if you prefer the public-relations-enriched version (warning: not safe for children):
- analytics-enabled business processes to create transformational advantage
The problem here, according to Actian Corporation CTO Mike Hoskins, is that legacy stacks simply cannot scale adequately.
“While Hadoop is a tremendous step forward as pioneers drive business value from their big data assets, optimal adoption is limited to the privileged few with deep pockets and rarefied skills,” he said.
Has Hoskins got a neatly branded pre-loaded agenda up his sleeve to answer our Hadoop woes? Don’t even question it.
“Simplifying Hadoop and removing the reliance on dated MapReduce-driven approaches allows Actian to deliver ‘Big Data for the Rest of Us™’ today.”
See what he did there? A rebranding and humanising of big data; doesn’t that feel good?
The Hadoop exoskeleton
Sarcasm aside, Hoskins explains Actian’s commitment to industrialising Hadoop. What is needed, he says, is a robust layer around Hadoop (like an exoskeleton) that natively accelerates every stage of a big data analytics job, from design time through execution to specialised analytic workloads.
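For context on why those “dated MapReduce-driven approaches” attract such criticism, it helps to see how much plumbing even a trivial job involves. Below is a minimal, hedged sketch in plain Python of the classic map/shuffle/reduce pattern applied to a word count; it is an illustration of the programming model only, not Actian’s product or Hadoop’s actual Java API.

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the line.
    for word in line.lower().split():
        yield word, 1

def shuffle(pairs):
    # Shuffle phase: group all emitted values by key, as Hadoop
    # does between the map and reduce stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reducer(key, values):
    # Reduce phase: sum the counts collected for each word.
    return key, sum(values)

def word_count(lines):
    pairs = (pair for line in lines for pair in mapper(line))
    return dict(reducer(key, values) for key, values in shuffle(pairs))

counts = word_count(["big data big deal", "big plumbing"])
print(counts["big"])  # 3
```

Even this toy version needs three hand-written stages for one aggregate; the “exoskeleton” pitch is essentially that a tool should generate and optimise all of this on the user’s behalf.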
As tools like Actian’s work to provide a visual drag-and-drop framework to load, transform and analyse data in Hadoop with no coding, will we see a happier Hadoop in the near future?