SAP has predictably saved up news relating to its HANA product for this week’s TechEd show here in Las Vegas, but what is it? Described variously as the company’s in-memory computing platform with a strong emphasis on ERP, it is perhaps more directly described as a high-performance analytical appliance, with supporting software and matching hardware to go with it.
SAP CTO Vishal Sikka said some time back now that HANA would be central to the company’s future plans; indeed, this week has seen the revered Stanford doctor talk effusively about the product. Notwithstanding the fact that he is unable to talk about HANA without repeatedly mentioning the words “innovation”, “future” and “innovation” (sorry, was that innovation twice?), SAP has clearly polished its baby carefully before this public outing.
This week sees two new solutions built on HANA, namely SAP Smart Meter Analytics software built on HANA and SAP COPA Accelerator software. Billing these products as applications designed to give users real-time insight into “big data”, SAP is vying for a cemented position in the data analysis market with this offering.
As SAP puts it, this is, “Analysis, planning, forecasting and simulations in a fluid, natural way versus traditional approaches that are rigid, sequential and time-consuming.”
But HANA may end up being much more important for SAP than it is even showing us now, so could this product really be the company’s ace in the hole?
According to Forrester Research, SAP has emerged as the leading advocate of in-memory computing technology as a key pillar of its innovation strategy: “[SAP] HANA allows SAP to develop innovative new applications that can consume and analyse massive volumes of data in near real time while also providing a cost-effective, elastic computing platform.”
HANA’s future may see it rise to even loftier heights. HANA uses a slender columnar structure as opposed to the more complex ‘star schema’ commonly employed by data warehouses built on traditional relational databases.
Essentially it is an in-memory database with a flat two-dimensional structure built to take advantage of modern processing power and memory capabilities. Amit Sinha, SAP’s vice president of in-memory computing and HANA solution marketing, has suggested that relational databases today employ comparably archaic techniques shaped by hardware and software limitations dating back as far as two decades. HANA, arguably, moves us forward…
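For the curious, a toy sketch (in Python, entirely hypothetical and nothing like HANA’s actual engine) of why a columnar layout suits analytics: aggregating one attribute touches only that column’s contiguous values, while a row store must walk every field of every record.

```python
# Toy sketch: the same table held row-wise and column-wise.
# All data here is invented for illustration.

# Row store: each record kept whole, as a traditional RDBMS would.
row_store = [
    {"customer": "Acme", "region": "EMEA", "revenue": 120},
    {"customer": "Brix", "region": "APJ",  "revenue": 95},
    {"customer": "Cole", "region": "EMEA", "revenue": 210},
]

# Column store: each attribute is one contiguous array of values.
column_store = {
    "customer": ["Acme", "Brix", "Cole"],
    "region":   ["EMEA", "APJ", "EMEA"],
    "revenue":  [120, 95, 210],
}

# An analytical query ("total revenue") needs only one column.
# The row store walks every record; the column store scans a
# single dense array -- the layout in-memory engines exploit.
total_rows = sum(record["revenue"] for record in row_store)
total_cols = sum(column_store["revenue"])

assert total_rows == total_cols == 425
```

The answer is the same either way, of course; the point is how much data each layout has to touch to get there.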
Where this puts HANA for the future is potentially in a position of power. SAP would clearly love to “positively disrupt” the database market and send shivers up Oracle’s spine — the fact that it now has Sybase under its increasingly albatross-like corporate wing doesn’t do it any harm at all here either, does it?
“There is a massive simplification happening all around us. Layers are being dissolved at an unbelievable pace; people, businesses, data and machines are becoming more directly connected. This virtuous cycle of connectedness leads to disintermediation of layers, which drives end-users to become more empowered and demand better user-experience — challenging us to create more connectedness,” said SAP CTO Vishal Sikka.
SAP doesn’t shy away from suggestions of massive change in the data market. The company is happy to suggest that boundaries between the application layer and the database layer are dissolving.
Sorry – I forgot to mention, when Sikka is not saying “innovation and the future”, he can usually be found happily repeating “disintermediation of layers” until fade. I do the gentleman a disservice; he is an entertaining speaker and true intellectual.
So as these layers dissolve, which to some degree they surely must, we will see some of the work (i.e. the calculations) previously performed by the application layer being executed at the database layer. Remember what Forrester said? “SAP has emerged as the leading advocate of in-memory computing technology…”
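What pushing a calculation down to the database layer looks like can be sketched in a few lines, using Python’s built-in sqlite3 purely as a stand-in for any database (HANA itself is not involved, and the figures are invented):

```python
import sqlite3

# Stand-in database, in memory, with some made-up sales data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120), ("APJ", 95), ("EMEA", 210)])

# Application-layer work: ship every row across, then calculate.
rows = conn.execute("SELECT revenue FROM sales").fetchall()
app_total = sum(r[0] for r in rows)

# Database-layer work: the calculation runs where the data lives,
# and only the answer travels back across the boundary.
(db_total,) = conn.execute("SELECT SUM(revenue) FROM sales").fetchone()

assert app_total == db_total == 425
```

Same answer, but in the second case only a single number crosses the layer boundary — the direction of travel SAP is gesturing at, writ very small.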
So how does it work?
In practice, this in-memory computing means data doesn’t have to travel between so many layers inside the IT stack; this means calculations are executed faster; this means real-world transactional workloads happen faster; this allows accurate data-intensive analytics to happen faster; this means corporate information dashboards are updated faster; this means executives can take actions faster; this means companies are glad they used SAP technology.
Well, that’s the theory – and the company appears to be cooking up a strong enough argument.
Sure, SAP will face competition from Oracle, IBM and Microsoft, who are all working to develop their own in-memory database technologies. But if we have to follow the scent of a leader just now, it might have to be SAP for the time being.
An ace in the hole? Maybe. Strongest hand at the table? It’s not for me to say. Worth a gamble? Well, this is Las Vegas after all…