Like it or not, the amount of information an organisation will deal with only ever increases, writes Alan Bowling, chairman of the UK & Ireland SAP User Group. This has led to the concept of "Big Data", whereby organisations will increasingly rely on large amounts of information from a variety of sources to analyse, improve and execute their operations.
There are several reasons given for this. First, simple availability: the growing use of technology in business generates ever more data. Second, regulation: organisations must retain increasing volumes of information to prove compliance. Finally, there is a growing recognition that organisations must use every resource at their disposal. As a result, data that once might have seemed irrelevant is now pored over for any perceived value.
Uses for Big Data
The big question for Big Data is what to do with it. Most organisations will naturally want to carry out in-depth analysis of the data within their ERP systems, digging deep to analyse and predict the most effective way to do business and determine future strategies and tactics. However, as data volumes increase, so organisations hit a stumbling block: how do they process such a huge amount of information in a timely manner?
One option is to study only a proportion of the whole mass, yet this can easily produce inaccurate results, as organisations are then basing their decisions on an incomplete view. With enough computing power, this obstacle is removed: organisations have the performance to analyse entire masses of data at high speed in one pass. In-memory computing tools, such as SAP's HANA in-memory appliance, are designed to deliver this power so that organisations can analyse vast quantities of business data, drawn from a variety of sources, as and when it is received and needed.
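To illustrate why sampling can mislead, here is a minimal sketch with invented figures: a rare high-value segment dominates the true average, but a small sample that happens to miss that segment reports a very different picture.

```python
# Illustrative sketch with hypothetical figures: 10,000 orders, of which
# a rare segment (0.5%) carries very large values that dominate the mean.
transactions = [100.0] * 9950 + [50_000.0] * 50

full_mean = sum(transactions) / len(transactions)   # true average: 349.5

# A 2% sample drawn from the bulk of the data misses the rare segment
# entirely, so the estimate is wildly off.
sample = transactions[:200]
sample_mean = sum(sample) / len(sample)             # 100.0

print(f"full-data mean: {full_mean}, sample mean: {sample_mean}")
```

The gap between the two figures is the "incomplete view" described above; analysing the full dataset removes the risk altogether.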
In-memory: not like a sieve
The concept behind in-memory computing is relatively simple. Traditionally, data is held in storage and, when needed, is read into the computer's memory to be acted upon. This creates a natural bottleneck that reduces speed: even with the fastest solid-state drives (SSDs), there is still a gap while data is accessed, transferred to memory, and then returned so the next batch can be processed. As volumes of data increase, so the time needed simply for access, let alone actual analysis, increases too.
In-memory computing takes advantage of three trends: a better understanding of how data is shaped and stored, the constantly falling price of memory, and the growing affordability of fast solid-state memory. Together these allow it to do away with the traditional storage-then-load model: data is held directly in the computer's memory, so when it needs to be analysed it is already available and can be accessed near-instantaneously.
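The two access patterns can be contrasted with a small sketch (a hypothetical example, not HANA itself): a disk-backed query pays the access-and-transfer cost on every request, while an in-memory query works on data that is already resident.

```python
# Hypothetical sales records used to contrast the two access patterns.
import json, os, tempfile

records = [{"region": r, "sales": s}
           for r, s in [("uk", 120), ("ie", 80), ("uk", 200), ("fr", 50)]]

# Traditional pattern: data lives in storage; every query re-reads it.
path = os.path.join(tempfile.mkdtemp(), "sales.json")
with open(path, "w") as f:
    json.dump(records, f)

def query_from_disk(region):
    with open(path) as f:        # access + transfer into memory each time
        data = json.load(f)
    return sum(r["sales"] for r in data if r["region"] == region)

# In-memory pattern: the dataset is already resident; queries touch RAM only.
in_memory = records

def query_in_memory(region):
    return sum(r["sales"] for r in in_memory if r["region"] == region)

print(query_from_disk("uk"), query_in_memory("uk"))  # both 320
```

Both functions return the same answer; the difference is where the time goes, and at Big Data volumes the repeated storage round-trip is exactly the bottleneck described above.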
The most evident benefit of in-memory processing is its speed. Without the bottleneck of having to access data in storage, organisations can swiftly analyse information and use it to create the best possible strategies.
This speed is vital; rather than analysing information that is days or weeks out of date, organisations can perform complex queries in minutes, meaning their business operations can be investigated and improved based on the situation as it is rather than the situation as it was last week. At the same time, in-memory computing's power means that organisations can investigate entire sets of data rather than representative samples, meaning they can be sure they are acting on all of the facts.
This power and speed provide other benefits. Rather than streamlining analysis by forcing data into a rigid format that answers only certain predefined queries, organisations can store data in a less structured form. The power of in-memory computing compensates for this lack of structure, giving organisations far more flexibility in how they access the information.
For example, if an organisation using in-memory tools suddenly decides to study its HR processes based on new customer feedback data, it does not need to restructure the data on file to accommodate a planned selection of new queries. It simply asks the questions as and when they appear. It is these benefits that mean in-memory is already used by many organisations, for purposes from maximising sales to analysing gene sequences.
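The HR example above can be sketched in miniature (hypothetical records and field names): loosely structured records answer a question nobody planned for, with no restructuring of the stored data.

```python
# Hypothetical feedback records stored loosely, with no schema designed
# around any particular query.
feedback = [
    {"dept": "hr", "score": 4, "channel": "email"},
    {"dept": "hr", "score": 2, "channel": "phone"},
    {"dept": "sales", "score": 5, "channel": "email"},
]

def avg_score(records, **filters):
    """Answer an ad hoc question: average score over any filter combination."""
    hits = [r for r in records
            if all(r.get(k) == v for k, v in filters.items())]
    return sum(r["score"] for r in hits) / len(hits) if hits else None

# New questions are simply asked as they appear -- no re-modelling needed.
print(avg_score(feedback, dept="hr"))                   # 3.0
print(avg_score(feedback, dept="hr", channel="email"))  # 4.0
```

In a real deployment the raw compute of the in-memory engine plays the role of the list comprehension here, scanning the full dataset fast enough that no query-specific structure is required.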
Taking the plunge
To an extent, the decision to adopt in-memory computing is less one of "whether" and more one of "when". If an organisation is large enough and collects enough information, the inevitability of Big Data means that it will have to adopt in-memory computing at some point so it can continue to function.
For certain sectors where huge amounts of data are practically a requirement, such as utilities or finance, in-memory computing is already a hugely disruptive technology. Organisations in these sectors would do well to make the move to in-memory computing early, rather than being left trying to catch up with the competition. For others, the choice is less clear-cut. An organisation with relatively little data may feel the costs of an in-memory implementation far outweigh the benefits.
What is clear is that the move to in-memory computing, while it might be inevitable, will not necessarily be straightforward. Organisations should take advantage of all sources of information at their disposal, from suppliers to user groups, to help them make their decision. Whether the best decision is to implement in-memory now, in the future, in-house, via the cloud, or simply not at all, organisations need to be sure they have made a well-informed choice. This choice also needs to cover the most important factor of all - as powerful as in-memory computing is, like all technology it is worse than useless if it is not used to the correct end.
Alan Bowling is chairman of the UK & Ireland SAP User Group.