In-memory analytics: BI innovation for 2011

In-memory analytics is being touted as one of the top developments in BI for 2011. Let’s take a look at what all the buzz around in-memory analytics is about.

With the evolution of reporting and analytics, business intelligence (BI) has seen the advent of service-based architectures. In-memory analytics can be viewed as the next step on this evolutionary scale. Let’s explore what makes in-memory analytics the next big thing in BI innovation.

Reduced dependence on information technology (IT)

The primary task any in-memory analytical tool aims to accomplish is the elimination of hard disk-based BI reporting. By doing so, it increases the flexibility and scope of analysis through easier access to the data and faster response times.

As the name suggests, in-memory analytics involves first loading the data into memory. This may be server-based, where an administrator has to enable the loading, or desktop-based, where users can load the data themselves. This differs from a typical query run against a data warehouse, where the result is generated by reading various indices on the server’s hard disk. By removing those steps, in-memory analytics reduces the business user’s dependence on IT personnel.
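The idea can be sketched in a few lines of Python using the standard library's sqlite3 module, which supports a purely in-memory database. The table and sample figures below are illustrative, not real data; the point is only that once the data is loaded into RAM, ad-hoc queries are answered without touching disk indices.

```python
import sqlite3

# Create a database that lives entirely in RAM -- no disk files, no
# disk-based indices. (":memory:" is sqlite3's in-memory mode.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Load the data into memory first (hypothetical sample rows).
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 95.5), ("North", 80.0)],
)

# An ad-hoc query answered entirely from memory -- the kind of fast,
# flexible access in-memory analytics promises the business user.
total_by_region = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(total_by_region)  # {'North': 200.0, 'South': 95.5}
```

In a real deployment the load step would pull from a data mart or warehouse rather than a hard-coded list, but the query pattern is the same: read once into memory, then explore freely.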

Another factor behind the buzz around in-memory analytics is the falling price of memory hardware. Systems with a 64-bit operating system and 1 terabyte of addressable memory can support multiple data marts or even an entire data warehouse. But this is the proposition put forward by the vendors of these tools. Vinod Shankar, head of BI at activecubes, says, “In actuality only 200 gigabytes of memory can be optimized to querying. People need to undertake a proof of concept to examine all the facts.” This raises the question of whether this BI innovation is actually as good as the package label suggests.

Variable uses across organizations

In-memory analytics is usually a subset of a broader analytical tool, and vendors are now counting on in-memory features to sell their products. No doubt the technology is worth the talk, as it is scalable and enables fast data querying.

The use of in-memory analytics varies from small to large organizations. Smaller organizations may use it for all their querying purposes, as they may not have vast amounts of data. Larger organizations, on the other hand, may use in-memory analytics to empower their power users. Self-service BI, which is also offered as a feature in many BI solutions, is closely interconnected with this. Anand Sam, head - Innovation at activecubes, explains, “While the traditional querying solutions will grow stronger, the distinct flavors of in-memory analytics may be sought to cater to the different needs of the organizations.”

Buying guide for in-memory analytics

Before purchasing an in-memory analytical tool, the following criteria ought to be considered:

  • The business requirement.
  • The capability of the tool to match the business requirement.
  • Evaluation of the various products available in the market, based on the features they offer.
  • The cost factor. Each feature of a product should be scored on its cost-effectiveness so that results can be compared clearly.
  • The stability of the vendor and the service it offers.
  • Integration of the in-memory analytical tool with the rest of the BI environment (ideally, the in-memory analytics offering from the vendor that already makes up your BI environment).
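The scoring step in the checklist above can be made concrete with a simple weighted-score comparison. The criteria weights, tool names, and feature scores below are entirely hypothetical; the sketch only shows one way to turn the evaluation into comparable numbers.

```python
# Hypothetical criteria weights (must sum to 1.0) and feature scores
# on a 1-5 scale -- illustrative values, not real product data.
weights = {
    "fit_to_requirement": 0.40,
    "cost_effectiveness": 0.30,
    "vendor_stability": 0.15,
    "integration": 0.15,
}

tools = {
    "Tool A": {"fit_to_requirement": 4, "cost_effectiveness": 3,
               "vendor_stability": 5, "integration": 4},
    "Tool B": {"fit_to_requirement": 5, "cost_effectiveness": 2,
               "vendor_stability": 3, "integration": 5},
}

def weighted_score(scores, weights):
    # Each criterion contributes score * weight to the total.
    return sum(scores[c] * w for c, w in weights.items())

# Rank the candidate tools from best to worst overall score.
ranking = sorted(tools, key=lambda t: weighted_score(tools[t], weights),
                 reverse=True)
print(ranking)
```

Adjusting the weights to reflect your own priorities (for example, weighting integration higher if you already have a single-vendor BI environment) changes the ranking accordingly.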

A delicate balance exists between hype and reality, so a proof of concept is crucial before adopting any technology. The flaws of in-memory analytics will surely be fixed as adoption increases and as big players acquire smaller companies to strengthen their BI solutions.
