In-memory database technology puts Mitie and Wooga in the fast lane

It takes years to train top accountants. But how much time do they spend waiting for data to load? This is a question Mitie, a £2bn business services and outsourcing firm, has been asking itself.

By looking at the time it was taking to analyse company finances using spreadsheets and other reporting tools, the firm reasoned it would be worth investing in a better approach, says Edward Callaghan, Mitie's finance director for London and South-East England.

As a result, the company is at the proof-of-concept stage of developing a reporting system based on IBM Cognos running on a DB2 BLU in-memory database, taking data from Oracle Financials.

These systems hold the entire database in main memory and access it directly, without disk input/output operations, improving application performance.
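
To see the principle at a toy scale, the sketch below uses Python's built-in SQLite driver, which can keep a database either in a file or entirely in RAM. The table and figures are invented for illustration, and the gap will be far larger in enterprise systems, where disk I/O is not cushioned by the operating system's page cache.

    import os
    import sqlite3
    import time

    def time_queries(conn, n=50_000):
        # Load a toy invoice table, then time repeated aggregate queries.
        conn.execute("CREATE TABLE invoices (id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO invoices VALUES (?, ?)",
                         ((i, i * 1.5) for i in range(n)))
        conn.commit()
        start = time.perf_counter()
        for _ in range(200):
            conn.execute("SELECT SUM(amount), AVG(amount) FROM invoices").fetchone()
        return time.perf_counter() - start

    if os.path.exists("invoices.db"):
        os.remove("invoices.db")  # start from a clean file each run

    disk_time = time_queries(sqlite3.connect("invoices.db"))  # file-backed
    memory_time = time_queries(sqlite3.connect(":memory:"))   # held in RAM

    print(f"on disk: {disk_time:.3f}s  in memory: {memory_time:.3f}s")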

Although Callaghan would not reveal the size of the proposed investment, it is likely to be cheap compared with the accountants’ time currently spent extracting reports from Mitie’s Oracle Financials system.

“You can look at the number of extracted reports out of Oracle and the duration each takes to download and measure the total time from raw data to final output in our current approach. We’re looking to release that time into doing more valuable work,” he says.

Phase two of the project proposes to link Oracle Project Accounting to the in-memory database. Contract managers currently produce analysis for meetings, but cannot answer follow-up questions until some time afterwards.

In the south-east, Mitie manages about 140,000 jobs per year, and each manager can look after 100 contracts. Callaghan says that, using in-memory analytics hosted on IBM Power8 servers, project managers could answer queries during meetings, making them more productive and reducing the time it takes to make important decisions.

Mitie also plans to exploit in-memory technology in asset management and capture information from mobile devices on site, increasing the data available for analysis to make better management decisions on the fly. But the first project focuses on improving financial transparency and is set to start in the next quarter.

Mitie provides an example of moving to in-memory database technology to improve data exploration. Such tools come from suppliers including Qlik, Tableau, Teradata, MicroStrategy and Exasol.

Case study: Developer ups its game with Exasol

In mobile and social gaming, players are fickle creatures. Too easy, and they get bored and move to another game. Too hard, and they give up and do the same. For developers, keeping players engaged is a constant challenge requiring continual improvements to each game.

For Berlin startup Wooga, this means structuring the business around each game so designs can respond rapidly to the behaviour of the 35 million users who log in to its games each month. The firm collects about 200GB of data per day from its players, in the form of game events encoded in HTTP requests.
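
Wooga has not published its event schema, so the following sketch is purely illustrative of what such an event might look like on the wire: a small JSON payload sent over HTTP, with the field names and endpoint URL invented for the example.

    import json
    import urllib.request

    # Hypothetical game event; Wooga's real schema is not public.
    event = {
        "player_id": "a1b2c3d4",
        "game": "example-game",
        "type": "level_complete",
        "level": 12,
        "session_seconds": 340,
        "timestamp": "2014-07-01T09:15:23Z",
    }

    request = urllib.request.Request(
        "https://tracking.example.com/events",  # placeholder endpoint
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # urllib.request.urlopen(request)  # dispatch omitted in this sketch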

Founded in 2009, the business built its systems around open source software and used MySQL for analytics. But it soon reached the limitations of these technologies.

Markus Steinkamp, Wooga's head of business intelligence (BI), says: “You came in in the morning, put some queries in SQL and after lunch got the results. That is not the way to push analytics in this company.”

Simple calculations, such as finding the average game time, could be grindingly slow. Calculating the mean is easy, but not useful, because it can be skewed by people who log out after a few seconds. The median is a more robust average, but is painful to calculate without in-memory technology, Steinkamp says.

“You need the whole results set to sort the data and find the median. You have to put the whole result somewhere. It is better to do it in memory; otherwise you have to swap the results on to disk and re-read it - it is too painful.”
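
A minimal illustration of the difference, using invented session lengths and Python's standard statistics module rather than Wooga's own tooling:

    from statistics import mean, median

    # Invented session lengths in seconds: most players stay for several
    # minutes, but a few quit within seconds and drag the mean down.
    sessions = [420, 380, 510, 450, 395, 5, 3, 8, 2, 470]

    print(f"mean:   {mean(sessions):.1f}s")    # 264.3 - skewed by quick quits
    print(f"median: {median(sessions):.1f}s")  # 387.5 - the typical session

    # Note: median() must sort the whole result set, which is why it is
    # cheap in memory but painful when results spill to disk.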

Wooga deployed in-memory database and analytics tools from Exasol, hosted in the cloud in the vendor’s datacentre.

The game developer eschewed the opportunity to buy the technology from larger suppliers, partly for technical and partly for cultural reasons.

“Buying a solution is quite rare at Wooga. But Exasol is quite a small company - we can talk to them and we know them. A solution from IBM, SAP or Oracle is completely different - you have to talk to the sales organisation and the prices are far higher. We talked to SAP and we were able to use in-memory tool Hana for a year at no cost but the whole culture was too different,” says Steinkamp.

While a central BI unit provides technology support, in-memory technology allows analytics to be devolved to where decisions about game modification are made, Steinkamp says.

“The games teams have every function - engineers, product managers, game design, data analysts and art people. They have weekly meetings for new features and the analyst is quite important in that. They have to say, ‘This feature has the potential to increase retention by X,’ but the product lead might ask questions and they need the answer the same day, not three days later. Data analysts are hands on and independent. If you have a central team then you have dependencies and we don’t like that.”

As well as benefiting traditional businesses, in-memory data exploration is helping startups such as online gaming firm Wooga bring data analysts closer to business teams (see case study above), improving the speed and efficiency of decision making.

The early history of in-memory

This is a far cry from the early applications of in-memory technology. Martha Bennett, principal analyst with Forrester Research, recalls that these were in financial services, where algorithms could be applied to live trading data in real time.

In these markets, which have used such technology since the mid-2000s, even small improvements in the performance of financial trades could justify the costs of holding large amounts of data in solid state storage.

These technologies, which allow pre-programmed algorithms to make decisions on live data, are now being deployed more widely in other applications where speed is important. 

In e-commerce, they can help with dynamic pricing, while in online advertising they can assist with ad placement. This application-specific approach is also being extended to making decisions on live data in ERP and supply chain systems, as suppliers such as SAP and Oracle adopt the technology.
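
As a deliberately simple sketch of the pattern, the rule below adjusts a price from live demand figures; the thresholds and field names are invented for illustration, and real pricing systems use far richer models.

    BASE_PRICE = 20.00

    def price_for(views_last_hour: int, stock_left: int) -> float:
        # Toy dynamic-pricing rule: raise the price when demand surges
        # or stock runs low. All thresholds are invented for illustration.
        price = BASE_PRICE
        if views_last_hour > 500:
            price *= 1.10   # demand surge
        if stock_left < 10:
            price *= 1.15   # scarcity premium
        return round(price, 2)

    print(price_for(views_last_hour=650, stock_left=8))  # 25.3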

Bennett says the performance of in-memory technology is now providing benefits for a wide range of businesses and use cases: “For some businesses it is very important. It sounds like a cliché, but business is moving faster and competitive advantage does lie in your ability to do more with data more quickly.”

But users need to get to grips with data basics before they can begin to exploit these new technologies, she says.

“A lot of organisations really don’t have a good handle on their data. They have issues with who owns the data, they don’t know what rules to apply or how long it is kept. The companies that are moving ahead [with in-memory] are the ones where the business not only understands the data but is also prepared to take ownership of it. That is a cultural thing.”

Another pitfall in the application of in-memory technology comes when applying algorithmic decision-making on live data, Bennett says.

“You need to understand what they are doing and you need to understand your business. You need to watch what is going on because an algorithm deteriorates. Do you know what it is going to do? How are you going to maintain it? A lot of companies do not understand this,” she says.

With the application of algorithms to in-memory data, businesses are in danger of “getting the train wreck faster” but there is also the opportunity to experiment in shorter cycles and discover which algorithms work best, if done in a controlled manner, Bennett says.  

In-memory analytics vs the traditional data warehouse

Gartner forecasts widespread adoption of in-memory database technology in the coming years, not only speeding up existing applications but also supporting new business opportunities.

But Roxane Edjlali, Gartner research director for information management, says in-memory technology will not replace the data warehouse by performing all analytics as part of the application stack.

“You can do transactional analytics in-memory but this does not remove the need for the data warehouse in the next few years. Firstly, managing all this data in-memory can be expensive, and may not make sense. Not all business applications will be candidates to move in-memory. You are likely to see a mixed environment,” she says.

“Secondly, you bring semantic consistency and cleanse data by moving it to a data warehouse. This process needs to happen somewhere. The result of having multiple apps running analytics is you do not have consistent data between them.”

In some cases this might make sense, but organisations will still need to achieve consistency of their data at some point, she says.

The benefit of in-memory databases is speed. Some applications need it and some do not. But the technology is also allowing businesses to change the way they work and put analytics closer to the decision-making coalface.



This was first published in July 2014

 
