Angry Birds firm handles massive growth with Riak NoSQL database


Antony Adshead

The company behind the Angry Birds game franchise, Finland-based Rovio Entertainment, has deployed the Riak open source distributed database from Basho to deal with masses of data associated with nearly two billion downloads from more than 250 million monthly users.

Rovio was founded in 2003 and has grown rapidly to around 700 employees, with offices in Finland, the US, China, Sweden, Japan and the UK.


The growth experienced by the company put enormous strain on its data capabilities. Rovio needed to ensure its high service levels could be maintained in a cost-effective way. To add further complexity, data transactions across multiple platforms, including smartphones and tablets, meant that investment was needed to keep the user experience consistent.

So, to deal with the high influx of data and peak load times, Rovio needed a database that could support viral growth without failing and causing downtime. Similarly, if demand was lower than anticipated, flexibility was needed to ensure infrastructure could be reined back, avoiding unnecessary expenditure.

Choosing the Riak NoSQL open source database

Rovio eventually deployed Basho’s scalable NoSQL open source distributed database, Riak, on Amazon Web Services (AWS). This has enabled it to economically and effectively manage increasing data volumes resulting from its growing number of data operations, said Ari Talja, lead server architect at Rovio.

“We started the design and the implementation of our cloud stack based on the huge figures we have for Angry Birds fans. We knew that we needed to be prepared for massive usage and be ready to scale up quickly when needed,” said Talja.

Riak was chosen, said Talja, because it fits Rovio’s requirements well and supports a diverse set of use cases.

He listed the reasons the company chose the database: “Where Riak excels is, it does not lose data in any situation and therefore can be used to store critical data such as payment transactions; it’s horizontally scalable and supports extremely large and dynamic data sets; it has no single point of failure; it supports multi-datacentre deployments with real-time data replication; it is easy and robust to operate; it can be operated on any public and private cloud; and it has a good developer support programme.”

Flexible approach to managing data

Rovio’s IT team is now able to scale from tens of Riak servers to hundreds, based on customer demand. 

Rovio has benefited from Riak’s low-latency features and replication of data across multiple nodes within the database, providing a high tolerance for hardware failure without losing critical user data. If one node fails, the system maintains its performance profile. Multiple data formats are also supported.
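The replication behaviour described above can be sketched in a few lines of Python. This is a toy illustration of the general technique, not Riak’s actual implementation or client API: each key is hashed to a starting node and written to the next N nodes in a ring, so a read still succeeds if one replica fails.

```python
import hashlib
from typing import Any, Optional


class ReplicatedStore:
    """Toy key-value store that copies each value to several nodes,
    loosely illustrating Riak-style replication and fault tolerance.
    All names and behaviour here are illustrative assumptions."""

    def __init__(self, node_count: int = 5, replicas: int = 3) -> None:
        self.nodes = [dict() for _ in range(node_count)]  # each node is a dict
        self.down: set = set()          # indices of failed nodes
        self.replicas = replicas

    def preference_list(self, key: str) -> list:
        # Hash the key to a starting node, then take the next N nodes on the ring.
        start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(self.nodes)
        return [(start + i) % len(self.nodes) for i in range(self.replicas)]

    def put(self, key: str, value: Any) -> None:
        # Write the value to every reachable replica.
        for idx in self.preference_list(key):
            if idx not in self.down:
                self.nodes[idx][key] = value

    def get(self, key: str) -> Optional[Any]:
        # Any surviving replica can answer the read.
        for idx in self.preference_list(key):
            if idx not in self.down and key in self.nodes[idx]:
                return self.nodes[idx][key]
        return None


store = ReplicatedStore()
store.put("player:42", {"score": 1337})
store.down.add(store.preference_list("player:42")[0])  # first replica fails
print(store.get("player:42"))  # value still readable from another replica
```

With three replicas, the store tolerates the loss of any single node holding a key, which is the property the article credits for Riak’s hardware-failure tolerance.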

Internal development has also become more streamlined since implementing Riak. A new in-house user interface, BigBird Library, has been created on top of Riak to provide Rovio’s developers with a consistent and simple interface.

This makes it easy for developers to take new features into use or tweak existing use patterns when necessary, said Anna Koivu-Choo, lead server developer at Rovio.

“The Riak clusters have different kinds of setups optimised for various use cases, as the needs of a cloud sync service differ quite a bit from those of an ad platform,” said Koivu-Choo. “Some of the clusters are used by a single service while others are shared. But all services use the same in-house abstraction layer, BigBird Library.”
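The pattern Koivu-Choo describes — several differently tuned clusters behind one shared abstraction layer — can be sketched as follows. The article does not detail BigBird Library’s interface, so the `DataLibrary` class, its method names, and the in-memory backend below are all hypothetical stand-ins for the idea of routing each service to its own cluster through a single, uniform API.

```python
from typing import Any, Dict, Protocol


class KeyValueBackend(Protocol):
    """Minimal interface a cluster client would need to expose (assumed)."""
    def store(self, bucket: str, key: str, value: Any) -> None: ...
    def fetch(self, bucket: str, key: str) -> Any: ...


class InMemoryBackend:
    """Stand-in for a real Riak cluster client, for illustration only."""
    def __init__(self) -> None:
        self._data: Dict[tuple, Any] = {}

    def store(self, bucket: str, key: str, value: Any) -> None:
        self._data[(bucket, key)] = value

    def fetch(self, bucket: str, key: str) -> Any:
        return self._data.get((bucket, key))


class DataLibrary:
    """Thin abstraction layer: each service name maps to its own backend
    (some clusters dedicated, some shared), but every service uses the
    same store/fetch calls."""
    def __init__(self, clusters: Dict[str, KeyValueBackend]) -> None:
        self._clusters = clusters

    def store(self, service: str, key: str, value: Any) -> None:
        self._clusters[service].store(service, key, value)

    def fetch(self, service: str, key: str) -> Any:
        return self._clusters[service].fetch(service, key)


shared = InMemoryBackend()                     # one cluster shared by two services
lib = DataLibrary({
    "cloud-sync": InMemoryBackend(),           # dedicated cluster
    "ads": shared,
    "analytics": shared,
})
lib.store("cloud-sync", "user:7", {"level": 12})
print(lib.fetch("cloud-sync", "user:7"))  # {'level': 12}
```

Keeping the routing decision inside one layer is what lets developers, as the article puts it, take new features into use or tweak usage patterns without each service knowing which cluster it talks to.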
