
Taking a closer look at the Milton Keynes smart city project

The MK:Smart project took home the Best of Show prize at the recent VMworld Europe show in Barcelona – and this is why...

“Wouldn’t it be nice if all places were like Milton Keynes?” was the tagline for a series of adverts the town ran in the 1980s to convince people it was an appealing place to live.

Since then the population has grown rapidly, prompting the council to look for ways to ensure the local area’s infrastructure is not overloaded, that people can get around the place with ease and local resources are used as efficiently as possible.

This work has paved the way for the MK:Smart project, which recently won the Best of Show prize at the 2015 VMworld Europe User Awards, in recognition of the transformational impact this project could make on the way UK towns and cities operate in the future.

The initiative is being led by distance learning pioneer The Open University, and aims to collect data from around the city to make it more sustainable, analysing water, energy and transport use around the town.

The project is hosted at the Open University’s main campus in Milton Keynes, and backed by a failover site elsewhere in the town for disaster recovery purposes, says Julian Gilbert, a systems and network administrator at the Open University.

Core to the success of the project is the MK Data Hub, which The Open University and BT created to process the vast amounts of information generated by Milton Keynes’ smart city systems.

It provides an infrastructure for acquiring, managing and sharing multiple terabytes of data about the town’s energy and water consumption, transport, weather and pollution. The information is sourced from satellite technology, sensor networks and social and economic datasets.

Third-party developers and SMEs can access the data to create apps which help residents use the town’s resources.

“We need to ensure the data is constantly available for any project using the information feeds in their own application and for the people running those applications,” says Robbie Bays, systems and network administrator for the MK:Smart project.

“Our team needs historical data to be available so we can analyse trends.”

Data sources and business continuity

It’s an ever-evolving project, says Mathieu D’Aquin, data hub lead for MK:Smart. The range of places from which it draws data will grow over time, meaning it needs to be underpinned by a heterogeneous and scalable recovery management system. To fulfil this brief, the project team uses the Arcserve Unified Data Protection (UDP) system.

“At the moment, more than 70 sensor feeds are available on the hub, each including between one and 10 streams of the readings from as many sensors,” says D’Aquin.


“These sensors are looking at a variety of different things, from the occupancy of car parks, to the usage of recycling bins, the weather, soil moisture and water levels.”
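The feed-and-stream structure D’Aquin describes, where one feed bundles between one and 10 streams of readings, can be sketched in a few lines of Python. The class and field names below are illustrative assumptions, not the MK Data Hub’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Reading:
    timestamp: str   # ISO 8601, e.g. "2015-11-01T09:30:00Z"
    value: float

@dataclass
class Stream:
    # One stream per physical sensor, e.g. occupancy of a single car park
    sensor_id: str
    unit: str
    readings: list = field(default_factory=list)

@dataclass
class Feed:
    # A feed groups a small number of related streams (one to 10 in MK:Smart)
    feed_id: str
    streams: list = field(default_factory=list)

# Illustrative feed: two car-park occupancy streams (names are made up)
feed = Feed("carparks-central", [
    Stream("cp-01", "spaces"),
    Stream("cp-02", "spaces"),
])
feed.streams[0].readings.append(Reading("2015-11-01T09:30:00Z", 42.0))
print(len(feed.streams))  # 2
```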

There’s a genuine geographic spread to the project, so it’s not limited to one particular area, says D’Aquin.

“The sensors are spread all over the city, supported by a variety of networking technologies, including Wi-Fi, GSM/3G and ultra-narrow band,” he says.

“A low-power wide-area network is being deployed using ultra narrow band, with a handful of base stations covering most of the Milton Keynes area, enabling a range of cheap, battery-powered sensors to connect to the MK Data Hub.”
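Ultra-narrow-band links carry very small payloads, so cheap, battery-powered sensors of the kind described above typically pack their readings into a few bytes rather than sending verbose text. The field layout below is an assumption for illustration, not the project’s actual wire format:

```python
import struct

def encode_reading(sensor_id: int, soil_moisture_pct: float, battery_mv: int) -> bytes:
    # 2-byte sensor id, 2-byte moisture in 0.01% steps, 2-byte battery millivolts
    return struct.pack(">HHH", sensor_id, int(soil_moisture_pct * 100), battery_mv)

def decode_reading(payload: bytes):
    sensor_id, moisture_raw, battery_mv = struct.unpack(">HHH", payload)
    return sensor_id, moisture_raw / 100.0, battery_mv

payload = encode_reading(17, 42.5, 3300)
print(len(payload))            # 6 bytes, small enough for a narrowband uplink
print(decode_reading(payload)) # (17, 42.5, 3300)
```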

Standing up the MK:Smart project

The data-gathering system uses virtualisation from VMware, but Gilbert says many companies are involved in the project, including BT. It runs on an array of Dell servers, Dot Hill SANs and HP switching technology.

The project needed careful thought about how the system hung together, as virtualisation was a key component, adds Gilbert.

“We needed an infrastructure that made it easy to spin up virtual machines,” he says.

Cloud orchestration is handled by CloudStack, so project users can spin up virtual machines for research, while Arcserve’s technology covers disaster recovery.
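Spinning up a research VM through CloudStack means calling its HTTP API, where each request is authenticated by signing the query string: parameters are sorted, lowercased, HMAC-SHA1-signed with the user’s secret key, then base64-encoded. A minimal sketch of that signing step, with placeholder key and ID values rather than anything from the project:

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_request(params: dict, api_key: str, secret_key: str) -> str:
    """Build a signed CloudStack API query string. The signing scheme is
    CloudStack's documented one; the keys and IDs used below are placeholders."""
    params = dict(params, apikey=api_key)
    # Sort by key, URL-encode values, then lowercase the whole string for signing
    query = "&".join(
        f"{k}={urllib.parse.quote(str(v), safe='*')}"
        for k, v in sorted(params.items())
    )
    digest = hmac.new(secret_key.encode(), query.lower().encode(), hashlib.sha1).digest()
    signature = urllib.parse.quote(base64.b64encode(digest).decode(), safe="")
    return f"{query}&signature={signature}"

# Illustrative deployVirtualMachine call (offering/template/zone IDs are made up)
qs = sign_request(
    {"command": "deployVirtualMachine",
     "serviceofferingid": "small", "templateid": "ubuntu", "zoneid": "mk-1",
     "response": "json"},
    api_key="demo-api-key", secret_key="demo-secret")
print("signature=" in qs)  # True
```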

Arcserve’s technology was chosen on the strength of its flexibility. “It gave us the ability to run both physical and virtual servers, as well as supporting both Windows and Linux. Windows is for the Active Directory side, while Linux is used to manage the sensors,” says Gilbert.

It also offered deduplication capabilities at source, allowing the organisation to cut down on the amount of storage it required from the start. This is a feature few others offer, according to Gilbert.
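The principle behind source-side deduplication is simple: hash each block of backup data and only transfer blocks not seen before. Arcserve UDP’s actual implementation is proprietary, so the fixed-size-block version below is only a toy illustration of the idea:

```python
import hashlib

def dedup_blocks(data: bytes, block_size: int = 4096, seen=None):
    """Split data into fixed-size blocks and keep only those whose hash has
    not been seen before, the basic idea behind source-side deduplication."""
    if seen is None:
        seen = set()
    new_blocks = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in seen:
            seen.add(digest)
            new_blocks.append(block)
    return new_blocks, seen

# A backup containing many identical blocks shrinks before leaving the source
data = b"A" * 4096 * 10 + b"B" * 4096
unique, _ = dedup_blocks(data)
print(len(data), sum(len(b) for b in unique))  # 45056 8192
```

Because the filtering happens before anything crosses the network, both the transfer and the stored copy shrink, which is why deduplicating at source cuts storage requirements from the start.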

Data-gathering potential

As with many data-gathering initiatives, success hinges on how well the data can be collected and processed, and the MK:Smart project is gathering pace in this area.

“The amount of data collected by the project is quickly increasing. Currently, close to 200 datasets are included, with many more being processed to reach the objective of several thousand datasets by end 2016. Each of these datasets represents different volumes, from small, statistical data, to time-granular streams generating GB of data per day,” says D’Aquin.
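To put “GB of data per day” in context, a back-of-envelope calculation with assumed figures (these numbers are illustrative, not the project’s) shows how quickly time-granular streams accumulate:

```python
# Assumed figures for illustration only: 1,000 sensors, one 100-byte reading
# per sensor every 10 seconds.
reading_bytes = 100
sensors = 1_000
readings_per_day = 24 * 60 * 60 // 10   # 8,640 readings per sensor per day
daily_bytes = reading_bytes * sensors * readings_per_day
print(daily_bytes / 1e9)  # 0.864, i.e. close to 1 GB per day for one stream
```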

While the project is still in private beta, it is too soon to assess efficiency savings, the project team says.

Once this phase of the project is complete, the MK Data Hub will be publicly accessible, says D’Aquin, but – at the moment – there is no requirement for the data to be available in real time.

“The data stored in this infrastructure won’t be confidential,” says Gilbert. He says there will be individual security and usage policies, according to the data gathered and who owns the dataset. These owners will also determine who has access.

“The accessibility of the data is, however, left under the control of the data provider, and the MK Data Hub already contains datasets with a range of access and usage policies, from open data to private datasets only accessible to their owners.”
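A per-dataset policy of the kind described, ranging from open data to owner-only access, can be sketched as a simple check. The field names and owner identifiers here are assumptions, not the MK Data Hub’s real access model:

```python
from enum import Enum

class Access(Enum):
    OPEN = "open"        # anyone can read
    PRIVATE = "private"  # only the dataset owner can read

def can_read(dataset: dict, user: str) -> bool:
    # Illustrative policy check: the owner always retains access,
    # everyone else is admitted only to open datasets.
    if dataset["access"] is Access.OPEN:
        return True
    return user == dataset["owner"]

weather = {"owner": "provider-a", "access": Access.OPEN}
water = {"owner": "provider-b", "access": Access.PRIVATE}
print(can_read(weather, "app-developer"))  # True
print(can_read(water, "app-developer"))    # False
print(can_read(water, "provider-b"))       # True
```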

Given that the project is still a beta trial, the full implications of the MK:Smart project remain to be seen – but it is already attracting interest from other parts of the country.

“I understand this is the first of this kind,” says Gilbert, but other people are looking at how the project pans out, he adds. “We’re certainly very happy with it so far.”

Also on the agenda for the project – which is earmarked for completion in two years’ time – is the introduction of big data capabilities for better analysis of the data, says Gilbert.
