Two centuries is a long time to remain in business. Ordnance Survey has been a creator of maps since before Wellington’s victory at Waterloo. While the company’s legacy is one to be proud of, the nature of its market has changed, so the company, like the Iron Duke on that Belgian battlefield, has had to adapt its strategy.
While the company’s brand is well known for its quality paper maps, the business demand now is for accurate, accessible, digital spatial data.
Spatial data processing demands vast amounts of computing power, and not only is this a big data challenge, it has similarly weighty implications for the economy. Mapping information is vital to government and businesses alike.
Ordnance Survey has all but completed a five-year IT improvement programme to enhance its operations. That programme – with Oracle as the main IT partner – has already transformed those operations into an enterprise grid computing system that pulls 17 databases into one Oracle spatial database management platform.
The platform supports all geospatial data types and models. The system combines open source Linux with Oracle’s grid computing architecture, which makes it possible to coordinate large numbers of low-cost servers and corresponding storage so they operate like one large computer.
Availability and scalability
Ordnance Survey’s upgrade of operations has been driven by the enduring business challenge of delivering accurate, up-to-date geographic data to customers. That demands a database that is continuously available to service the company’s products. Clustering technology in the Ordnance Survey grid architecture enables the company to scale up the system at any time – without the need for costly hardware upgrades.
Long-term cost of ownership is also an important issue. As Ordnance Survey can never predict how many customers or orders it is going to have, it needs flexibility to accommodate almost any volume of requests. Capacity is an ongoing concern, so scalability is always a crucial consideration.
So Ordnance Survey has created a single database with the flexibility to collect almost anything it wants for the design of new products. The database acts as a hub where Ordnance Survey can choose the components and items needed to create, launch and maintain new products. At the same time it streamlines the company’s maintenance of data quality – which keeps customers happy and future-proofs the business.
Linked data technology
As Ordnance Survey approaches the end of the transformation of its operations, it is preparing its data to exploit the myriad interconnections that can exist between physical entities in what has been described as the “Internet of Things”. This web of interconnections between disparate objects and ideas is made possible through linked data technology.
Linked data assigns a unique tag – a uniform resource identifier (URI) – to each thing of interest, and expresses facts about those things as three-part statements known as triples. For example, population data can be linked to socio-economic statistics for a given town.
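To make the idea concrete, here is a minimal sketch of the triple model in Python. It is illustrative only, not Ordnance Survey's actual data model, and the `example.org` URIs and figures are invented for the example.

```python
# Illustrative sketch of linked-data triples: each fact is a
# subject-predicate-object statement, with URIs identifying things.
# All URIs and values below are hypothetical.

from typing import NamedTuple

class Triple(NamedTuple):
    subject: str    # URI identifying the thing of interest
    predicate: str  # URI naming the relationship
    obj: str        # URI or literal value

# A town linked to two socio-economic statistics (invented values)
triples = [
    Triple("http://example.org/id/town/southampton",
           "http://example.org/def/population",
           "253651"),
    Triple("http://example.org/id/town/southampton",
           "http://example.org/def/medianIncome",
           "28300"),
]

def facts_about(subject: str, store: list) -> dict:
    """Collect all predicate/object pairs recorded for one subject URI."""
    return {t.predicate: t.obj for t in store if t.subject == subject}

info = facts_about("http://example.org/id/town/southampton", triples)
print(info["http://example.org/def/population"])  # prints "253651"
```

Because every subject and predicate is a URI, triples published by different organisations can refer to the same thing and so be joined together – which is what allows the Linked Data Web described below to grow.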
Publication of this type of linked data has grown into a Linked Data Web, currently estimated to include more than 30 billion triples, with some 20% of those having geographic content.
This capacity to identify everything, and to make it possible to interact with that identifying data, has immense potential. And the significant percentage of geographic data content means the opportunities for exploiting spatial information are huge.
Data management challenges
Ordnance Survey’s story of a journey from a partitioned and complicated operation to a unified system is one that should resonate with many industries.
Energy companies, for example, are looking at embarking on a similar journey. They are at the start of a data revolution that will severely test their data management systems.
The smart metering programme that will put advanced metering into every home and business in the UK will create commercial opportunities for energy suppliers. But it will also generate data in quantities and at rates that energy companies have never before encountered. However, by upgrading their data management they can exploit this data to their advantage and to the benefit of their customers.
The first data management challenge for utilities will come with the logistics and planning involved in installing the new meters. The challenge will then continue with the torrents of information generated by smart metering once it is fully established in the UK.
Conventionally, utilities analyse data in batches. Smart metering will enable them to do so in real time, while potentially generating volumes of data unprecedented in the sector. With that will come a data handling and storage challenge. And there will be the major security risk that comes with any project that generates masses of data that must be moved around.
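The contrast between batch and real-time analysis can be sketched briefly. This is an illustrative example only, not a utility's actual system; the reading values and window size are invented.

```python
# Illustrative contrast: batch analysis of meter readings vs an
# incremental (streaming) statistic updated as each reading arrives.
# Values are invented for the example.

from collections import deque

def batch_average(readings):
    """Conventional approach: collect a full batch, then analyse it."""
    return sum(readings) / len(readings)

class RollingAverage:
    """Streaming approach: update the statistic on every new reading."""
    def __init__(self, window: int):
        self.window = deque(maxlen=window)  # keeps only the last N readings

    def add(self, reading: float) -> float:
        self.window.append(reading)
        return sum(self.window) / len(self.window)

readings = [0.42, 0.51, 0.47, 1.10, 0.95]  # kWh per half-hour (invented)

overall = batch_average(readings)       # one answer, after the fact
roller = RollingAverage(window=3)
live = [roller.add(r) for r in readings]  # an answer after every reading
```

The streaming version never holds the full history in memory, which is one reason real-time smart-meter analytics changes the storage and data-handling picture rather than simply adding more of the same batch workload.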
Many of Ordnance Survey’s data management experiences are replicable today in the power and gas industries, as well as in public sector organisations. In most instances, these organisations have not unlocked the value of the data that they hold – often in silos – such as customer contacts, street works, asset management and so on. Geographic information and locational data can better equip them to meet these challenges.
The chief reward for Ordnance Survey from its review of data management has been in the increase in value of consistent and accurate data arising from seamless and flexible access to all categories of data. Ultimately, the gain to the business is in a greater capacity to meet customers’ current needs and to respond to emerging demands.
Marc Hobell (pictured) is Ordnance Survey’s head of public sector, energy and infrastructure