One thing is clear about the impact of the internet of things (IoT) on storage – there can be no single method to deal with the data generated by things, not least because there is such a wide variety of data generators and data types.
It is also clear that data volumes are large and growing, that the datacentre needs to adapt to deal with them, and that cloud-based storage may be one solution – but not the only one.
And security is just one of the many other challenges the IoT presents.
IoT data can broadly be divided into two extreme types. At one end of the spectrum are the likes of large audiovisual files, such as those generated by surveillance cameras. At the other are the host of tiny log files generated by environmental sensors and the like.
The input/output (I/O) profiles of each data type, in terms of reading and writing, differ so much that it is not realistic to design a one-size-fits-all IoT storage architecture if the data generated includes both types.
Examples of IoT diversity are not hard to find. A leading application is the smart city, which combines a range of functions, including environmental monitoring, video surveillance and traffic management. The smart city project in Milton Keynes, UK, for example, incorporates applications such as smart drones and has hosted driverless car tests.
Elsewhere, Ericsson’s Maritime ICT Cloud uses IoT-derived data to replace the manual updating of traffic, cargo, port, weather and safety information for shipping companies. It connects vessels at sea with shore-based operations, maintenance service providers, customer support centres, fleet/transportation partners, port operations and authorities.
Meanwhile, German car manufacturer Daimler uses IoT data to automate safety procedures when its vehicles are on the move. For example, Daimler trucks are fitted with proximity control, stop-and-go assist, emergency brake assist, lane-keeping assist and 3D maps to help drivers maintain safe distances from other vehicles. The firm has also developed stereo cameras and radar sensors to help improve driver response times.
The internet of things brings multiple implications for datacentre and storage design. First, it will be critical to get the data off devices, which mostly contain little internal storage, and onto a secured, backed-up storage system.
Not only is this data unique, but it can be very valuable. It could include environmental data that enables the business to track costs, or core sample data from exploration teams in the Arctic, for example.
Data from sensors and the like comprises large numbers of small chunks of data, and will require high levels of I/O.
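To illustrate why huge numbers of tiny records stress storage I/O, the sketch below batches hypothetical sensor log entries into larger sequential writes instead of issuing one write per record. The file name, batch size and record format are illustrative assumptions, not part of any particular IoT platform.

```python
import json

# Hypothetical sensor log entries: each only a few dozen bytes
entries = [{"sensor": f"s{i}", "value": i * 0.1} for i in range(1000)]

BATCH_SIZE = 100  # assumed batch size; in practice tuned to the storage layer

# Writing one record per I/O would mean 1,000 small writes; batching
# coalesces them into 10 larger sequential writes
batches = [entries[i:i + BATCH_SIZE] for i in range(0, len(entries), BATCH_SIZE)]
with open("sensor_log.ndjson", "w") as f:
    for batch in batches:
        f.write("\n".join(json.dumps(e) for e in batch) + "\n")
```

The trade-off is familiar: batching reduces the I/O operation count at the cost of a short delay before each record is durable.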
Much of this real-time data will be stored in databases and, to be analysed correctly, will need to be processed in the right order. For example, a temperature increase may not be correctly correlated with other data – such as component wear – if the data points arrive in the wrong order.
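The ordering problem can be sketched in a few lines: buffer incoming readings and sort them by timestamp before correlating, so a temperature reading is matched against the wear measurement that actually preceded it. The tuple layout and sensor names here are purely illustrative assumptions.

```python
from operator import itemgetter

# Hypothetical out-of-order sensor readings: (timestamp, sensor, value)
readings = [
    (1002, "temperature", 71.5),
    (1000, "temperature", 70.1),
    (1001, "wear_mm", 0.42),
    (1003, "wear_mm", 0.44),
]

# Sort by timestamp so correlation sees events in the order they occurred
ordered = sorted(readings, key=itemgetter(0))

# Pair each temperature reading with the most recent wear reading seen so far
pairs = []
last_wear = None
for ts, sensor, value in ordered:
    if sensor == "wear_mm":
        last_wear = value
    elif sensor == "temperature" and last_wear is not None:
        pairs.append((ts, value, last_wear))
```

Without the sort, the 71.5-degree reading could be paired with a wear value recorded after it, producing a spurious correlation.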
This implies the need for very fast storage – certainly solid-state drives (SSDs) – especially if processing is to be performed as near to real time as possible.
Over time, this is likely to drive demand for the kinds of high-speed, flash-replacement technologies being developed by a number of manufacturers, such as Samsung’s multi-stream controller technology, which lists lower power among its advantages; Intel’s 3D XPoint; and, further out, magnetic RAM (MRAM).
Large data objects, on the other hand, require sequential transfers, so object storage could be the best way to manage, store and retrieve this type of data.
So, one of the biggest changes will be the need for more storage capacity. This means greater capital expenditure, greater pressure on the in-house datacentre and the cloud provider to keep a lid on costs, and a need for more storage management by IT teams.
Enterprises may decide to fall back on cloud-based storage, which is also where much data analytics processing could take place.
As the number of devices in businesses and homes grows into billions, the supporting infrastructure needs to change. Large volumes of transactions from large numbers of distributed devices will stretch many central datacentre systems to the limit.
Much of the traffic – raw data from the field – will be incoming. Processing may occur in the datacentre but, to avoid distance latency, it is more likely to happen locally, close to where the data is generated.
To be useful for real-time analysis, IoT data will need low latency between servers and storage, so these functions will need to get closer and, in some instances, almost merge, as with hyperscale and hyper-converged infrastructure.
If using the cloud, the key is to ensure the provider’s service-level agreements (SLAs) include metrics such as latency between storage and processing functions.
And, because low latency is critical, we are likely to see more, smaller datacentres built closer to the data. The results of analysis may be fed back to the core, but the raw data itself need not be. This will also affect wide area network (WAN) bandwidth requirements.
Datacentre endpoint devices may need to change because data flows into the datacentre will increase – until now, the datacentre has generally been a generator of data for consumption. The volumes and types of data will fluctuate, as immature technologies evolve, creating a need for the datacentre to remain flexible, able to expand and contract as required.
Security remains critical. Sensors are often fairly insecure, so the enterprise will rely on the security provided by the datacentre or the cloud provider. Meanwhile, the sheer variety and number of devices will create security challenges, as the inter-relationships between them add complexity.
Start planning now
One thing that the IoT’s machine-to-machine (M2M) communication has going for it, compared with the human-driven data streams that most systems are designed to handle, is that devices can be told what to do and when to do it, which may help designers of tomorrow’s processing and storage systems to even out the data flows.
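One way designers might exploit that control is to stagger device reporting deterministically, so devices do not all report at the same instant. The sketch below spreads hypothetical devices across a reporting window by hashing their IDs; the window length and device naming scheme are assumptions for illustration only.

```python
import hashlib

REPORT_WINDOW_S = 60  # assumed window: each device reports once per minute

def report_offset(device_id: str) -> int:
    """Deterministically assign each device a second within the window,
    spreading load roughly evenly rather than creating a thundering herd."""
    digest = hashlib.sha256(device_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % REPORT_WINDOW_S

# 10,000 hypothetical devices end up roughly evenly spread over 60 slots
offsets = [report_offset(f"device-{i}") for i in range(10_000)]
per_slot = [offsets.count(s) for s in range(REPORT_WINDOW_S)]
```

Because the offset is derived from the device ID, no central coordination or per-device configuration is needed, yet the peak arrival rate at the storage tier drops from thousands per second to roughly the average.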
Despite that, the IoT has profound implications for the datacentre in general and storage in particular – high speed, low latency and high capacities are likely to be at a premium. The time to start planning is now.