Industry forecasts suggest that more data will soon be generated at the so-called edge than within today’s existing computer networks. The edge is regarded as the next frontier, bridging the analogue and digital worlds: network connectivity enables data from analogue sensors to be streamed and processed to improve reliability, enhance customer experience and boost efficiency.
In an industrial context, data acquisition devices are used to collect data from sensors to measure heat, light, pressure or any other parameter that can be monitored to ensure that a machine works safely and optimally. Industrial control systems then process this data to make decisions that control how the equipment being monitored functions.
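The acquire-decide-act loop described above can be sketched in a few lines. This is a hypothetical illustration only: `read_temperature()` stands in for a real data acquisition driver, and the threshold value is an assumed example, not a real safety limit.

```python
import random

def read_temperature() -> float:
    """Stand-in for a sensor driver: simulates a reading in degrees Celsius."""
    return 60.0 + random.uniform(-15.0, 15.0)

MAX_SAFE_TEMP = 70.0  # assumed safety threshold for this illustration

def control_step() -> str:
    """One control cycle: acquire a reading, decide, and return the action."""
    reading = read_temperature()
    if reading > MAX_SAFE_TEMP:
        return "throttle"  # slow the machine down to let it cool
    return "run"           # parameters nominal; keep running

# Each pass through the loop mirrors what an industrial control system does
# continuously: monitor a parameter and adjust the equipment accordingly.
for _ in range(5):
    print(control_step())
```

A real industrial controller would run such a loop at a fixed cycle time against many sensors at once, which is exactly the workload the article argues should stay at the edge.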
In the past, all data processing ran locally on the industrial control system. There is industry consensus that real-time data processing for decision-making, such as that needed in an industrial control system, should run at the edge and not in the public cloud. But there are still many benefits in using the public cloud or an on-premise datacentre to assimilate data across installations of internet of things (IoT)-connected machines: such data aggregation can be used to improve machine learning algorithms.
In the past, data acquisition and control systems were considered operational technology, and so were outside the remit of enterprise IT. But, as Tom Bradicich, global head of the edge and IoT labs at HPE, explains, IT has a role to play in edge computing.
Bradicich’s argument is that edge computing can provide a converged system, removing the need for the standalone devices previously managed by the people in the organisation responsible for operational technology (OT). According to Bradicich, convergence is a good thing for the industry because it is convenient, makes devices easier to buy, lowers cost, improves reliability and reduces power consumption, because all the disparate functions required by an industrial system are integrated in one device.
Bradicich believes convergence in IoT will be as significant as the convergence of cameras and music players into a single device such as the iPhone, which made Apple the biggest music and camera company in the world. For Bradicich, convergence at the edge will lead to industry disruption, similar to what happened when smartphones integrated several pieces of functionality that were previously only available as separate devices. “The reason Uber exists is because there is a convergence of GPS, the phone and the maps,” he says. “This disrupts the whole industry.”
According to Forrester’s Predictions 2020: Edge computing report, there is unlikely to be one company that provides all the software and hardware required for edge computing. It advises organisations looking at edge computing to avoid trying to develop something themselves. “As companies deal with bandwidth and connectivity limitations throughout the world, they will soon realise that edge compute platforms and connectivity are too complex and costly to design, maintain or connect,” says Forrester. “Companies will work with edge compute integrators for a particular market to support their edge solutions instead of building and deploying their own.”
Forrester analyst Abhijit Sunil adds: “We feel the edge ecosystem involves much more than what a single partner can bring. This will become more evident during 2020, when customers try to implement edge computing. The connectivity to end-point devices, management of storage, planning and deployment may all come from various sources.”
Forrester has forecast that the edge cloud service market will grow by at least 50%. Its Predictions 2020 report notes that public cloud providers such as Amazon Web Services (AWS) and Microsoft; telecommunication companies such as AT&T, Telstra and Vodafone Group; platform software providers such as Red Hat and VMware; content delivery networks including Akamai Technologies; and datacentre colocation providers such as Digital Realty are all developing basic infrastructure-as-a-service (IaaS) and advanced cloud-native programming services on distributed edge computing infrastructure.
Sunil says 5G networking promises high bandwidth and low latency, which positions it as a unified network protocol across all devices. “In a retail park or hospital, if 5G is already there, you have a highly reliable, low-latency, high-bandwidth connectivity for edge computing,” he says.
As the market for edge computing evolves, Forrester says telecommunications firms are contributing to open source projects such as Akraino, a 5G edge computing stack. Colocation providers such as Equinix are also investing in software abstraction layers that run on their distributed infrastructure.
According to Forrester, the providers’ goal is to build out IaaS and platform-as-a-service (PaaS) offerings that do not need public cloud services, but may provide intermittent connectivity to public cloud and on-premise datacentres. “In 2020, this nascent market will begin to see explosive growth as startups partner with enterprises and large vendors to explore possible business models that depend on near real-time responsiveness for customer empowerment,” says the Forrester report.
Sunil believes new categories of device will come to market, aimed at edge computing, to perform data acquisition and real-time data processing. HPE, for instance, recently began backing Pensando, a company run by former Cisco CEO John Chambers.
In a recent blog post about the funding, Mark Potter, CTO at HPE and director of Hewlett Packard Labs, wrote: “By becoming the first solutions providers to deliver software-defined compute, networking, storage and security services to where data is generated, HPE and Pensando will enable our customers to dramatically accelerate the analysis and time-to-insight of their data in a way that is completely air-gapped from the core system.”
For many CIOs, a strategy for edge computing will be entirely new. Sunil urges CIOs to assess what parts of edge computing can be achieved in-house and what should be done through a consulting firm. “A system integrator will play a big role in bringing it all together,” he says.
Chris Lloyd-Jones, emerging technology, product and engineering lead at Avanade, says large enterprises are starting to build IoT platforms to centrally manage edge computing devices and provide connectivity across geographic regions. “Edge computing is no longer just about an on-board computer where data from the device is uploaded via a USB cable,” he says. “Edge computing now handles 4G and 5G with periodic connectivity, and supports full-scale machine learning and computationally intensive workloads. Data can be transmitted to and from the cloud. This provides centralised management.”
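The “periodic connectivity” Lloyd-Jones mentions is usually handled with a store-and-forward pattern: readings are buffered locally and flushed to the cloud whenever a connection is available. A minimal sketch, with the bounded capacity and the `uplink` callback as illustrative assumptions:

```python
import collections

class EdgeBuffer:
    """Buffers sensor readings on-device while connectivity is down."""

    def __init__(self, capacity: int = 1000):
        # Bounded queue: the oldest readings are dropped if the device
        # stays offline long enough to exhaust local storage.
        self.queue = collections.deque(maxlen=capacity)

    def record(self, reading):
        """Called on every acquisition cycle, online or offline."""
        self.queue.append(reading)

    def flush(self, uplink) -> int:
        """Send everything buffered once connectivity returns."""
        sent = 0
        while self.queue:
            uplink(self.queue.popleft())
            sent += 1
        return sent

buf = EdgeBuffer()
for reading in (21.3, 21.7, 22.1):  # readings taken while offline
    buf.record(reading)

received = []                        # stand-in for a cloud endpoint
print(buf.flush(received.append))    # → 3
```

The bounded deque is the key design choice: on a device with limited storage, deciding which data to sacrifice when offline is part of the edge architecture, not an afterthought.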
Lloyd-Jones says the cloud can be used to train machine learning models, which can then be deployed to edge devices and managed like any other IT equipment. “Edge computing helps organisations move from disparate devices to a common set of standards,” he says. “This makes it possible to provision hundreds of devices in one go. Standardisation means that microchip updates can be properly deployed.”
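The train-in-the-cloud, score-at-the-edge pattern Lloyd-Jones describes can be sketched as below. The “training” here is deliberately trivial (a threshold derived from fleet data) and the file-based push is a stand-in: a real deployment would export a proper model artefact and distribute it through the platform’s device-management tooling.

```python
import json

def train_in_cloud(samples):
    """Cloud side: derive a simple anomaly threshold from aggregated fleet data."""
    mean = sum(samples) / len(samples)
    return {"threshold": mean * 1.2}  # flag readings 20% above the fleet mean

def push_to_edge(model, path="model.json"):
    """Stand-in for pushing the model artefact down to a device."""
    with open(path, "w") as f:
        json.dump(model, f)
    return path

def score_at_edge(reading, path="model.json"):
    """Device side: load the pushed artefact and classify one local reading."""
    with open(path) as f:
        model = json.load(f)
    return "anomaly" if reading > model["threshold"] else "normal"

# Aggregated readings from many machines train the model centrally...
model = train_in_cloud([58.0, 61.5, 60.2, 59.1])
push_to_edge(model)
# ...and each device then scores its own readings locally, offline if need be.
print(score_at_edge(85.0))  # → anomaly
```

Because inference needs only the small exported artefact, the device can keep classifying readings even when its cloud connection is down, which is the point of deploying the model to the edge in the first place.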
Replacing every piece of operational tech with edge computing is a long-term project, but Lloyd-Jones says firms can start reaping the benefits of edge computing in a matter of weeks, and some products can be retrofitted with it.
For instance, Starbucks in the US has fitted Microsoft’s Azure Sphere edge computer to its coffee machines. The IoT-enabled machines collect more than a dozen data points for every shot of espresso pulled, from the type of beans used to the coffee’s temperature and water quality, generating over 5MB of data in an eight-hour shift.
According to Microsoft and Starbucks, the connectivity means that coffee shops can not only identify problems with their machines, but Starbucks can also send new coffee recipes directly to machines, replacing the manual task of updating every machine via USB. Jeff Wile, senior vice-president of retail and core technology services at Starbucks Technology, says: “We have to get to 30,000 stores in nearly 80 markets to update those recipes. The recipe push is a huge part of the cost savings and the justification for doing this.”
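The mechanics of such a recipe push can be sketched as below. The payload fields and the `CoffeeMachine` class are hypothetical illustrations, not Starbucks’ actual schema: the point is that one published configuration is applied by every connected device, replacing a manual USB update per machine.

```python
import json

# Illustrative over-the-air update payload published by the back end.
RECIPE_UPDATE = json.dumps({
    "recipe": "espresso",
    "version": 42,
    "shot_temp_c": 92.0,
    "extraction_secs": 27,
})

class CoffeeMachine:
    """Stand-in for an IoT-connected machine receiving pushed configuration."""

    def __init__(self):
        self.recipes = {}

    def apply_update(self, payload: str) -> str:
        """Parse a pushed recipe, store it, and confirm what was applied."""
        update = json.loads(payload)
        self.recipes[update["recipe"]] = update
        return f'{update["recipe"]} v{update["version"]}'

# One publish reaches the whole fleet; three machines stand in for thousands.
fleet = [CoffeeMachine() for _ in range(3)]
for machine in fleet:
    machine.apply_update(RECIPE_UPDATE)

print(fleet[0].recipes["espresso"]["version"])  # → 42
```

Versioning the payload matters in practice: it lets the back end confirm which machines have applied which recipe, the kind of fleet-wide visibility a USB workflow cannot provide.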
According to Lloyd-Jones, the availability of edge computing devices that support languages such as C# and Python, and Docker containers, has made it easier for enterprise IT departments to start working on IoT ideas.
But some older-generation devices, which have limited memory, storage and processing power, may be Linux-based and can only be programmed in C, he says. “Classroom training is needed to learn how to program in a resource-constrained environment.”
Traditional tools for checking code quality and static code analysis should be used to ensure the code developed is of high quality. In traditional enterprise software development, some compiler warning messages can safely be ignored, but a warning still indicates to the programmer that something is not quite right. On a resource-constrained device, potentially operating a safety-critical system in a harsh environment, is ignoring it a risk worth taking?
Enterprise software developers need to grasp not only how to write efficient code, but also how the edge computing apps they develop can fail in a safe and predictable manner. At a higher level, there is also the question of what goes into the edge device. The idea of lean manufacturing – building a minimum viable product, deploying it and having a feedback loop to develop enhancements – is unlikely to fit well where the edge computing device is used as operational technology to control machinery.
The question is then: how much functionality should be built into the product? If a sensor needed to measure something in the physical world is not fitted to the edge computing device, the data it could have captured is simply never acquired.
“As soon as an edge computing device designed for one use case maxes out in terms of its uses, it is no longer valid,” says Sunil. “The device is redundant. Instead, organisations need to think ahead about how to make use of the data they plan to collect to drive a better customer experience.”