Understanding the IoT business environment
Growing demand for low-latency web connections as the internet of things (IoT) trend takes hold is set to drastically change the way the datacentre industry operates over the coming decade, market watchers predict.
IoT will fuel demand for low-latency connections and faster data processing times, but user expectations will be impossible to meet unless the industry commits to building datacentres closer to where users are, according to speakers at the 2016 Datacloud Europe conference in Monaco.
Mark Bidinger, president of the cloud and service provider segment at datacentre energy management firm Schneider Electric, said the rising number of internet-connected devices meant the industry must rapidly rethink how it delivers compute capacity.
“The internet of things is going to [drive the need for] a new method of deployment in the cloud computing industry, and the datacentre industry as a whole, meaning it needs to be more agile in both compute and deploy,” he said.
In the IoT era, datacentre providers will need to collect and process massive amounts of data before sending the resulting insights back to users so they can make real-time decisions, he continued.
Speed will be crucial in these scenarios, meaning the closer the industry can get this data processing power to the users, the better.
To emphasise this point, Bidinger talked about the commercial implications of relying on high-latency web connections, in the context of internet giants such as Google and Amazon.
“A half-second delay will cause a 20% drop in Google’s traffic,” he said. “From an Amazon perspective, a one-tenth-of-a-second delay [in response] will impact on 1% of its sales. These [delays] are irritants and they need to be removed.”
Closing the supply and demand gap
During a separate panel discussion at the show, Eric Schwartz, European president of colocation giant Equinix, said his organisation was well-versed in the challenges colocation firms come up against when trying to serve the needs of a global user base from a couple of locations.
“More than half of our revenues today come from customers that deploy with us in multiple regions around the world,” he said. “Many of our customers are deploying applications and services that are intended for global markets, yet you can’t serve a global market from one or two locations. You have to get close to the users.
“They need their infrastructure to be distributed to reach the performance requirements of their customers, and that needs to be at a speed that has never been seen before,” Schwartz added.
Life on the edge
In another session at the show, William O’Connell, director of sales at datacentre provider Commscope, addressed the question of where these edge datacentres should be located, suggesting colocation providers consider taking over old telephone exchanges.
“Carriers are depopulating these buildings and there is a lot of real estate in Europe that was previously used for telephone exchanges,” he said.
Although sites of this type will require upgrading to meet modern datacentre standards on power and cooling, they have the advantage of already being well-served by copper or fibre network connections.
“There is fibre or copper connectivity into these buildings and they will – I suspect – be prime locations for some of these edge applications,” said O’Connell.
Read more about edge datacentres
- BBC data scientist and Coca-Cola CTO question if the datacentre is the right place to store, process and analyse internet of things data.
- Enterprises are demanding faster access to applications and ever-more processing power. Could edge datacentres fill the gap?