Analysts predict organisations will require more, and faster, data processing and services at the network perimeter.
Micro data centres that share or process data at the network edge, communicating beyond the edge to a different network, will be a challenge to manage – which represents a prime opportunity for the channel. Indeed, use cases are gradually emerging around optimising low-latency applications distributed through edge devices, according to the research director for distributed data centres at Uptime, Tomas Rahkonen.
Examples include retail applications or on-site live event ticketing, versus the "more private" edge or Internet of Things (IoT) scenario, such as industrial edge deployments in a single factory. These deployments must perform and secure critical functions while processing information in real time or close to it.
"You definitely need to think about the security, including physical security, with shared edge," Rahkonen says. "You may need a strategy for when someone gets into the facility, for example."
Play to win
Solid planning and risk assessment will be crucial, incorporating the usual factors such as resiliency, service levels, site design, costs, business requirements and limitations of different networks and stakeholders. These, though, should be considered across the full array of endpoints, systems, or processes.
"You need monitoring data, you need to assess if it can happen without staff. If they do so, what happens? Then really go a bit deeper there," Rahkonen suggests. "How do they reach all the sites? Do they set up a network of local maintenance? It can be like you're operating 100 data centres, perhaps inside the companies; there're local fibre providers, there's all that complexity."
Service providers could address these customer pain points, smoothing the path – from building a use case to planning and analysing what's required, to deployment and ongoing managed services and collaboration. Most organisations won't want the hassle of doing it all themselves, Rahkonen suggests.
Nik Grove, head of hybrid cloud at ITHQ, has worked on containerised, modular or micro data centre type solutions for clients. Micro data centres "absolutely" have a place in branch locations or anywhere a glut of processing and compute needs to be done on-site, not pulled back out to the cloud, he says.
Any data centre – modular, self-contained or traditional – still needs monitoring, storage, utilities, services, connectivity, security, governance and so on, and one or more micro installations at an endpoint will typically magnify complexity and cost, he agrees.
"Where are you going to put the thing first? Is it going to sit in the middle of the office? The physical location is still as important as ever. Then you need to be able to monitor it remotely," Grove says.
"People may not be on-site to immediately fix a problem, so you've got to have more aggressive monitoring. If it's in such a small footprint, too, is it going to overheat?"
Questions, so many questions
Ask how fast services can be deployed, how they will be monitored and managed, considering time to market and multi-year infrastructure strategies.
Will it be a couple of servers in a secure rack? What happens if the business or workloads grow? How much sense does the deployment really make? All these questions need to be worked out and chewed over before anyone commits.
"It's something new the tech industry can get their teeth into, but you've got to make sure it should be there. It's going to need to be fed and watered and maintained," Grove says.
"And, if you're talking about sharing any kind of edge facilities, it just gets more important that you understand what you're getting yourself into before you start."
Jon Abbott, technologies director for global telecom strategic clients at Vertiv EMEA, notes that machine-to-machine connectivity entails an ability to consolidate data from myriad devices in real time. Regardless, "probably no" edge applications or services yet require such low-latency, cross-network communications, with even a live proof of concept outside the lab some way off.
"The issue is how do two devices connected to two different networks benefit? After all, connectivity and authentication are currently made further upstream in the network," Abbott says.
"Precious milliseconds are lost while it happens – especially painful if the service being accessed relies on low latency."
Get ready for the shift
While it may not be an opportunity today, the rate of innovation "can change that horizon incredibly quickly", with multiple service providers working on solutions involving functional splitting or emulation that would move authentication nearer the user, Abbott says.
Simon Richardson, principal technologist for UK and Ireland at VMware, also indicates that sharing edge compute across different, independent networks isn't something he's currently exploring in pre-sales, where edge today is typically about infrastructure deployment.
...