As the world’s networks grow larger and more distributed, and customer demand for fast, low-latency services rises, service providers are increasingly evaluating and investing in edge computing technologies that move data processing from the center of their networks to the edge.
The roll-out of new 5G networks is one of the forces driving service providers’ growing interest in edge computing. 5G’s faster speeds enable high-resolution cloud gaming, industrial IoT process control, on-premises Augmented Reality (AR) guidance, and other applications that will transform markets. But for these 5G applications to realize their full potential, edge computing is needed to deliver the low latency they require.
The emergence of 5G, along with greater demand for network services that let people work and entertain themselves at home, a trend the recent pandemic has accelerated, means that service providers are no longer asking if they need more edge computing capabilities, but how fast they can bring those capabilities online.
5G Drives Need for Edge Computing
Service providers have invested heavily to build out their 5G network coverage and capacity over the past year. However, they need more than 5G alone if their networks are to support new low-latency applications. In particular, they need edge computing capabilities that eliminate network bottlenecks by moving data processing from the core of their networks to the edge.
Service providers can use edge data centers to process low-latency application data close to the edge devices that generate and consume it. By not shuttling this data between edge devices and a remote data center, they can significantly reduce latency, encouraging greater adoption of new low-latency 5G applications, and with it greater customer interest in their 5G network services.
What to Consider When Building Out Edge Computing Capabilities
Service providers must weigh a wide variety of factors as they decide when, where, and how to build out new edge computing capabilities. One of the key ways they can improve those capabilities is with edge data centers.
As they build out edge data centers, service providers need to ensure the facilities are close enough to the target edge devices to deliver the latency reductions these data centers are meant to provide. They also need to ensure the facilities have enough square footage for the racks and cabinets required to process data at the edge. In addition, they should consider how they might need to expand these edge data centers in the future as demand for low-latency applications grows.
Beyond the physical location and design of their edge data centers, service providers also need to consider how to integrate the management of these facilities into the management of their overall network. For example, new unified management platforms let service provider IT teams manage both their LANs and WLANs from a single dashboard, giving them better visibility into the performance of their edge data centers and other network infrastructure. With unified visibility across both wired and wireless domains, IT teams can optimize edge data centers and other parts of their network infrastructure to meet current business needs, and better forecast network growth, helping them make the edge computing investments needed to meet future business needs as well.
Unified management platforms also enable converged edge networks in which many routine network tasks are automated. For example, with these platforms, network profiles and configurations can be pre-defined, then pushed to new switches and access points (APs) as they are deployed on the network. By automating routine tasks, these platforms also free up IT staff to focus on more strategic and high-priority tasks.
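The pre-define-and-push workflow described above can be sketched in a few lines. This is a hypothetical illustration only: the `Device` model, the `BRANCH_AP_PROFILE` fields, and the `onboard` function are invented for this example and do not correspond to any vendor’s actual management API.

```python
from dataclasses import dataclass, field

# A network profile that administrators author once, ahead of deployment.
# Field names and values are hypothetical examples.
BRANCH_AP_PROFILE = {
    "ssid": "corp-wifi",
    "vlan": 30,
    "band": "5GHz",
    "firmware": "10.4.1",
}

@dataclass
class Device:
    """A newly discovered switch or access point awaiting configuration."""
    name: str
    kind: str  # "switch" or "ap"
    config: dict = field(default_factory=dict)

def onboard(devices, profile):
    """Push the pre-defined profile to each newly deployed device.

    This models the automation step: no per-device manual configuration.
    """
    for dev in devices:
        dev.config = dict(profile)  # copy so devices don't share state
    return devices

# Two APs come online; the platform applies the profile automatically.
new_aps = [Device("ap-lobby", "ap"), Device("ap-floor2", "ap")]
onboard(new_aps, BRANCH_AP_PROFILE)
print(new_aps[0].config["ssid"])  # -> corp-wifi
```

The design point is simply that configuration lives in one authored profile rather than on each device, which is what frees IT staff from repeating routine setup work.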
Service providers also have to consider how their edge data centers and other network infrastructure will prioritize and route traffic if they hope to meet key SLAs even when traffic traverses other service providers' networks. This will likely require them to standardize more of their network infrastructure to support virtual network slicing. Such standardization will also help drive the development of new off-the-shelf modular network components that service providers can use to reduce the time and cost of maintaining their network infrastructure and to lower mean time to repair.
By developing a plan to build out new edge computing capabilities and integrate them into their networks, service providers can use 5G and edge computing to transform how we use communications technologies. With 5G networks expanding and edge computing technology maturing, expect to soon see superfast, low-latency networks delivering new cloud gaming, industrial IoT, AR, and other applications that fundamentally change the way we work and play.
Eric Law is Vice President, Enterprise, United States and Canada at CommScope.