Edge deployment strategy dependent on 5G use case being served
SANTA CLARA, Calif.–Whether it’s autonomous driving, AR/VR-type applications or myriad data-intensive smart city initiatives, moving compute, storage and processing power closer to the network edge will be key to delivering on 5G use cases. That was the message from Alicia Abella, AT&T VP of Advanced Technology Realization, during the Fog World Congress event this week.
“In order to achieve some of the latency requirements of these [5G] services, you’re going to need to be closer to the edge. Especially when looking at say the autonomous vehicle where you have mission critical safety requirements. When we think about the edge, we’re looking at being able to serve these low latency requirements for the application.”
She listed a number of benefits to operators that can be derived from edge computing:
- Reduced backhaul traffic;
- Cost reduction by decomposing and disaggregating access functions;
- Optimization of central office infrastructure;
- Improved network reliability by distributing content between the edge and centralized data centers;
- And delivery of innovative services not possible without edge computing.
“We are busy thinking about and putting together what that edge compute architecture would look like,” Abella said. “It’s being driven by the need for low latency.” As for where, physically, edge compute power is located, that “depends on the use case. We have to be flexible when defining this edge compute architecture. There’s a lot of variables and a lot of constraints. We’re actually looking at optimization methods.”
Another aspect of deploying edge computing infrastructure Abella touched on came back to the physical deployment model: if you need to put a bunch of new infrastructure into the field, where does it go?
“We want to be able to place these nodes in a place that will minimize cost to us while maintaining quality of experience. Size, location and configuration is going to depend on capacity demand and the use cases.”
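Abella did not describe the specific optimization methods AT&T is evaluating, but the placement problem she outlines resembles a classic facility-location formulation: pick which candidate sites to build so that every service area meets its latency target at minimum cost. The sketch below is purely illustrative, with made-up sites, costs and latencies, and a single latency bound standing in for quality of experience; it is not AT&T’s model.

```python
# Toy edge-node placement: choose which candidate sites to build so every
# demand point can reach some site within a latency bound, at minimum cost.
# All numbers are hypothetical; a real model would also weigh capacity,
# backhaul load, and per-use-case QoE targets.
from itertools import combinations

# Hypothetical candidate edge sites: name -> (build cost, {demand point: latency in ms})
CANDIDATE_SITES = {
    "central_office_A": (10.0, {"downtown": 4, "stadium": 9, "highway_42": 18}),
    "cell_hub_B":       (6.0,  {"downtown": 7, "stadium": 3, "highway_42": 25}),
    "roadside_C":       (4.0,  {"downtown": 22, "stadium": 15, "highway_42": 5}),
}
DEMAND_POINTS = ["downtown", "stadium", "highway_42"]
LATENCY_BOUND_MS = 10  # stand-in QoE requirement (e.g. for an AR or V2X service)


def covers(site_subset):
    """True if every demand point is within the latency bound of some chosen site."""
    return all(
        any(CANDIDATE_SITES[s][1][d] <= LATENCY_BOUND_MS for s in site_subset)
        for d in DEMAND_POINTS
    )


def cheapest_placement():
    """Brute-force the minimum-cost subset of sites that satisfies the latency bound."""
    best_cost, best_choice = float("inf"), None
    for k in range(1, len(CANDIDATE_SITES) + 1):
        for subset in combinations(CANDIDATE_SITES, k):
            cost = sum(CANDIDATE_SITES[s][0] for s in subset)
            if cost < best_cost and covers(subset):
                best_cost, best_choice = cost, subset
    return best_choice, best_cost


if __name__ == "__main__":
    choice, cost = cheapest_placement()
    print(f"Build {choice} at total cost {cost}")
```

Tightening the latency bound, changing the mix of use cases or adding capacity constraints would change which sites get built, which is the flexibility in size, location and configuration that Abella alludes to.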