AT&T Labs exec says gaining full software control of the network in its ‘early stages’
David Wolter, AT&T Labs AVP of Radio Technology and Strategy, recently fielded questions regarding not only the operator’s ongoing trials of pre-standard fixed wireless access 5G technologies, but also how virtualization fits into the big 5G picture.
AT&T’s ECOMP, the open source NFV/SDN platform now part of ONAP, is helping to take cost out of vital network elements and paving the way for the network slicing that will be fundamental to 5G.
Wolter weighed in: “There’s gains in a couple of ways. Part of it is economics. I can use off-the-shelf hardware with specialized software to create the function that I’m looking for. This allows me to bring in specific algorithms. It allows us to grow the performance of the system at more of a software rate as opposed to integrated hardware-software development, which tends to take much longer. As we think about something like ONAP and software-defined networks, and we move towards the edge and we do edge computing, there are probably different control loops you’ll have to think about. Being able to do network slicing that allows us to efficiently address not a sector as a whole or a base station as a whole…but be able to take that right down to the user, all of those things are going to be really important and, you know, much more than simply cost advantages. We’re in the very early days of this technology in terms of the maturity of the hardware and software that’s out there. There’s going to be a lot of maturity and gain in performance based just on the standards that are in play right now. There’s only so much that can be initially created and trialed. As we move on, antenna techniques are going to become better. Different types of physical layer techniques can be made better. This is going to be a long process. I think a lot of it is continued innovation around what’s there. There are also some more fundamental things we’re looking at.”
Specific to the 5G fixed wireless access enterprise and consumer trials ongoing in Austin, Wolter discussed millimeter wave propagation findings from the 28 GHz testing.
On signal loss due to rain, Wolter said: “In the friendly user trial that we are running right now, we have not seen a significant rain impact, but these ranges are pretty short. We also have set up in another testbed…about 30-odd point-to-point links in 28 GHz and 39 GHz…that are in different types of foliage levels, terrain and so forth. We are recording the data from those 24/7, so we will see the impact of rain, snow, ice, all of these kinds of things. We’ll get a better idea of what kind of impact we will expect. So far it has not been significant.”
Other trial findings include potentially better transmission characteristics from a bounced signal than from a direct point-to-point signal, a discovery process that lends itself to automation.
“That’s part of what we’re discussing with our vendors in terms of tool design and equipment design,” Wolter said. “It depends on the field of regard of the antenna. They can automatically find the best beam, but there may be beams bouncing outside of that field of regard. That’s going to end up either a tool we might develop and use for installation that automates that process…or an installation process where you simply move it through a certain range of azimuth and see which one gives you the best signal. We don’t really know how that’s going to work out just yet, but I’d expect that to be much more automated than it is.”
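The azimuth sweep Wolter describes amounts to a simple search: step the antenna through a range of pointing angles, record the received signal strength at each, and keep the best. A minimal sketch, where `measure_rssi` is a hypothetical stand-in for the radio’s signal-strength readout (not any specific vendor API):

```python
# Sketch of an automated azimuth sweep for beam alignment.
# measure_rssi(angle) is a hypothetical callback that returns the
# received signal strength (in dBm) with the antenna pointed at `angle`.

def sweep_azimuth(measure_rssi, start_deg=0.0, stop_deg=360.0, step_deg=5.0):
    """Step through azimuth angles; return (best_angle, best_rssi)."""
    best_angle, best_rssi = None, float("-inf")
    angle = start_deg
    while angle < stop_deg:
        rssi = measure_rssi(angle)  # one measurement per pointing angle
        if rssi > best_rssi:
            best_angle, best_rssi = angle, rssi
        angle += step_deg
    return best_angle, best_rssi

if __name__ == "__main__":
    # Toy channel whose strongest path (say, a reflected "bounce") is at 130 degrees.
    fake_channel = lambda az: -80.0 - abs(az - 130.0) * 0.5
    angle, rssi = sweep_azimuth(fake_channel)
    print(angle, rssi)  # the sweep lands on the 130-degree bounce path
```

In practice the interesting case is exactly the one Wolter raises: the peak the sweep finds may be a reflection rather than the line-of-sight path, which is why the search covers the full azimuth range instead of trusting the geometric bearing to the base station.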
In response to questions regarding latency findings from ongoing field work, Wolter said it “depends a bit on the particular vendor’s equipment design, but we have certainly seen sub-10 millisecond latency over the air and sub-20 millisecond end-to-end. It depends a lot on where the endpoint is in the network. From what we have seen, clearly the latency characteristics are going to be there…but we always have to remember it’s not just the radio we have to be concerned about, it’s the network as well. It’s got to be a full system design.”