
How is testing and assurance keeping up with evolving networks?

Cloud, Open RAN and increased complexity mean visibility is more important than ever

Telecom networks are facing a slew of simultaneous changes that are increasing complexity, from the deployment of 5G and the addition of new, higher-frequency spectrum bands to the cloudification of the network and the exploration of Open Radio Access Networks.

In a conversation during RCR Live: Telco Reinvention in London, the head of Spirent Communications’ 5G market strategy, Stephen Douglas, and Ian Fogg, VP of analysis at Opensignal, discussed the implications of rapidly changing networks on how those networks are tested, assured and optimized.

For his part, Douglas said that he sees the emergence of ubiquitous cloud-based networks as the biggest change for the service providers that Spirent works with.

“Not only are they bringing in a new capability which maybe, in-house, they don’t necessarily have the skill set for, it’s [also] changing the operational or the life-cycle paradigm in terms of how they get the software releases” through multiple vendors, Douglas said. As a result, he continued, operators are bringing in CI/CD and continuous testing. “You’re starting to see that silo between the lab and the live environment now starting to become a lot more blurred, because they’re seeing this huge volume and velocity of releases coming from the different vendors. It’s not just about validating them for conformance; they now need to make sure they interoperate,” Douglas said. “This is happening on such a scale that if they don’t put something in place that’s automated, that’s continuous, then they won’t benefit from any of the agility of it. I think that’s one of the big change areas that we’re starting to see.” Operators need to change life-cycle and operational processes, he said, and that is changing how they approach testing and integration.
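
To make the idea concrete, here is a minimal sketch of what such an automated release gate could look like. The vendor names, version numbers and test functions below are all hypothetical stubs invented for illustration; they do not describe Spirent's tooling or any real operator pipeline.

```python
# Hypothetical versions of the other vendors' software already deployed.
DEPLOYED = {"ran_vendor": "3.1", "core_vendor": "7.4"}

def passes_conformance(vendor: str, version: str) -> bool:
    # Placeholder for a standards-conformance test suite.
    return True

def interoperates(vendor: str, version: str, peers: dict[str, str]) -> bool:
    # Placeholder for pairwise interoperability tests against peer versions.
    return True

def gate_release(vendor: str, version: str) -> bool:
    """Runs automatically on every incoming vendor release: conformance
    first, then interop against everything else currently deployed."""
    if not passes_conformance(vendor, version):
        return False
    peers = {name: ver for name, ver in DEPLOYED.items() if name != vendor}
    return interoperates(vendor, version, peers)

if gate_release("ran_vendor", "3.2"):
    print("release promoted toward the live environment")
else:
    print("release rejected: conformance or interop failure")
```

The point of the gate is the one Douglas makes: with releases arriving at this volume and velocity, only an automated, continuous check at the lab/live boundary keeps interoperability from becoming a manual bottleneck.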

Fogg pointed to increases in the complexity of the radio frequency demands placed on chipsets and devices as having a substantial impact on testing. “It wasn’t that long ago that a device, a UE [user equipment], connected to one frequency band at a time,” he said. “Now they connect to three, four, five—and the types of spectrum are much more varied than they were, and that adds complexity.” The addition of a “vast amount” of higher-capacity spectrum “drives higher speeds, higher throughput [and] better video experience—which means there’s risk of a competitive deficit from one operator to another, if they aren’t on the ball and pushing forward,” Fogg noted. That means network operators not only need to know how their own networks are performing, and need sophisticated tools to provide that view; they also need to know how they match up to competitors in terms of optimization, he said. “Maybe there are 10 locations where you think the network is okay, but actually [in] five of them, your competitor is considerably better than you,” he offered. “That obviously gives you then a prioritization of where you need to address that competitive deficit.”
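
As a rough illustration of the prioritization Fogg describes, the sketch below ranks locations by competitive deficit. The location names, throughput figures and 20% cutoff are invented for the example, not drawn from Opensignal's methodology.

```python
# Hypothetical per-location median throughput (Mbps) for your own network
# and a competitor's, as a benchmarking service might report them.
own_mbps = {"loc_a": 48, "loc_b": 61, "loc_c": 35, "loc_d": 52, "loc_e": 44}
rival_mbps = {"loc_a": 50, "loc_b": 90, "loc_c": 70, "loc_d": 51, "loc_e": 66}

DEFICIT_THRESHOLD = 0.20  # assumed cutoff: flag if the rival is >20% faster

# Relative deficit at each location.
deficits = {
    loc: (rival_mbps[loc] - own_mbps[loc]) / own_mbps[loc]
    for loc in own_mbps
}

# Locations where the competitor is "considerably better", worst first --
# a ready-made optimization priority list.
priorities = sorted(
    ((loc, d) for loc, d in deficits.items() if d > DEFICIT_THRESHOLD),
    key=lambda pair: pair[1],
    reverse=True,
)

for loc, d in priorities:
    print(f"{loc}: competitor is {d:.0%} faster -- prioritize optimization")
```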

Douglas said that one of the changes he has seen with 5G is large-scale enablement of active remote testing from the field: static devices deployed in the network that continuously inject synthetic traffic. “Most of the North American service providers now have this deployed at scale, where they’re getting that visibility, repeatably [and] consistently, from different markets all of the time through their mobile core,” he said. That approach has been used in transport networks for years, Douglas said, but has now moved to the RAN, where it is used to rapidly and proactively identify issues or triage quality problems.
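
A bare-bones sketch of that kind of static probe might look like the following, assuming a hypothetical test endpoint and cadence, and using TCP connect time as a stand-in for the much richer KPIs real probes inject and measure.

```python
import socket
import time

TARGET = ("example.com", 443)  # hypothetical test endpoint
INTERVAL_S = 60                # assumed probe cadence

def tcp_connect_ms(host: str, port: int, timeout: float = 5.0) -> float:
    """Return TCP connect time in milliseconds as a crude
    reachability/latency measurement."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.monotonic() - start) * 1000

# A static field probe runs indefinitely, injecting synthetic traffic
# on a fixed schedule so results are repeatable and comparable.
while True:
    try:
        latency = tcp_connect_ms(*TARGET)
        print(f"probe ok: connect={latency:.1f} ms")  # would be shipped to an assurance backend
    except OSError as exc:
        print(f"probe failed: {exc}")  # a failure here can trigger triage immediately
    time.sleep(INTERVAL_S)
```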

Network assurance is also having to keep up with telcos’ desire to expand how they serve the enterprise segment, whether that’s through private networks, slices or guaranteed service-level agreements. The most important factor for an enterprise may not be hitting a specific speed or latency KPI, but delivering a strictly bounded service range that is consistent. “When you think about many applications, you don’t need to have a really fast average speed or really high peak speed,” Fogg said. “What you need is a level of consistent quality that is the floor that you can be sure is going to be available most of the time. The trick is, you need that in every location, at every hour of the day; at busy times as well as at quiet times; in tricky locations, a long way from the cell tower as well as close to the cell tower.”
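
The distinction Fogg draws between a healthy average and a dependable floor can be shown with a short sketch. The throughput samples and the fifth-percentile choice below are illustrative assumptions, not figures from the discussion.

```python
def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile, p in [0, 100]."""
    ordered = sorted(samples)
    rank = max(0, min(len(ordered) - 1, round(p / 100 * (len(ordered) - 1))))
    return ordered[rank]

# Hypothetical throughput samples (Mbps) from one service, mixing busy and
# quiet hours, near-cell and cell-edge locations.
samples = [12.0, 85.0, 40.0, 9.5, 110.0, 33.0, 15.2, 72.0, 11.1, 64.0]

avg = sum(samples) / len(samples)
floor = percentile(samples, 5)  # the level available roughly 95% of the time

print(f"average: {avg:.1f} Mbps")                 # looks healthy
print(f"5th-percentile floor: {floor:.1f} Mbps")  # what an SLA must actually bound
```

The average here is respectable while the floor is an order of magnitude lower, which is exactly why an enterprise SLA built on averages can still deliver a poor experience at the cell edge or at busy hour.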

The proliferating sources of network data also pose a challenge in and of themselves. “I think a lot of people think data is truth, and data is just what you measure,” Fogg said. “Getting meaning from data requires understanding of the source of data and how to interpret the data.” The data sets telcos can use have their own increasing complexity, in other words.

“We’ve been a great industry for years of collecting lots of data, having lovely big data lakes and not knowing exactly what to do with it,” Douglas said. “It is about what you do with it and what you do rapidly with it. That’s always been the goal and the challenge.”

For years, he continued, the industry has passively monitored networks, then spent hours or days determining whether an anomaly was actually impacting services or performance, waited for customer complaints and sent an engineer to investigate. Why wait and watch, he challenged, when the technology exists to trigger a remote test that can tell you whether it’s a real issue or not? “We’ve got to be smarter [about] how we think about utilizing data,” he said. Data has to be usable and the industry has to be smarter about using it, he went on, “otherwise, we’ll just continuously build huge data lakes, and big storage companies will do lots of great business with that—but that’s not going to make an efficient network.”
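
A toy version of that trigger-on-anomaly workflow, with an invented threshold and a stubbed-out remote test rather than any vendor's API, might look like this:

```python
LATENCY_BASELINE_MS = 30.0
ANOMALY_FACTOR = 2.0  # assumed rule: flag if latency doubles versus baseline

def run_remote_test(site: str) -> bool:
    """Stub for triggering an active probe at the affected site;
    returns True if the test confirms a real service impact."""
    print(f"triggering remote test at {site}...")
    return True  # placeholder result

def on_metric(site: str, latency_ms: float) -> None:
    """Called for each passive-monitoring sample; confirms anomalies
    with an active test instead of waiting for customer complaints."""
    if latency_ms > LATENCY_BASELINE_MS * ANOMALY_FACTOR:
        if run_remote_test(site):
            print(f"{site}: confirmed impact -- open a ticket now")
        else:
            print(f"{site}: anomaly not service-affecting -- no truck roll")

on_metric("cell_042", latency_ms=75.0)  # example passive-monitoring sample
```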

ABOUT AUTHOR

Kelly Hill
Kelly reports on network test and measurement, as well as the use of big data and analytics. She first covered the wireless industry for RCR Wireless News in 2005, focusing on carriers and mobile virtual network operators, then took a few years’ hiatus and returned to RCR Wireless News to write about heterogeneous networks and network infrastructure. Kelly is an Ohio native with a master’s degree in journalism from the University of California, Berkeley, where she focused on science writing and multimedia. She has written for the San Francisco Chronicle, The Oregonian and The Canton Repository. Follow her on Twitter: @khillrcr