
How are network testing and monitoring changing in modern networks?

(Image: Deutsche Telekom)

‘The ultimate test is, does the user get a good experience?’

5G, Open RAN, millimeter-wave and midband spectrum—networks are evolving faster than ever. Operators must ensure that these rapidly changing networks work as expected and give human users and connected devices the level of experience and performance they need. How does testing keep pace?

In a recent conversation during the Test and Measurement Forum, that question was tackled by Ian Fogg, VP of analysis for Opensignal, and Jon Van-Orden, head of engineering for NEC’s 5G Center of Excellence.

Right now, 5G testing is by and large focused on the new features and performance of 5G; in time, the main emphasis will shift more toward 5G-specific services, according to Van-Orden. “A lot of the work that we’re doing at the moment is really an extension of what we’ve been doing before,” Van-Orden said. “But … there’s going to be new services coming in. So things like massive machine type communications, things like the internet of things, more edge compute activities going on, with ultra-reliable, low latency communication (URLLC) happening. [These] are going to require new ways of not only testing the solutions, but also monitoring and measuring how that is actually really performing in the network. So that’s how I see the things evolving over the next few years, is really more focus on new services.”

Van-Orden said that NEC is spending a lot of time testing the performance of Open RAN 5G solutions, particularly 5G radio features, as well as cloud-related testing.

“We’re seeing the network changing from being a monolithic box you deploy on-site, to now being something which is split potentially across multiple locations—where part of it may be running in the cloud, part of it may be running actually on the site. And that presents in itself its own challenges that we have to test,” Van-Orden said. “There are new interfaces that have to be verified, there are new aspects that have to be looked at.”

Fogg ticked off a number of complicating factors for current testing: devices that must connect simultaneously to multiple frequency bands across a wider range of airwaves and, in general, the multiplying number of features and capabilities available to carriers across both the radio access network and the 5G cloud core. “There’s lots of complexity,” Fogg said. “And that means that you need to have multiple ways to measure what’s happening. … You need ways of comparing not just how your network is doing to make sure it’s working in the way you think it is, which is where an outside-in methodology is very useful, but also to compare against your competition.” After all, he offered, there is a direct relationship between the network quality that people experience and their likelihood to drop their carrier and try another. “The ultimate test is, does the user get a good experience?” Fogg said. “Does [the network] actually do the right thing that you think it’s doing?”
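To make the comparison idea concrete, here is a minimal Python sketch of an outside-in, apples-to-apples check; the operator names and speed samples are invented for illustration and are not Opensignal data or methodology.

```python
from statistics import quantiles

# Hypothetical "outside-in" samples: download speeds (Mbps) measured on
# real users' devices for two competing operators. Values are illustrative.
operator_a = [12, 35, 48, 60, 75, 90, 110, 140, 180, 220]
operator_b = [8, 20, 33, 45, 58, 70, 85, 100, 130, 160]

# Comparing percentiles rather than a single average shows where each
# network is strong or weak across its whole user base.
for name, speeds in (("A", operator_a), ("B", operator_b)):
    p25, p50, p75 = quantiles(speeds, n=4)
    print(f"Operator {name}: p25={p25:.0f}  median={p50:.0f}  p75={p75:.0f} Mbps")
```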

Meanwhile, the demands on networks to provide different types of experiences for both humans and IoT devices are changing as well: Massive machine-type communications, fixed wireless access (FWA) and traditional mobile users—each with very different bandwidth and latency needs—are all increasingly served from the same network. “The operator is still using the same spectrum, the same technology to deliver those services, but now he’s got different requirements and different end-users requiring different things from the same infrastructure. That becomes incredibly complicated to manage,” Van-Orden said, going on to note that a decision to optimize one particular service can have big impacts on how well another performs.

FWA has indeed been an early 5G success for customer adoption and revenues. Fogg pointed out that for FWA offerings to be competitive, they can’t just be good compared to other cellular services. “For a fixed wireless offering to be competitive in the market, it needs to be good enough against completely different network technologies” like cable/hybrid fiber-coaxial networks running DOCSIS technology, and fiber-to-the-premises networks. Apples-to-apples comparisons of performance are crucial to making that case to consumers.

Observability is key, Van-Orden said at one point. But how do you balance breadth and depth of testing, when the network can provide ever-more data that may or may not be actionable or relevant to performance and end-user experience? “Clearly you want to go broad, but you also want to go deep. If you try and go broad and deep, you just end up swimming in large amounts of data and not being able to understand what’s happening,” Fogg said. “If you go super deep without having the breadth, you are going to miss things.” He gave the example of adaptive antennas that can alter how they work depending on the usage pattern in the cell. “That means that you really need mass measurements in that cell to really understand what’s happening in the cell over hours of day. You can’t just go in there, walk along once in a day, see what’s happening and then go away. That’s not going to tell you what the deal is,” Fogg said. “That might have told you years ago when the network was a lot simpler, but now it doesn’t really help you as much as it did.”
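As a rough illustration of Fogg's point about mass measurement, the Python sketch below groups crowdsourced speed samples by cell and hour of day; the cell IDs, sample counts and throughput figures are invented, but the structure shows why time-of-day behavior, such as an adaptive antenna reshaping its beams under load, would surface here while a single drive-by reading would miss it.

```python
from collections import defaultdict
from statistics import median

# Hypothetical crowdsourced samples: (cell_id, hour_of_day, downlink Mbps).
# All values are illustrative only.
samples = [
    ("cell-42", 9, 210.0), ("cell-42", 9, 185.5), ("cell-42", 13, 95.0),
    ("cell-42", 13, 88.2), ("cell-42", 18, 40.1), ("cell-42", 18, 35.7),
]

# Group throughput readings by (cell, hour) so time-of-day swings are visible.
by_cell_hour = defaultdict(list)
for cell_id, hour, mbps in samples:
    by_cell_hour[(cell_id, hour)].append(mbps)

# A single morning drive test would report ~200 Mbps for this cell; the
# hourly medians reveal how it behaves as usage patterns change all day.
for (cell_id, hour), readings in sorted(by_cell_hour.items()):
    print(f"{cell_id} @ {hour:02d}:00 -> median {median(readings):.1f} Mbps "
          f"({len(readings)} samples)")
```

The point is the grouping rather than the statistic: any per-cell metric read only once per day would miss the swing entirely.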

There was some hope, at least, that both the maturation of various technologies and automation will help the industry navigate the flood of data. “New features have to be tested to a deeper level than the preexisting features. So you’ve always got this trade-off between regression testing of existing features and the depth of the new features,” Van-Orden explained. “And that’ll be the same for operators introducing a new feature into their network as well. I think the key to it [is] automation.” He cited the example of the RAN Intelligent Controller, which opens the possibility of tweaking and adjusting site behavior to get the best out of the network—but it requires automation, and algorithms trained on relevant data sets, which may or may not carry over from one network to another. (And, as Fogg pointed out, such algorithms still need to be monitored in the real world to make sure they’re doing what an operator wants them to be doing.) “I think the collection of data, the automated analysis of that data and then using machine learning to make decisions based on that data is going to be the way to try to address some of these questions about how do you get the best without just drowning in data,” Van-Orden said.
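As a loose sketch of the collect-analyze-decide loop Van-Orden describes, the Python below flags KPI readings that stray far from a rolling baseline; the simple z-score check merely stands in for the trained machine-learning model he has in mind, and every name and threshold is hypothetical.

```python
from statistics import mean, stdev

def flag_anomalies(kpi_series, window=24, z_threshold=3.0):
    """Flag KPI readings that deviate sharply from a rolling baseline.

    A deliberately simple stand-in for the machine-learning step: the
    baseline is the mean/stdev of the previous `window` readings, and
    anything more than `z_threshold` deviations away is flagged for
    automated follow-up or human review.
    """
    flags = []
    for i in range(window, len(kpi_series)):
        baseline = kpi_series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(kpi_series[i] - mu) / sigma > z_threshold:
            flags.append(i)
    return flags

# Hypothetical hourly cell throughput (Mbps): stable, then a sharp drop.
series = [100.0 + (i % 5) for i in range(48)] + [30.0, 28.0]
print(flag_anomalies(series))  # prints the indices of the suspect readings
```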

For on-demand video access to Test and Measurement Forum sessions, go here.
