Quality data, digital twins and AI/ML will all help operators enable end-to-end automation in a multi-cloud environment
Managing the complex components of a multi-cloud environment is proving to be a monumental challenge for network operators, especially when coupled with the growing need for AI- and ML-driven automation techniques to streamline operations. However, as Viavi Solutions’ Regional CTO Chris Murphy told attendees at the Telco Cloud and Edge Forum, it is more than worth it for operators to overcome these hurdles, because the opportunities around end-to-end automation in a multi-cloud environment are “wealthy.”
Murphy was joined by BT Group’s Managing Director of Research and Network Strategy Gabriela Styf Sjöman, who provided the telco perspective. Automation is nothing new, she said, but the operator is now starting to “expand” its use of the capability across the “telco cloud environment” in a way that is more “horizontal.”
“We’re doing … more horizontal automation, where we predominantly are exploring all of these opportunities and … us[ing] AI for performance optimization, power tuning, assurance,” she continued. “We’re doing quite a lot around anomaly detection, and we’re using this across all our domains, both in the fix[ed] and the mobile network today.”
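The anomaly detection she describes can be as simple as flagging KPI samples that drift far from a rolling baseline. Below is a minimal sketch in Python; the KPI, window size and threshold are illustrative assumptions, not a description of BT’s actual tooling.

```python
# Minimal rolling z-score anomaly detector for a network KPI stream.
# The KPI (cell throughput) and all values are hypothetical.
from collections import deque

class RollingZScoreDetector:
    """Flags samples that deviate sharply from a rolling baseline."""

    def __init__(self, window: int = 60, threshold: float = 3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Return True if `value` is anomalous versus the recent window."""
        anomalous = False
        if len(self.samples) >= 10:  # need a minimal baseline first
            mean = sum(self.samples) / len(self.samples)
            var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
            std = var ** 0.5
            anomalous = std > 0 and abs(value - mean) / std > self.threshold
        self.samples.append(value)
        return anomalous

# Example: watch a cell's throughput KPI (made-up numbers)
detector = RollingZScoreDetector()
for throughput_mbps in [98, 101, 99, 102, 100, 97, 103, 100, 99, 101, 100, 12]:
    if detector.observe(throughput_mbps):
        print(f"anomaly: throughput dropped to {throughput_mbps} Mbps")
```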
Styf Sjöman was adamant that a cloud-native, decoupled network is “critical,” particularly because it is the only way to enable what she called “auto-healing.”
“Today in the telco world, even when it’s cloudified, many of our services are hardcore integrated down to the infrastructure… Even when we cloudify, we still have these silo stacks. And then you cannot have the network doing it by themselves … [T]o be able to do … this auto-healing … you need [to be] truly cloud-native. Even the applications need to be cloud-native, the CNFs need to be truly cloud-native,” she argued.
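The auto-healing she points to is, at its core, a reconciliation loop: a controller compares the observed state of the network functions against the desired state and repairs any drift. The toy Python sketch below illustrates only the pattern; the CNF names, replica counts and simulated failures are all hypothetical, and in a real deployment this is the job of a cloud-native platform such as Kubernetes rather than hand-rolled code.

```python
# Toy reconciliation loop illustrating the "auto-healing" idea: observe
# the live state, compare it to the desired state, and repair drift.
# CNF names, replica counts and the failure simulation are invented.
import random
import time

DESIRED = {"upf": 3, "amf": 2, "smf": 2}   # CNF -> desired replica count
observed = dict(DESIRED)                    # simulated live state

def get_healthy_replicas(cnf: str) -> int:
    # Simulate occasional instance failure; a real controller would
    # query the orchestrator's health probes instead.
    if random.random() < 0.2:
        observed[cnf] = max(0, observed[cnf] - 1)
    return observed[cnf]

def restart_replica(cnf: str) -> None:
    # A real controller would ask the platform to reschedule the CNF.
    observed[cnf] += 1
    print(f"healed {cnf}: restarted one replica")

def reconcile_once() -> None:
    """Compare observed state to desired state and repair any drift."""
    for cnf, want in DESIRED.items():
        have = get_healthy_replicas(cnf)
        for _ in range(want - have):
            restart_replica(cnf)

for _ in range(5):          # a real control loop would run indefinitely
    reconcile_once()
    time.sleep(1)
```

This is also why the decoupling she insists on matters: a platform can only restart what it can observe and reschedule independently of everything else in the stack.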
For Murphy, digital twins are also key to enabling end-to-end automation in a multi-cloud environment: “Digital twin is emerging as a very powerful enabler for this sort of thing because it allows you to model your network, not just simulate it, but represent it as something which is a proxy for reality, which means you can run what-if scenarios, you can understand where your weaknesses [are] in your network,” he said.
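A digital twin in this sense is a model of the network that can be interrogated offline. The toy sketch below shows the idea in miniature: a graph stands in for the topology, and a what-if scenario removes a link to see which sites would be cut off from the core. The topology and site names are invented for illustration.

```python
# Toy "digital twin" sketch: a graph model of a network on which
# what-if scenarios (here, single-link failures) can be run without
# touching production. Topology and site names are invented.

TOPOLOGY = {   # adjacency list: site -> connected sites
    "core": ["metro-1", "metro-2"],
    "metro-1": ["core", "edge-a", "edge-b"],
    "metro-2": ["core", "edge-b"],
    "edge-a": ["metro-1"],
    "edge-b": ["metro-1", "metro-2"],
}

def reachable_from(topology, start):
    """Iterative graph traversal over the model of the network."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for neighbour in topology.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return seen

def what_if_link_fails(topology, a, b):
    """Return the sites cut off from the core if link a<->b fails."""
    degraded = {n: [m for m in peers if (n, m) not in {(a, b), (b, a)}]
                for n, peers in topology.items()}
    return set(topology) - reachable_from(degraded, "core")

for a, b in [("metro-1", "edge-a"), ("metro-1", "edge-b")]:
    cut_off = what_if_link_fails(TOPOLOGY, a, b)
    print(f"if {a}<->{b} fails: {'isolated: ' + str(cut_off) if cut_off else 'no sites isolated'}")
```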
Don’t move the data!
Another important consideration in this discussion, according to both experts, is data: where to put it, how to get it and what to do with it. “Not all data is the same,” noted Murphy, adding that operators must therefore be “careful that data is collected correctly,” as it can be subject to human error.
Styf Sjöman also warned against moving data. “Data should not be moved around. Instead, we have to be able to kind of abstract that data and create a layer, democratize the data, but we feel that moving around data and putting it somewhere is not the way to go,” she explained further. “It’s very costly, probably doesn’t add a lot of value, but then that’s where we are, and I think [it’s] still up for discussion.”
In general, Murphy agreed with this sentiment, commenting that data will be “very, very location specific” in most circumstances and so it’s “most relevant to where it’s being generated,” suggesting that we shouldn’t be “hauling it around.”
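A common way to avoid hauling data around is to reduce it where it is generated and move only small summaries, leaving the abstraction layer Styf Sjöman mentions to work on those. The Python sketch below illustrates the pattern; the site names and latency figures are invented, and this is not a description of either company’s architecture.

```python
# Sketch of the "don't haul the data around" pattern: raw measurements
# stay at the site where they are generated, and only small aggregates
# cross the network. Site names and metrics are illustrative only.
import statistics

def summarise_locally(raw_samples: list[float]) -> dict:
    """Runs at the edge site; the raw samples never leave it."""
    return {
        "count": len(raw_samples),
        "mean": statistics.fmean(raw_samples),
        "p95": sorted(raw_samples)[int(0.95 * (len(raw_samples) - 1))],
    }

# Each site reduces its raw latency samples to a handful of numbers...
site_summaries = {
    "edge-a": summarise_locally([10.2, 11.1, 9.8, 30.5, 10.0]),
    "edge-b": summarise_locally([12.4, 12.9, 11.7, 12.1, 13.0]),
}

# ...and a central layer works only with the summaries.
total = sum(s["count"] for s in site_summaries.values())
global_mean = sum(s["mean"] * s["count"] for s in site_summaries.values()) / total
print(f"global mean latency: {global_mean:.1f} ms across {total} samples")
```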
“Because we need to achieve the end-to-end intent-driven automation and autonomous networks … we do need models to help us to drive the AI to do that because we can’t do it manually, particularly if we’ve got infinite network slices … and we do need the quality of data,” stated Murphy. He added, though, that the industry must approach data generation and analytics, and the use of emerging technologies, with caution.