Alert | For more about AI in telecoms from Verizon Business, don’t miss the RCR webinar on April 8 – with panellists also from ABI Research, Red Hat, Spirent Communications, and TUPL. Sign up here.
In sum – what you need to know
AI network evolution – Verizon is leveraging AI in three key areas – customer support, product personalization, and ecosystem building – to enhance network services, improve customer interactions, and automate operations for internal teams and external clients.
AI partnerships for growth – Verizon is forging partnerships around its infrastructure with major players like Meta, Nvidia, and Google Cloud, enabling businesses to deploy distributed AI workloads at scale and to optimise AI service delivery on fibre and 5G.
AI backbone networks – Verizon says its infrastructure – including networks and data centers – is critical for AI services. It aims to support the AI ecosystem with programmable networks and compute real estate, and to position itself as a key player in the sector.
There are three “buckets” for AI at Verizon, says Verizon – as there are, by extension, for most telcos. “Each is further along in the company than the next – as far as maturity goes,” explains Steve Szabo, vice president for technology enablement in the technology solutions division at Verizon Business. Broadly, these cover fairly new AI advances layered onto fairly old ML practices, mostly in internal functions like customer care and network maintenance. These will advance further with generative AI, and somewhere down the line with agentic AI, where service interactions for both external customers and internal engineers are automated and autonomous, to an extent – and interlinked, as well.
Otherwise, Verizon Business is using AI to expose more dynamic product service features to its customers, for transparency and management of their airtime and devices, via application programming interfaces (APIs) and as-a-service portals. This work is also developing to include inherent network features in its standalone 5G (5G SA) infrastructure, as it is upgraded and advanced across the US – so services are scalable and configurable for enterprises “at the click of a button”. And just like with its customer support functions, generative and agentic AI will, over time, make its network services more responsive and powerful.

But the big AI play for operators right now is about how their network assets will map into this global infrastructure build-out to support as-yet unknowable AI services. Verizon is playing its part to construct the new AI ecosystem, says Szabo. In January, it announced a new suite of solutions and products called AI Connect (Verizon AI Connect) to enable businesses to deploy AI workloads at scale. In tandem, it signed a deal with US cloud hosting outfit Vultr to lease space in its network infrastructure for Vultr to expand its compute footprint and GPU-as-a-service offer.
It has also announced an expansion to a deal with Meta “across network infrastructure… to build the AI ecosystem”, in some fashion. There is also new work with Nvidia to “reimagine” how GPU-based edge platforms will integrate into its private 5G deployments, as well as a project with Google Cloud around new AI solutions for network maintenance and anomaly detection. This is what the third “bucket” contains. Szabo says: “The strategy is clear – to partner with customers like Vultr, or tier-ones like Meta, so they handle the GPU-level stuff and we get them to where they need to go; and as our network programmability advances, for them to use more network services.”
This discussion about how Verizon Business is using AI to serve enterprise customers is set out below, a bucket at a time. But first, a quick word from Szabo, which sets out his company’s position – and tells a story for the whole operator community about how to get a grip on AI to improve internal operations and also to play in the external ecosystem.
Szabo says: “These things will happen over a period of time. Everybody has got work to do; some are more advanced than others. But service providers have built all this stuff already. Our challenge is getting it in a way that the new world is used to consuming – in a more digital-fingertips way. That is the challenge for the telcos: how to take decades of network infrastructure and automate and configure it so it is available at scale. That is where we are making targeted investments. But from a horizontal perspective, I mean, who better to be the hub and the highway – than the hub and the highway? These partnerships are great for both sides. It allows them to tap into what we have, and it forces us to level up, and modernize.”
AI in customer service support
The first of these “buckets” bundles back-end AI to help with customer support functions – “the stuff you’re used to hearing about in this space,” says Szabo. The goal, as always with automation, is to improve operational efficiencies; the context, as always with service, is to reduce “friction” along the way – in this case, by placating disgruntled customers, calling in because they have issues with networks, systems, devices, applications, and so on. “Any friction we can take out of the system is helpful. And these types of things are really important,” he says.
The big change, recent and ongoing, with AI in telecoms-based business support systems (BSS), covering all the software that manages customer-facing activities, is to equip the much-maligned interactive voice response (IVR) system with new brain power – so calls are not just directed to the right department, but to the right person within the department. Szabo explains: “What we’ve been able to do [with AI] is to auto-route their problem to the rep with the most experience in that area. It is matchmaking, if you will – between a service issue and a service rep.”
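Szabo does not detail the mechanics, but the “matchmaking” he describes reduces to a simple scoring problem: given a history of resolved calls, route a new issue to the rep with the most experience in that category. The sketch below is purely illustrative – the data shapes, category names, and the `route_call` function are invented, not Verizon’s implementation.

```python
from collections import Counter

# Hypothetical history of resolved service calls: (issue_category, rep_id)
resolved_calls = [
    ("5g_coverage", "rep_a"), ("billing", "rep_b"),
    ("5g_coverage", "rep_a"), ("device_config", "rep_c"),
    ("5g_coverage", "rep_c"), ("billing", "rep_b"),
]

def route_call(issue_category, history):
    """Return the rep who has resolved the most calls in this category."""
    experience = Counter(rep for cat, rep in history if cat == issue_category)
    if not experience:
        return None  # no match: fall back to the generic queue
    return experience.most_common(1)[0][0]

print(route_call("5g_coverage", resolved_calls))  # rep_a (two resolutions vs one)
```

In practice the scoring would weigh far more than raw counts – resolution times, customer feedback, rep availability – but the issue-to-rep mapping is the core of the idea.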
The AI crawls an expanding database of service calls, which records the nature of each inquiry, along with the source of the problem and the author of the solution. “It is not secret stuff; but it has helped tremendously,” says Szabo. Verizon is also using AI to sift information – product manuals, software instructions, network alarms, adjacent BSS and OSS systems – so staff can solve issues faster. Previously, even with the matchmaking, it was down to staff to know and find information in the back-end system (“on five or six screens”), and to further liaise with domain experts.
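Sifting tens of thousands of pages so a rep gets one answer instead of five or six screens is, at heart, a retrieval problem. A minimal keyword-overlap retriever – with invented documents, and much cruder than whatever Verizon actually runs – looks like this:

```python
import re

def score(query, document):
    """Crude relevance: fraction of query words that appear in the document."""
    q_words = set(re.findall(r"[a-z0-9]+", query.lower()))
    d_words = set(re.findall(r"[a-z0-9]+", document.lower()))
    return len(q_words & d_words) / len(q_words)

def best_passage(query, passages):
    """Return the passage most relevant to the rep's query."""
    return max(passages, key=lambda p: score(query, p))

# Hypothetical snippets from product manuals and network-alarm runbooks
passages = [
    "To reset the router, hold the power button for ten seconds.",
    "Alarm code 42 indicates a fibre backhaul fault; escalate to transport.",
    "SIM provisioning errors are resolved in the billing portal.",
]

print(best_passage("router will not reset", passages))  # the router-reset snippet
```

A production system would use semantic embeddings and a generative model to compose the answer, which is where the 90-plus-percent accuracy Szabo cites below comes in.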
Szabo explains: “With AI, we can level-up the rep to quickly access and digest information, and to deliver answers – because the AI can siphon through tens-of-thousands of pages of information very quickly. We are seeing a very high rate, in the 90-plus percentile, in terms of the accuracy of the responses.” So how should we grade the AI in these service enhancements? “It is between early AI and generative AI,” he responds. “We are figuring out how to use agentic AI to support self-help work for customers, where the AI takes steps [by itself, prompted by the customer].”
He adds: “At present, we can give an answer quickly; agentic AI will give customers more power to execute steps and functions – if they get an answer, say, and choose to take an action as a result, without a care agent getting [involved]. Really, the stuff we’re working on now is more about the execution models in the agentic space.”
AI in product personalization
The second area of focus for Verizon Business is the personalization of its product suite to meet rising demand – fast mutating into expectation – from data-minded businesses, increasingly born-digital, for transparency and control over their digital services. “We have a good beat on the types of things to level-up customer [service support]; what we need to do is to extend those capabilities into the product suite, itself. They expect low-touch AI management and personalized products, so they don’t have to call up every time they want to make checks and changes,” says Szabo.
As such the company has been embedding AI into its as-a-service portals so enterprises can understand and troubleshoot the performance of their networks, routers, devices, and apps to bring some two-way dynamism to service management and security, as well as to draw on emerging features in the latest ‘standalone’ version of 5G (5G SA). This is where “lots of time” has gone over the last 24 months, he says.
“We are leveling-up our infrastructure to provide these tools and capabilities and insights, whether that is through APIs, where they can pull [the network] into their ecosystem to use however they want, or through our own management and configuration tools, where they can get insights and control over their devices and networks. Because they don’t just want us to proactively let them know; they want to see it for themselves, and have their own eyes on it. They want complete transparency and visibility, and AI lets us give them that.”
He offers an example of this kind of AI chain of thought: a sudden data spike on a device in the network. The firm is offering “real-time rating and usage with AI”, he explains, where some root-level AI algorithm prompts actions in response to ML traffic alarms in the network. “If a device goes rogue, and constantly pings the network and racks up a huge bill, then maybe it has been hacked, or maybe it is just on the wrong plan. But now they can evaluate it, right away – whereas previously they wouldn’t have the information until they had their bill,” he says.
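The rogue-device example reduces to anomaly detection on per-device usage streams. A back-of-the-envelope version, flagging any device whose current usage sits far above its own recent baseline, might look like this – the thresholds and figures are invented for illustration:

```python
from statistics import mean, stdev

def is_rogue(recent_usage_mb, current_mb, z_threshold=3.0):
    """Flag a device whose current usage is more than z_threshold standard
    deviations above its own recent baseline."""
    mu, sigma = mean(recent_usage_mb), stdev(recent_usage_mb)
    if sigma == 0:
        return current_mb > mu * 2  # flat baseline: flag on a doubling
    return (current_mb - mu) / sigma > z_threshold

baseline = [48, 52, 50, 47, 53, 51, 49]  # MB/day for a well-behaved device
print(is_rogue(baseline, 51))   # False: an ordinary day
print(is_rogue(baseline, 900))  # True: constant pinging - hacked, or on the wrong plan
```

Surfacing this the moment the spike happens, rather than when the bill lands, is the “real-time rating” point Szabo is making.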
So grade that as an AI exercise, as well. Szabo responds: “It is early-stage AI insofar as you can take a lot of information, and deliver visibility and insights [about it]. It is not a one-for-one,” he says, suggesting that its value is more profound, and scalable. The point is that it is the same pattern as with customer support – where Verizon’s back-end systems are more accessible, because AI is simplifying information and rendering insights. In this case, they are literally more accessible, because they are being opened to enterprises via APIs.
It is different from just rule-based big data analytics, which has underpinned most service management platforms until now. Szabo explains: “This correlates [responses across] a variety of data sets – network usage and performance, location management, cyber security and cyber threats – to identify potential issues. The network is built, and the first step is to proactively get tools and insights into the hands of customers – to correlate data as AI. The next step is for customers to use AI to automatically change everything on the network – when they want.”
He adds: “But they don’t have those capabilities yet; those are things we’re investing in now.”
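Correlating “a variety of data sets” to surface an issue can be sketched as joining per-device signals from separate systems and applying a rule across them. Every field, feed, and threshold below is hypothetical – a toy stand-in for the kind of multi-signal correlation Szabo describes:

```python
# Hypothetical per-device records from three separate systems, keyed by device ID
usage = {"dev1": {"mb_today": 900}, "dev2": {"mb_today": 40}}
security = {"dev1": {"threat_score": 0.8}, "dev2": {"threat_score": 0.1}}
location = {"dev1": {"moved_km": 0.0}, "dev2": {"moved_km": 12.0}}

def correlate(device_id):
    """Join the three feeds and apply a toy rule: heavy usage plus a high
    threat score on a stationary device suggests compromise, not travel."""
    u, s, loc = usage[device_id], security[device_id], location[device_id]
    if u["mb_today"] > 500 and s["threat_score"] > 0.5 and loc["moved_km"] < 1:
        return "possible compromise"
    return "ok"

print(correlate("dev1"))  # possible compromise
print(correlate("dev2"))  # ok
```

The point of the correlation is in the join: any one feed alone (heavy usage, or a threat score) is ambiguous; together they disambiguate.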
AI in ecosystem building
Which gets us, quickly, into an interesting discussion about the telco’s role in the broader AI ecosystem, and how to make a virtue of its distributed infrastructure, variously incorporating national and regional data centres, metro-edge multi-access edge compute (MEC) sites, and even cell towers and radio equipment. Plus both its fibre backhaul and cellular access networks, of course – as strung between these properties. This is the third “bucket”, as described by Verizon Business, and it pitches the telco’s talk about AI-as-a-service into this digital gold rush on AI infrastructure.
In the end, it is about the interplay between this industry, building powerful AI networks, and the big beasts of AI, building powerful AI engines, and how they connect to deliver powerful AI services – and where and how business is shared between the lines. “Everyone is doing things in this space, but everyone knows they can’t do everything on their own. So the question is: how does everybody work together without stepping on each other’s toes? That is the issue – that everyone is encroaching on everyone else’s revenue streams,” says Szabo.
“Which just drags out our ability to quickly deliver what the market wants; I mean, that is my opinion”, he adds, and he goes on to highlight the work of various of these protagonists to draw the lines, and collaborate on common goals. He flags his firm’s own work to make its data centre assets available for companies to deploy AI hardware, and its new AI Connect suite of products and solutions to “enable businesses to deploy AI workloads at scale”. Google Cloud and Meta are taking network capacity for AI workloads; Verizon claims a $1 billion sales funnel for its AI Connect offerings.
It has just signed with cloud hosting firm Vultr, which is to expand its compute footprint and GPU-as-a-service offer via Verizon’s connectivity infrastructure in the US, and to “hook directly” into its fibre network. Szabo says: “It can use its experience [with] customers, and we will power everything on the backend to get it to the colos, cloud sites, edges. So we become a crucial component for the transport. We will level-up our programmability so customers can go from 1Gbps to 10Gbps – to 100Gbps, if they need it; they can build the pipes in real time to wherever they want.”
He zooms out: “We bring a lot to the table – if you look at our assets and land; our space, power, and cooling; our networks. We have a ton of traction; our funnel is big in [shared and dedicated] fibre – just to transport these AI workloads everywhere. It is pretty remarkable; the requests are coming our way.” There is an argument to say operators are lucky, to an extent; that AI workloads require distributed cloud systems for purposes of efficiency and performance, and that telecoms networks already go to most places, and distribute compute power along the way.
In some ways, as well, the opportunity for operators to rent space in their networks and data centres to the rest of the AI ecosystem builds on their original vision for digital services to be sprung from MEC infrastructure. Certainly, there are lessons from the MEC era, says Szabo – and also missteps that should not be repeated. “MEC was pretty far ahead of its time. The device ecosystem and other things were just not ready. But we know the edge is super critical, and people now understand the values and the outcomes better. But the use cases are different – MEC versus this.
“If you just look at MEC as an edge-play to get closer to the customer and the outcome, then there are similarities. But how it is commercialised, and the types of players is very different. Our ability to partner will likely involve us sharing our space with other companies. But it is not just about hyperscalers and telcos; folks can do different things with GPUs at the edge… A lot of that stuff is still in play – which is why, for us, the meat and potatoes is the network – and the space, power, and cooling – because you need those things just to activate where we go next.”