
Three benefits of deploying AI at the edge

Running AI workloads at the edge enables better economics, faster decision making and automation

If you look past the hype, past the technological complexity and past the protracted proofs of concept, 5G is all about leveraging a high-bandwidth, low-latency air interface to move data, analyze data and act on data in as close to real time as possible. With that capability in place, enterprises of all stripes could realize every possible operational efficiency, automate what can be automated, and see the benefits on their balance sheets. To hasten decision making and automate where possible, AI has a clear role to play. And, given trends in distributed network architectures and the distribution of cloud compute and storage infrastructure, AI residing at the edge, where data is created, brings numerous benefits to enterprise users.

Andrew Keene, head of product management at Volt Active Data, made the case for AI at the edge in a recent webinar hosted by RCR Wireless News. He noted that AI at the edge is not just advantageous but “sometimes critical for the viability of the use case.” But why?

“Some of these use cases generate vast volumes of data, much of which in itself is relatively useless,” Keene explained. “But it all must be processed for the use case to work. Backhaul and transmission costs to the cloud can be prohibitive, but if you can process the data at the edge, and only send valuable consolidated data to the cloud for further processing, that kind of solves the problem.”
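The pattern Keene describes, processing raw data locally and forwarding only a consolidated result, can be sketched in a few lines. The reading format, thresholds and the forward_to_cloud() uplink below are hypothetical, chosen purely to illustrate how much data can stay at the edge.

```python
# Minimal sketch of edge-side filtering and aggregation: process raw readings
# locally and forward only a small consolidated summary to the cloud.
# The reading format, threshold and forward_to_cloud() are hypothetical.
from statistics import mean

def summarize_window(readings, alert_threshold=90.0):
    """Reduce a window of raw sensor readings to one small summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > alert_threshold],  # keep only the interesting points
    }

def forward_to_cloud(summary):
    # Placeholder for the real uplink (for example, an HTTPS POST); kept local here.
    print("uplink:", summary)

# 10,000 raw readings stay at the edge; only one summary record crosses the backhaul.
window = [20.0 + (i % 100) * 0.8 for i in range(10_000)]
forward_to_cloud(summarize_window(window))
```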

Beyond the economic reality of moving data, timescales are critical; Volt Active Data tends to measure things in single-digit milliseconds. “If you could make…decisions much closer, at the edge, to where the events are happening, that again solves the problem of ultra low latency responses.” Another implication here is around data sovereignty and security; enterprises dealing with proprietary or regulated data need to hold that data closely. “So a distributed tiered data platform that can run these [machine learning] models at the edge and only send consolidated, valuable data to the cloud, mitigates many of these issues and actually makes some of the use cases viable that otherwise wouldn’t be.”

Back to that idea of deploying AI at the edge to speed up decision making: beyond accelerating decisions, AI can also improve continuously as it gains access to more and more data. “We are seeing a big and increasing interest in machine learning models…to enhance many different real-time data processes and use cases across a variety of industries to automate continuous model improvements,” Keene said. This lets the enterprise user achieve a better outcome and potentially pass improved outcomes on to their customers.
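To make the “learn from real outcomes” loop concrete in the abstract, the sketch below updates a tiny online logistic model one observed outcome at a time, so its predictions improve as more real results arrive. The features, labels and learning rate are illustrative only, not Volt’s implementation.

```python
# Hedged sketch of continuous model improvement: an online logistic model
# nudged by one labelled outcome at a time. All data here is synthetic.
import math, random

weights = [0.0, 0.0]
bias = 0.0
LR = 0.1

def predict(x):
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))      # probability of the positive outcome

def update(x, outcome):
    """One incremental gradient step using an observed real-world outcome."""
    global bias
    err = predict(x) - outcome
    for i, xi in enumerate(x):
        weights[i] -= LR * err * xi
    bias -= LR * err

random.seed(0)
for _ in range(1000):
    x = [random.random(), random.random()]
    outcome = 1 if x[0] + x[1] > 1.0 else 0   # stand-in for the true observed result
    update(x, outcome)

print(predict([0.9, 0.8]), predict([0.1, 0.1]))  # should trend toward 1 and 0
```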

Keene summarized: “What really makes for a powerful solution is when these machine learning models are not only continually updated to improve their accuracy rating by learning from real outcomes, but when you execute them at the edge on a distributed platform to ensure optimal response times and to reduce the unnecessary backhaul costs of transmitting potentially prohibitively large quantities of data to some centralized cloud-hosted platform.”

Artificial intelligence and edge computing enabling the IoT

Volt’s data management solutions are designed to support a range of real-time applications, Keene said, calling out fraud and threat prevention, hyper-personalization, real-time private network SLAs, traffic management, fleet management, charging and policy, IoT device management, compliance and regulatory reporting, edge-optimized federated decision-making, and active digital twins. But, he noted, “We provide the enabling technology…not the end application,” which is built by the company’s customers and partners. The company’s focus is on “applications that require massive scale, low latency, accuracy and resiliency and are well suited to executing machine learning models at scale.”

Keene gave the example of using 5G in dense urban areas to improve traffic management through proactive routing; this is in contrast to legacy traffic management solutions like timed traffic lights set to vary through peak, off-peak and other static configurations. By introducing 5G and AI-powered processing at the edge, “Smart cities have the ability to take feeds from hundreds of traffic cameras or other endpoints and use machine learning as a form of AI to predict the behavior, spot anomalies as they happen, and to route traffic away from problem areas in real time.”
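As a rough illustration of that pattern, the sketch below compares live per-junction vehicle counts against a learned baseline and flags anomalies that traffic should be routed around. The junction names, counts and threshold are made up for the example.

```python
# Illustrative sketch of edge-side traffic anomaly detection: flag junctions
# whose live vehicle counts deviate sharply from their historical baseline.
from statistics import mean, stdev

baseline = {            # historical per-minute vehicle counts per junction (synthetic)
    "junction_a": [42, 40, 45, 43, 41, 44],
    "junction_b": [15, 14, 16, 15, 17, 15],
}
live = {"junction_a": 44, "junction_b": 63}   # junction_b is suddenly congested

def is_anomalous(history, count, z_threshold=3.0):
    mu, sigma = mean(history), stdev(history)
    return abs(count - mu) > z_threshold * max(sigma, 1e-6)

problem_areas = [j for j, c in live.items() if is_anomalous(baseline[j], c)]
print("route traffic away from:", problem_areas)   # ['junction_b']
```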

Another example that would fall under the Industry 4.0 umbrella is IoT device management. Keene laid out a hypothetical operation where 5G SIM cards are used to connect, track and monitor various people and assets. “We’ve seen experiences of enterprises that have problems with fraud” in the form of people stealing SIMs or assets with SIMs in them. “The solution uses geofencing to bar or alert when a device moves outside its normal operating area and also spots unusual traffic or data usage patterns, or device changes. The machine learning model learns the regular, predictable usage patterns and operating geographical areas. So if it’s normal for a particular device to go offsite every Thursday afternoon to collect something, it won’t flag it as fraud, whereas any other device that never moves outside its area gets flagged straight away if it happens to stray.”
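The logic Keene outlines can be sketched roughly as a geofence check combined with a learned exception list of normal off-site windows. The device IDs, coordinates and learned schedule below are hypothetical, intended only to show how a routine trip passes while an unexpected move is flagged.

```python
# Sketch of geofence-plus-learned-pattern fraud flagging for connected SIMs.
# Device profiles, positions and the learned off-site schedule are hypothetical.
from math import hypot

DEVICES = {
    "sim_001": {"home": (0.0, 0.0), "radius_km": 5.0,
                "learned_offsite": {("Thu", 14), ("Thu", 15)}},   # (weekday, hour) windows
    "sim_002": {"home": (0.0, 0.0), "radius_km": 5.0,
                "learned_offsite": set()},                        # never normally leaves
}

def flag_if_fraud(device_id, pos_km, weekday, hour):
    """Flag a device that is outside its geofence at an unexpected time."""
    profile = DEVICES[device_id]
    offsite = hypot(pos_km[0] - profile["home"][0],
                    pos_km[1] - profile["home"][1]) > profile["radius_km"]
    expected = (weekday, hour) in profile["learned_offsite"]
    return offsite and not expected

print(flag_if_fraud("sim_001", (12.0, 3.0), "Thu", 14))  # False: routine Thursday trip
print(flag_if_fraud("sim_002", (12.0, 3.0), "Thu", 14))  # True: unexpected move, flagged
```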

To watch the full webinar featuring Keene and other industry experts, click here.

ABOUT AUTHOR

Sean Kinney, Editor in Chief
Sean focuses on multiple subject areas including 5G, Open RAN, hybrid cloud, edge computing, and Industry 4.0. He also hosts Arden Media's podcast Will 5G Change the World? Prior to his work at RCR, Sean studied journalism and literature at the University of Mississippi then spent six years based in Key West, Florida, working as a reporter for the Miami Herald Media Company. He currently lives in Fayetteville, Arkansas.