NXP and Nvidia both investing in AI for edge use cases
The vision for the combination of 5G and the internet of things revolves around pairing the low latency of next-generation cellular networks with sensor data to create efficiencies through lightning-fast data processing, in service of a wide range of applications like manufacturing automation, predictive maintenance and remote control of mobile assets.
For some time, large-scale data analysis has been synonymous with centralized data centers capable of flexibly supporting a wide range of compute workloads. But that was before 5G. To meet the same goal in the context of sub-one-millisecond network latencies, centralized computing can't keep up; it has to be augmented by compute functionality distributed out to the edge of the network, whether that edge is defined as a client device, an enterprise WAN or general-purpose compute infrastructure co-located with cellular radio sites.
Given the industry and investment move toward the edge, major compute players like Nvidia and NXP are developing new products that supplement centralized compute with a layer of decentralized infrastructure. Drilling down further, real-time decision-making supported by 5G and decentralized compute can't be a manual process. Rather, artificial intelligence and machine learning algorithms are needed to analyze data and initiate actions in a manner that improves over time as the reference data set grows.
NXP, for instance, is working with Microsoft to integrate the former's AI and ML tools with the latter's Azure IoT cloud, with a focus on "anomaly detection." At a high level, the solution set uses machine learning to establish what normal device behavior looks like, then uses edge and cloud compute to identify when devices deviate from that baseline. In the context of a smart factory, for instance, fast detection of abnormal device behavior could be used to trigger an intervention before productivity is lost.
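Neither company has published implementation details, but the general pattern behind this style of anomaly detection is well established: train a model on sensor data captured during normal operation, then score new readings at the edge and flag outliers for intervention. A minimal sketch using scikit-learn's IsolationForest, with invented vibration and temperature features purely for illustration:

```python
# Minimal anomaly-detection sketch in the spirit of the NXP/Azure example.
# The sensor features, values and thresholds are illustrative assumptions,
# not details of NXP's or Microsoft's actual solution.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# 1) "Normal" behavior: e.g., vibration (g) and temperature (C) readings
#    collected from a healthy machine during routine operation.
normal_readings = rng.normal(loc=[0.5, 40.0], scale=[0.05, 1.5], size=(1000, 2))

# 2) Train on normal data only; the model learns the envelope of expected behavior.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_readings)

# 3) At the edge, score new readings as they stream in; -1 flags an anomaly.
new_readings = np.array([
    [0.52, 40.3],   # looks healthy
    [1.90, 78.0],   # abnormal vibration and heat
])
for reading, label in zip(new_readings, model.predict(new_readings)):
    status = "anomaly -> trigger intervention" if label == -1 else "normal"
    print(reading, status)
```

In a deployment like the one described, the scoring step would run on the edge device or gateway for low-latency alerting, while the cloud handles model training and fleet-wide analysis.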
NXP’s Denis Cabrol, executive director and general manager of IoT and Security Solutions, said in a statement, “Preventing failures and reducing downtime are key to enhance productivity and system safety.” He said the collaboration with Microsoft is “part of our continued efforts to bring cognitive services down to the silicon.”
Projecting 150 billion data-producing IoT devices by 2025, Nvidia last month announced its EGX edge computing platform at the annual Computex event in Taiwan. The platform is optimized to bring AI to latency-sensitive applications, with particular emphasis on the telecom, healthcare, retail, manufacturing and transportation verticals. Here's a rundown of the technicals of what Nvidia referred to as an "on-prem AI cloud in a box."
Nvidia VP and GM of Enterprise and Edge Computing Bob Pette said the goal is to support enterprise digital transformation efforts by enabling “powerful computing at the edge to process…oceans of raw data–streaming in from countless interactions with customers and facilities–to make rapid, AI-enhanced decisions that can drive their business.”