
AI infra brief: From NVIDIA, AT&T, Google and more

In this regular update, RCR Wireless News highlights the top news and developments impacting the booming AI infrastructure sector.

NVIDIA brings agentic AI reasoning to enterprises with Google Cloud

NVIDIA said it is collaborating with Google Cloud to bring agentic AI to enterprises that want to run the Google Gemini family of AI models locally, using the NVIDIA Blackwell HGX and DGX platforms together with NVIDIA Confidential Computing for data safety.

With the NVIDIA Blackwell platform on Google Distributed Cloud, on-premises data centers can stay aligned with regulatory requirements and data sovereignty laws by locking down access to sensitive information, such as patient records, financial transactions and classified government information, the company said. NVIDIA Confidential Computing also secures sensitive code in the Gemini models from unauthorized access and data leaks, it added.

Confidential computing with NVIDIA Blackwell gives enterprises the technical assurance that their user prompts to the Gemini models' application programming interface, as well as the data they use for fine-tuning, remain secure and cannot be viewed or modified. At the same time, model owners can protect against unauthorized access or tampering, providing dual-layer protection that lets enterprises innovate with Gemini models while maintaining data privacy, NVIDIA added.

This new offering arrives as agentic AI is transforming enterprise technology, offering more advanced problem-solving capabilities. Unlike AI models that perceive or generate based on learned knowledge, agentic AI systems can reason, adapt and make decisions in dynamic environments, said NVIDIA.

AT&T to provide fiber connectivity for Jericho Energy Ventures’ AI modular data center site

Jericho Energy Ventures (JEV) has partnered with AT&T to install a minimum of 10 Gbps of fiber-optic connectivity at its initial Modular High Performance AI Data Center site in Oklahoma.

On March 31, 2025, JEV unveiled its modular data center venture, which harnesses the company's natural gas assets and infrastructure to develop advanced, AI-focused computing solutions.

Brian Williamson, CEO of Jericho Energy Ventures, commented: “We are moving full steam ahead in building out our AI modular data centers, and partnering with industry-leader AT&T along with others to deploy high-speed fiber connectivity on-site is a critical step in developing a next-generation modular AI computing infrastructure. By leveraging our natural gas assets and strategic locations, we are uniquely positioned to provide scalable, reliable, and cost-effective power solutions to meet the growing demands of the AI age.”

Google unveils new TPU for AI inference

At Google Cloud Next 2025, Google introduced Ironwood, its seventh-generation Tensor Processing Unit (TPU) and the first designed specifically for AI inference. Ironwood supports large-scale "thinking" AI models, such as large language models (LLMs) and Mixture of Experts (MoEs), that demand massive computation and communication capabilities.

Google noted that Ironwood can scale up to 9,216 liquid-cooled chips, delivering 42.5 Exaflops per pod. Each chip achieves 4,614 TFLOPs of peak compute, powered by 192 GB of High Bandwidth Memory (HBM) and 7.2 TBps of bandwidth, enabling efficient handling of large datasets and models. Its enhanced Inter-Chip Interconnect (ICI) reaches 1.2 Tbps, enabling low-latency communication for distributed AI workloads.
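For readers who want to see how the per-pod figure follows from the per-chip numbers, here is a quick back-of-the-envelope check in Python. It is an illustrative sketch based only on the figures quoted above, not code or data from Google:

    # Sanity check of Ironwood's quoted pod-level compute (illustrative only)
    chips_per_pod = 9_216        # maximum Ironwood pod size quoted by Google
    tflops_per_chip = 4_614      # quoted peak compute per chip, in TFLOPs
    pod_tflops = chips_per_pod * tflops_per_chip
    # 1 exaflop = 1,000,000 TFLOPs
    print(f"{pod_tflops / 1e6:.1f} exaflops per pod")  # prints 42.5, matching the announced figure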

The TPU includes an improved SparseCore for ultra-large embeddings and supports Pathways, Google's ML runtime for scaling workloads across pods. Ironwood's liquid cooling supports its energy efficiency, delivering twice the performance per watt of its predecessor, Trillium, and 30 times the efficiency of the original Cloud TPU.

Why these announcements matter

These announcements highlight key trends shaping the future of AI infrastructure. Google’s Ironwood TPU, built specifically for inference, and NVIDIA’s push for agentic AI mark a shift toward models that don’t just process information, but reason and make decisions in real time. At the same time, both companies are emphasizing secure AI deployments — with technologies like Confidential Computing and compliance-ready cloud infrastructure — to protect sensitive data and model integrity. Jericho Energy Ventures’ modular, AI-focused data centers powered by natural gas and supported by fiber from AT&T reflect the growing need for localized, energy-efficient infrastructure. Together, these developments signal an evolution toward secure, inference-ready and highly scalable AI systems designed to meet the demands of the next generation of intelligent applications.

Follow AI Infrastructure Insights on LinkedIn to get more AI infra briefs.

ABOUT AUTHOR

Juan Pedro Tomás
Juan Pedro covers Global Carriers and Global Enterprise IoT. Prior to RCR, Juan Pedro worked for Business News Americas, covering telecoms and IT news in Latin American markets. He also worked for Telecompaper as its Regional Editor for Latin America and Asia/Pacific. Juan Pedro has also contributed to Latin Trade magazine as the publication's correspondent in Argentina and to the political risk consultancy Exclusive Analysis, writing reports and providing political and economic information on certain Latin American markets. He has a degree in International Relations and a master's degree in Journalism, and is married with two kids.