Tech and energy giants are redrawing the map of power generation and delivery to keep pace with AI's rising demands, and liquid cooling technology plays a key role in that equation.
As AI workloads continue to grow, the power behind the processing is becoming the real bottleneck. From liquid cooling breakthroughs to massive market forecasts, these announcements show how thermal management and power efficiency are becoming foundational to AI infrastructure.
LG Electronics targets AI data centers with advanced cooling tech
Korean tech giant LG Electronics is making a major push into the AI data center space with advanced thermal management systems built on its HVAC expertise. The company unveiled its Coolant Distribution Unit (CDU), which uses liquid cooling to target high-heat chips such as GPUs and CPUs directly. The CDU incorporates AI-based virtual sensors for fault tolerance and an inverter-driven pump that adjusts coolant flow on demand, improving energy efficiency. The company is also working on hybrid solutions that combine air and liquid cooling, as well as immersion cooling, and has built a dedicated AI data center HVAC test bed in Pyeongtaek.
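The announcement doesn't describe LG's control algorithm, so the following is only a minimal sketch of the general idea: a variable-speed (inverter-driven) pump ramps coolant flow with chip temperature and falls back to a model-based "virtual sensor" estimate if a physical sensor drops out. All names, setpoints, and coefficients here are hypothetical.

```python
# Minimal, hypothetical sketch (not LG's actual control logic): an inverter-driven
# CDU pump modulating coolant flow from chip temperature, with a model-based
# "virtual sensor" estimate used as a fallback when a physical sensor fails.

TARGET_C = 65.0                  # assumed cold-plate target temperature
MIN_FLOW, MAX_FLOW = 0.2, 1.0    # pump speed as a fraction of full flow

def estimate_temp(power_w, coolant_in_c, flow_frac):
    """Virtual sensor: rough temperature estimate from chip power and flow (illustrative model)."""
    thermal_resistance = 0.01 / max(flow_frac, MIN_FLOW)   # K per W, made-up constant
    return coolant_in_c + power_w * thermal_resistance

def pump_setpoint(measured_c, power_w, coolant_in_c, flow_frac):
    """Proportional control: raise flow when the chip runs hot, lower it when it runs cool."""
    temp = measured_c if measured_c is not None else estimate_temp(power_w, coolant_in_c, flow_frac)
    error = temp - TARGET_C
    new_flow = flow_frac + 0.02 * error           # simple proportional step
    return min(MAX_FLOW, max(MIN_FLOW, new_flow))

# Example: the sensor reads 72 C, so flow steps up from 50% toward full speed.
print(pump_setpoint(72.0, 700.0, 40.0, 0.5))      # -> 0.64
```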
Takeaway: LG is positioning itself as a key player in the thermal backbone of AI infrastructure, leveraging its consumer HVAC expertise to serve the most demanding enterprise workloads.
Data center liquid cooling market to grow 5x by 2034
The global market for data center liquid cooling is projected to expand from $4.68 billion this year to $22.57 billion by 2034, growing at a CAGR of 19.1%. North America currently leads the market, driven by investments from hyperscalers like AWS and Google, but Asia-Pacific is expected to be the fastest-growing region, fueled by hyperscale projects in China, India and Japan. In the U.S., the market is forecast to grow from $1.09 billion in 2024 to $6.39 billion by 2034, supported by innovations such as immersion and direct-to-chip cooling.
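As a quick check on the forecast arithmetic, compounding the cited baseline at the stated CAGR reproduces the 2034 figure; treating 2025 as the base year is an assumption read from "this year."

```python
# Sanity-checking the forecast: $4.68B compounded at 19.1% per year for nine
# years (2025 -> 2034, base year assumed) lands at roughly the cited $22.57B.
base, cagr, years = 4.68, 0.191, 9
print(round(base * (1 + cagr) ** years, 2))   # ~22.57 (billions USD)
```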
Takeaway: With demand for AI-scale compute rising fast, liquid cooling is emerging as a critical enabler — and a $20B+ market opportunity over the next decade.
AEMEnergy launches ultra-compact liquid-cooled BESS
AEMEnergy unveiled a new energy storage solution: a silent, liquid-cooled battery energy storage system (BESS) with a footprint of just 1.68 square meters. The system integrates AEMEnergy’s proprietary silicon carbide (SiC) power conversion system, enabling seamless control and over 90% efficiency. Designed for space-constrained environments, the unit also features low-noise operation, redundant safety systems, and full digital integration (BMS, EMS, PCS, cloud).
Takeaway: Compact, quiet and digitally managed, this next-gen BESS shows how liquid cooling is expanding into energy storage to support distributed AI infrastructure.
Accelsius sets liquid cooling benchmark with 250kW AI rack
Accelsius achieved two thermal milestones with its NeuCool technology: successfully cooling 4,500W per GPU socket and maintaining safe GPU temperatures in a fully loaded 250kW AI rack using warm (40°C) facility water. The company’s two-phase CDU demonstrated exceptional thermal headroom and efficiency, even under extreme conditions.
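For a sense of scale, here is a back-of-the-envelope estimate (not from Accelsius' published test data) of how much facility water it takes to carry 250 kW away from a rack; the 10 K water temperature rise is an assumed value.

```python
# Back-of-the-envelope heat-rejection check with assumed numbers:
# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)
Q_W = 250_000        # rack heat load from the article, in watts
CP_WATER = 4186      # specific heat of water, J/(kg*K)
DELTA_T = 10         # assumed temperature rise of the 40 C facility water, K
m_dot = Q_W / (CP_WATER * DELTA_T)                      # required water mass flow, kg/s
print(f"{m_dot:.1f} kg/s (~{m_dot * 60:.0f} L/min)")    # roughly 6 kg/s, ~360 L/min
```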
Takeaway: Accelsius’ results raise the bar for liquid cooling, proving it can meet the demands of ultra-dense, high-power AI racks while cutting energy use — a critical step in scaling future infrastructure.
Big Picture
Tech and energy leaders are rapidly redrawing the global landscape of power generation, distribution and data center design to support AI’s explosive growth. Innovations in cooling, energy storage and hybrid grids are no longer optional — they’re becoming the backbone of next-gen infrastructure. As AI workloads scale, we’re seeing a wave of investment in high-efficiency systems and a surge in global competition to deliver faster, cleaner and more reliable power solutions.
What else is powering AI infra today?
NVIDIA brings AI supercomputer manufacturing to the U.S. for the first time
AI boom sparks energy bill concerns among U.S. lawmakers
TSMC fast-tracks advanced packaging for Google and NVIDIA AI chips
NSF injects $20 million into CloudBank to power AI and science research
Follow AI Infrastructure Insights on LinkedIn to get more AI infra briefs.