The boss of Nvidia, the company powering today’s artificial intelligence boom, has broken with tech’s usual polite script and voiced a blunt fear: the United States may design the smartest chips, but China is quietly building the power plants, cables and data centres needed to run them at scale.
Jensen Huang’s stark warning on America’s AI ambitions
During a recent appearance at the Center for Strategic and International Studies (CSIS) in Washington, Nvidia CEO Jensen Huang painted a vivid picture of an AI race running on two very different tracks.
On one side, the US controls the intellectual heavyweights: cutting‑edge chip design, frontier AI models and the most valuable software ecosystems. On the other, China is pouring concrete, stringing power lines and spinning up massive data halls at a speed that American regulators can barely process.
Huang contrasted a three‑year build time for a US data centre with China’s ability to erect major facilities, like hospitals, in days rather than years.
His point was not nostalgia for authoritarian efficiency. It was a blunt assessment of bottlenecks. In his view, the US is increasingly blocked not by a lack of talent or capital, but by bureaucracy, planning delays and sluggish logistics.
He is far from alone. Investors and energy experts have been sounding similar alarms: planning permission takes months or years, grid upgrades crawl through red tape, and data‑centre projects are now routinely stalled by local opposition.
Energy: the quiet battlefield for AI power
Huang’s deepest concern lies in something most AI discussions barely mention: electricity. Every large language model, every generative image, every recommendation algorithm burns power. Scaling them up means scaling up national grids.
According to Huang, China already has roughly double the power generation capacity of the US, despite having a smaller nominal economy. That gap is widening as Beijing rushes out new power plants and high‑voltage lines while many US regions argue over permits and land use.
He described Chinese capacity as “soaring”, while US capacity remains largely flat at the very moment AI is demanding a step‑change in supply.
Kevin O’Leary, the businessman and investor, recently drew attention to another sticking point: permits. In the US, energy and industrial projects often wait six to eighteen months just for basic approvals. In China, new industrial parks, power plants and data hubs are appearing on a monthly cycle.
The result is a strange imbalance. America builds extraordinary chips like Nvidia’s H100 or B200, but then struggles to find enough reliable, affordable electricity to run them. Without that power, the most advanced GPUs are little more than very expensive metal boxes sitting in warehouses.
Why AI is such an energy hog
Training a top‑tier AI model can consume as much electricity as a small town uses in a year. Running that model for millions of users then adds a permanent power load on top.
- Data centres require continuous electricity, 24/7, with minimal outages.
- Cooling systems must remove vast amounts of heat from racks of GPUs.
- New AI workloads, like real‑time assistants, increase energy demand per user session.
- Electricity costs directly shape where companies decide to place new AI clusters.
Countries that can offer cheap, stable power and fast permitting have a structural advantage in hosting the next wave of AI infrastructure.
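The “small town” comparison becomes easier to feel with some rough arithmetic. The sketch below, in Python, multiplies an assumed GPU count, per‑GPU power draw, run length and cooling overhead for training, and an assumed request volume and energy per request for serving. Every figure is an illustrative assumption chosen only to show the scale of the numbers, not a value reported by Huang, Nvidia or any specific operator.

```python
# Rough, illustrative back-of-envelope estimate of AI energy use.
# All figures below are assumptions for the sake of arithmetic,
# not measurements of any specific model or data centre.

def training_energy_mwh(num_gpus: int, gpu_power_kw: float,
                        training_days: float, overhead_factor: float) -> float:
    """Energy to train a model: GPUs x power x time, plus cooling/overhead."""
    hours = training_days * 24
    return num_gpus * gpu_power_kw * hours * overhead_factor / 1000  # kWh -> MWh

def serving_energy_mwh_per_year(requests_per_day: float,
                                kwh_per_request: float) -> float:
    """Ongoing energy to serve users once the model is deployed."""
    return requests_per_day * kwh_per_request * 365 / 1000  # kWh -> MWh

if __name__ == "__main__":
    # Hypothetical training run: 10,000 GPUs at ~0.7 kW each for 90 days,
    # with a 1.3x factor for cooling and other data-centre overhead.
    train = training_energy_mwh(10_000, 0.7, 90, 1.3)

    # Hypothetical serving load: 50 million requests per day at ~0.003 kWh each.
    serve = serving_energy_mwh_per_year(50_000_000, 0.003)

    # A small town of roughly 10,000 homes at ~10 MWh per home per year.
    town = 10_000 * 10

    print(f"Training run:         {train:,.0f} MWh")
    print(f"Serving, per year:    {serve:,.0f} MWh")
    print(f"Small town, per year: {town:,.0f} MWh")
```

Swap in different assumptions and the totals move, but the shape of the result holds: a single large training run lands within an order of magnitude of a town’s annual consumption, and serving at scale adds a comparable, permanent load on top.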
A “five‑layer cake” where China owns the base
Huang has described the AI competition as a “five‑layer cake”. At the top are the glamorous, marketable elements; at the bottom, the industrial plumbing that no one wants to pay for but everyone needs.
| Layer | What it covers | Current strength |
|---|---|---|
| 1. Energy (base of the cake) | Power plants, grid capacity, transmission | China scaling faster |
| 2. Chips | Design of GPUs, accelerators, advanced semiconductors | US lead via Nvidia and others |
| 3. Infrastructure | Data centres, cooling, fibre networks | China building at pace |
| 4. Models | Foundational AI systems, LLMs, training stacks | US ahead but with rising Chinese contenders |
| 5. Applications (top of the cake) | Software, services, open‑source tools on top of models | Both sides active; China aggressive with open source |
Huang argues that while US firms dominate chip design and leading models, China is busy locking down the lower layers: energy, physical infrastructure and mass deployment, often through open‑source AI frameworks.
That matters because those base layers are not easily or quickly replicated. A new chip design can appear in two or three years; a new nuclear plant or transmission corridor can take a decade in the US regulatory environment.
Nvidia’s awkward position between Washington and Beijing
Nvidia itself stands in an uncomfortable spot. It is a US company, shaped by American capital and regulation, but it has historically relied heavily on Chinese customers for growth.
Export controls from Washington now restrict the advanced GPUs Nvidia can sell into China. In response, the company has tried to design “watered‑down” chips tailored to fit within US rules while still serving Chinese cloud providers and research labs.
Huang has called China the world’s second‑largest tech market and made clear that being largely shut out is painful, both commercially and strategically. He also pushed back against the assumption that China cannot catch up in advanced manufacturing.
He warned that anyone who assumes China will remain permanently dependent on Western chip makers is missing a crucial piece of the picture.
China has already demonstrated it can climb manufacturing ladders at speed, from solar panels to batteries to smartphones. A determined push into mature‑node chips tailored for AI workloads would not surprise industry veterans.
Can “reindustrialisation” save the US lead?
Huang’s message to US policymakers is pointed: if America wants to keep its AI edge, it must rebuild its industrial base, not just its app ecosystem.
That means more than subsidising fabs through measures like the CHIPS Act. It requires turning data centres and grid upgrades into strategic infrastructure, treated as seriously as roads, ports or military bases.
Some in Washington, including figures around Donald Trump, are pushing to classify data centres and critical AI facilities as infrastructure of national importance. That could streamline permitting and unlock public–private partnerships for new energy projects.
Yet any such push runs into local politics, environmental concerns and fragmented regulation across states. Communities want jobs and investment, but they also worry about water use, noise, land prices and emissions from new power plants.
What this AI power struggle means for everyday users
This high‑level tug‑of‑war may sound remote, but it filters down into everyday technology. The location of AI infrastructure affects speed, reliability and sometimes the cost of services used by people and businesses worldwide.
If Chinese data centres end up hosting more of the world’s AI training, Chinese companies can iterate faster on new models and features. If US infrastructure lags, American firms might face higher operating costs or be forced to cap certain services, especially those that are power‑intensive.
There is also a geopolitical angle: the country that runs the most AI compute can set de facto standards for safety, content filtering, data handling and surveillance capabilities. That shapes not only technology, but trade, diplomacy and civil liberties.
Three scenarios for the next decade
- US breaks the bottlenecks: Congress and regulators fast‑track energy and data‑centre projects. Grid upgrades accelerate, nuclear and renewables expand, and the US keeps both the chip and infrastructure lead.
- China builds the AI “engine room”: Export controls slow Chinese chip access, but not enough to stop domestic alternatives. Backed by cheap power and rapid construction, China becomes the main factory for global AI compute.
- Fragmented AI blocs: Both sides clamp down, building parallel systems. A US‑aligned cluster and a China‑aligned cluster emerge, with countries pressured to choose where their data and AI workloads live.
Businesses planning long‑term AI strategies increasingly have to model these scenarios. Where they place their data, who supplies their chips and energy, and which jurisdictions host their training runs can change their costs and legal risks for years.
Key terms worth unpacking
For readers trying to navigate this debate, a few technical concepts keep cropping up:
- GPU (graphics processing unit): Originally built for gaming, these chips handle many calculations in parallel. That trait makes them ideal for training AI models, where millions of parameters are updated at once.
- Data centre: An industrial‑scale facility housing servers, storage and networking gear. Modern AI data centres pack thousands of GPUs into tightly controlled environments, with sophisticated cooling and security.
- Grid capacity: The total amount of electricity a region’s power system can reliably deliver. AI growth strains capacity, especially during peak hours or in regions already facing shortages.
- Open source in AI: Releasing model code and sometimes weights freely, allowing anyone to build on top. China has embraced this both to sidestep Western licensing limits and to stimulate fast local innovation.
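For readers who want to see what “many calculations in parallel” means in practice, here is a toy sketch in Python. It uses NumPy on a CPU purely as an illustration; real training runs on GPU frameworks, and the array size and learning rate are arbitrary stand‑ins rather than figures from any actual model.

```python
import numpy as np

# Toy illustration of the "many calculations in parallel" idea from the
# GPU entry above. NumPy on a CPU is used only to make the contrast visible;
# real AI training runs these updates on GPUs across thousands of cores.

rng = np.random.default_rng(0)
params = rng.normal(size=1_000_000)   # one million model parameters (arbitrary)
grads = rng.normal(size=1_000_000)    # their gradients for one training step
learning_rate = 0.01                  # arbitrary step size

# Sequential view: update the parameters one at a time in a Python loop.
updated_loop = params.copy()
for i in range(updated_loop.size):
    updated_loop[i] -= learning_rate * grads[i]

# Parallel-style view: a single vectorised operation updates all of them
# at once, which is exactly the kind of workload GPUs are built for.
updated_vec = params - learning_rate * grads

# Both routes produce the same result; only the execution pattern differs.
assert np.allclose(updated_loop, updated_vec)
```

The loop and the vectorised line compute the same thing; the difference is that the second form lets hardware apply the update to every parameter simultaneously, which is why GPU supply and the power to run them matter so much.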
Huang’s comments land at the intersection of all these forces. They suggest the AI race is less about who writes the cleverest algorithm this year and more about who can sustain industrial‑scale computing for decades. Chips are glamorous; transformer models and datasets get headlines. The less glamorous parts, the bolts, wires, grid transformers and zoning laws, may decide who actually wins.