China plans to triple AI-chip output in 2026 to curb Nvidia reliance — FT

International Desk — August 30, 2025
China is preparing a sweeping push to manufacture far more of the chips that power artificial intelligence, aiming to triple its domestic AI-processor output in 2026 and reduce dependence on Nvidia, according to a Financial Times report echoed by Reuters. The plan, which relies on new fabrication capacity tied to Chinese tech champions, marks Beijing’s most aggressive attempt yet to secure the hardware at the heart of today’s AI race. (Financial Times, Reuters)
At the center of the story are Huawei and a cluster of new facilities designed to build its Ascend-series AI chips. People familiar with the effort told the FT that one plant is expected to start production by the end of this year, with two additional fabs slated for 2026. The ownership of the sites is murky (Huawei told the newspaper it has no plans to own its own fabs), but the combined capacity could surpass output at SMIC, China’s largest foundry, if the ramp hits targets. For a country shut out of the most advanced Western lithography, the goal isn’t parity overnight; it’s a fast, practical climb out of dependence. (Financial Times, Reuters)
Reuters adds that SMIC itself plans to double production of 7-nanometer-class chips next year, with Huawei as a top customer, another sign that Chinese suppliers are aligning around a domestic stack even as they lag the cutting edge used by U.S. giants. The same reporting underscores the immediate catalyst: Washington’s export controls have throttled access to Nvidia’s most capable parts, creating a scramble for viable alternatives that can train and run large AI models at scale. (Reuters)
That scramble has lately taken a more nuanced turn. The United States has begun issuing licenses for Nvidia to sell its China-specific H20 accelerators, loosening a ban imposed earlier this year. Even so, the H20 sits below Nvidia’s flagship chips in capability, and approvals are not blanket guarantees; shipments depend on license scope and customer lists. The net effect is an unstable supply line for Chinese buyers: good enough to keep some projects moving, not reliable enough to build a national strategy on. Hence Beijing’s push to stand up its own capacity. (South China Morning Post)
The ecosystem is also expanding beyond Huawei. Alibaba has developed a new domestically manufactured AI chip aimed at inference workloads, part of a broader move by Chinese cloud providers to fill the Nvidia gap with homegrown silicon. Analysts see similar momentum at Cambricon and other local designers, with government funding and stock-market enthusiasm providing fresh oxygen. While these products still trail Nvidia for frontier model training, they’re beginning to look useful for the everyday work of running AI services, where cost, availability, and integration often matter more than raw peak performance. (Reuters, Financial Times)
For Nvidia, the signals are mixed. On one hand, the company continues to post historic sales on the back of global AI demand. On the other, China remains a question mark, with policy-driven uncertainty showing up in recent market jitters and earnings-day commentary. That is the context in which the FT’s “triple output” headline lands: China isn’t counting on Washington’s mood swings; it is trying to build enough capacity to keep training schedules intact without relying on a rival’s permission slip. (Financial Times)
How quickly can this plan close the gap? Even optimists concede that making competitive AI accelerators at scale is about more than etching transistors. It requires reliable packaging, high-bandwidth memory, fast interconnects, and a mature software stack. That last piece has been a quiet focus for Beijing. The FT notes that Chinese model developers are coalescing around shared formats and toolchains, an attempt to replicate the tight hardware-software alignment that powered Nvidia’s rise. If local chips can run popular frameworks efficiently and cluster together in large numbers without falling over, “good enough” may prove good enough for much of the domestic market. (Financial Times)
Still, execution risk looms everywhere. New fabs are expensive and slow to yield; qualified engineers are scarce; and every step up the performance ladder attracts fresh scrutiny from overseas regulators. Licenses for H20 could ease near-term pressure but won’t end the strategic race. Meanwhile, China’s internet giants face their own trade-offs: whether to back Huawei’s stack deeply, buy from rival domestic designers, or hedge with licensed Nvidia parts where possible. Each path touches politics, partnerships, and platform choices that will last years.
For global AI builders, the practical takeaway is straightforward. Compute supply is fragmenting. If China succeeds at tripling output, its tech sector will have a steadier (if not bleeding-edge) source of accelerators for training and, especially, inference. That would lessen Chinese demand for limited H-class and Blackwell inventory and redirect some orders to Chinese fabs, without meaningfully denting Western hyperscalers’ appetite. For investors, the milestone to watch isn’t the press releases about ribbon-cuttings, but how quickly real clusters come online and how many commercially relevant models they can support at a competitive total cost.
The next twelve months will tell us whether this is a political headline or an industrial pivot. If the late-2025 and 2026 ramps materialize on schedule, and if software keeps pace, China will have turned an export-control squeeze into a manufacturing sprint. If delays pile up, licenses narrow, or domestic chips miss performance targets, the old reliance on imported GPUs will persist under a new set of workarounds. Either way, the direction of travel is clear: less dependence on Nvidia, more domestic control over the AI stack, and a more multipolar market for the world’s most coveted compute. (Financial Times, Reuters)
Reporting based on the Financial Times’ original report and corroborating coverage by Reuters, the South China Morning Post, and others, current as of August 30, 2025.
