Demand for AI computing power continues to expand explosively: Nvidia supply remains tight, while Google's TPU leads the ASIC camp in catching up

Zhitongcaijing · 06/30 12:49

The Zhitong Finance App learned that, according to J.P. Morgan's latest CIO survey report, AI will be a key investment area over the next three years: about 68% of respondents plan to devote more than 5% of their IT budget to this area, up from roughly 25% of respondents today.

The survey covered 168 chief information officers who manage about $123 billion in annual corporate IT spending. Two core questions bear directly on semiconductor spending: 1) the share of current organizational IT budgets devoted to AI/accelerated computing, across both on-premises hardware and cloud services; 2) the current share of public cloud spending in IT budgets and its expected future level. The survey results confirm expectations of continued strong cloud capital expenditure growth driven by AI/accelerated computing data center spending.

The survey results yield several key points that J.P. Morgan believes support continued strong cloud capital expenditure growth driven by AI/accelerated computing data center construction. First, the feedback shows a strong focus on AI over the next three years: 68% of respondents plan to spend more than 5% of their IT budget on AI computing hardware within that window. Second, AI-related computing spending as a share of CIOs' IT budgets is expected to rise to 15.9% over the next three years, from about 5.9% today, a compound annual growth rate of roughly 41%, which is above J.P. Morgan's 30-35% growth forecast for XPU (GPU, TPU, DPU) semiconductor revenue.
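As a rough check on that figure, the compound annual growth rate implied by the shift in budget share can be computed directly. The arithmetic below is a back-of-the-envelope sketch, not a formula from the J.P. Morgan report:

\[
\text{CAGR} = \left(\frac{15.9\%}{5.9\%}\right)^{1/3} - 1 \approx 39\%
\]

A share-only calculation lands near 39%; the cited ~41% presumably also layers in modest growth in the overall IT budget over the same period.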

Finally, beyond AI spending, cloud's share of IT budgets is expected to rise from the current 25% to 38% over the next five years, implying a compound annual growth rate of 9-13% in cloud spending.
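The same back-of-the-envelope arithmetic applied to the cloud figures (again our own sketch, assuming a flat total IT budget) gives

\[
\left(\frac{38\%}{25\%}\right)^{1/5} - 1 \approx 8.7\% \text{ per year,}
\]

just below the cited 9-13% range, with the upper end presumably reflecting growth in overall IT budgets as well.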

In J.P. Morgan's view, a low-double-digit compound annual growth rate in cloud spending from large enterprise customers points to a continued healthy demand backdrop for cloud service providers (across both AI and general-purpose computing workloads), which in turn supports confidence in the sustainability of cloud providers' capital expenditure growth.

Demand for AI computing power is a "sea of stars": the AI GPU and AI ASIC routes will benefit together

In the short term, companies said IT spending will be more cautious in the second half of this year due to tariff/trade and geopolitical developments (with some spending delayed into the second half of the year).

Overall, this is consistent with J.P. Morgan's view that tariff- and trade-related developments may drive a more seasonal pattern in enterprise demand in the second half of the year.

In the medium to long term, the survey results support J.P. Morgan's view that AI infrastructure will grow strongly for years to come, and the bank remains optimistic about continued strong revenue growth for AI beneficiaries such as Nvidia (NVDA.US), AMD (AMD.US), Broadcom (AVGO.US), and Arm (ARM.US).

Furthermore, TPU-centered AI ASICs recently received an endorsement from OpenAI, strengthening Google's leading position in the ASIC ecosystem. Morgan Stanley, another major Wall Street bank, noted that the partnership could accelerate Google Cloud's revenue growth (not yet fully reflected in the stock price), attract customers such as Apple and Cohere to migrate, and expand Google's compute TAM (total addressable market).

Nvidia's production capacity constraints mean supply remains unable to keep up with demand, creating a historic demand opportunity for ASICs. AI ASIC ecosystems such as Google's TPU are expected to leverage OpenAI to achieve an ecosystem breakthrough, suggesting that ASIC demand, though arriving later, will come on stronger.

As US tech giants keep investing heavily in artificial intelligence, the biggest winners may include not only "AI chip hegemon" Nvidia but also AI ASIC players such as Broadcom, Marvell Technology, and Taiwan's Alchip Technologies. Microsoft, Amazon, Google, Meta, and even generative AI leader OpenAI are all teaming up with Broadcom or other ASIC players to develop successive generations of AI ASIC chips for massive inference-side AI computing deployments. As a result, AI ASICs are expected to expand their market share far faster than AI GPUs, moving toward a roughly even split rather than today's situation in which AI GPUs alone account for up to 90% of the AI chip market. The four major US tech giants are expected to spend as much as 330 billion US dollars on AI computing power in 2026, an increase of nearly 10% from this year's record scale.

On Nvidia's earnings call at the end of May, CEO Jensen Huang was extremely optimistic that the Blackwell series would set the strongest AI chip sales record in history and drive the AI computing infrastructure market to "show exponential growth." "Today, every country sees AI as the core of the next industrial revolution, an emerging industry that continuously produces intelligence, and critical infrastructure for every economy in the world," Huang told analysts on the call.

The demand for AI computing power from the inference side can be called a "sea of stars" and is expected to keep the AI computing infrastructure market growing exponentially. In Jensen Huang's view, "AI inference systems" will also be Nvidia's largest source of future revenue.

DeepSeek's flagship R1 model continues to gain popularity worldwide, and the NSA (Native Sparse Attention) mechanism presented in a DeepSeek paper delivered revolutionary training and inference efficiency gains at the underlying Transformer level. This has prompted AI model developers around the world to follow this "ultra-low-cost AI compute paradigm," accelerating the penetration of AI application software (especially generative AI software and AI agents) into all walks of life, dramatically improving the efficiency of various business scenarios, and greatly boosting sales of and demand for AI chips. AI chip demand may therefore grow exponentially in the future, rather than suffering the cliff-style decline that the market's earlier "DeepSeek shock wave" narrative had anticipated.

Microsoft CEO Satya Nadella previously invoked the "Jevons Paradox": when technological innovation dramatically improves efficiency, resource consumption does not fall but instead surges. Applied to AI computing power, this means the explosive growth in AI model deployment brings unprecedented demand for AI inference compute.

Nvidia heads toward a $5 trillion market capitalization

Loop Capital, a well-known Wall Street investment firm, released a research report last week saying that Nvidia's market value may soon reach 6 trillion US dollars, benefiting from the long-term global AI infrastructure arms race. The firm sharply raised its Nvidia price target from 175 to 250 US dollars, far above the previous Wall Street high of 200 US dollars set by Rosenblatt, which by Loop Capital's estimate implies Nvidia's market value climbing above 5 trillion US dollars and reaching an all-time high of about 6 trillion US dollars.
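Translating a price target into a market capitalization is straightforward multiplication. The sketch below assumes roughly 24.4 billion Nvidia shares outstanding, an approximate mid-2025 figure that is not taken from the Loop Capital report:

\[
\$250 \times 24.4\ \text{billion shares} \approx \$6.1\ \text{trillion}, \qquad \$200 \times 24.4\ \text{billion shares} \approx \$4.9\ \text{trillion}
\]

This is consistent with the report's framing that the new target implies a market value climbing past 5 trillion US dollars toward roughly 6 trillion.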

Loop Capital wrote in the note: "Our research shows that we are entering the next 'golden wave' of artificial intelligence adoption, and Nvidia remains at the forefront of the next critical phase, where demand is far greater than expected." The firm predicts that by 2028, cumulative spending on Nvidia AI GPUs by global cloud computing giants, technology enterprises, and sovereign AI projects will total about 2 trillion US dollars.