The AI chip logic is rock solid: ASML and TSMC's results are both positives for Nvidia (NVDA.US)

Zhitongcaijing · 10/17 14:49

Amid the worldwide fervor for AI deployment, TSMC (TSM.US), holder of the title "king of chip foundry," announced an extremely strong third-quarter results report, and the entire chip sector "took flight" in the US stock market. On Tuesday, chip stocks had sold off sharply across the board after an unexpected earnings shock from lithography giant ASML (ASML.US). At the start of US trading on Thursday, TSMC's US-listed ADR soared by more than 12%, while the Philadelphia Semiconductor Index, the benchmark for chip stocks, rose nearly 3%. Nvidia, the AI chip hegemon, hit a record high, driven by TSMC's strong earnings report.

Although ASML's results disappointed, from a rational investment perspective the earnings shock that hit global chip stock prices does not mean the fervent wave of global artificial intelligence deployment is dissipating or cooling down; on the contrary, ASML's results show that demand for AI chips continues to surge. What the disappointing report does reveal is the latest divide in the global chip industry: the AI boom is still ongoing, and in particular demand for all types of AI chips aimed at B-side data centers remains very hot, while in fields unrelated to AI, demand is still weak or even declining sharply.

ASML's chief financial officer Roger Dassen essentially endorsed this market view in his commentary on the results. The ASML executive said that demand for chips related to artificial intelligence is indeed surging, but that the demand recovery in other parts of the semiconductor market is weaker than the company expected, causing some logic chip manufacturers to delay lithography machine orders.

TSMC's latest results, by contrast, greatly strengthened the view that the AI boom is still in full swing and that demand for AI chips remains extremely hot. Speaking about market demand for AI chips at the earnings conference, TSMC CEO Wei Zhejia (C.C. Wei) said he was very optimistic about AI chip demand, and emphasized that TSMC customers' demand for advanced CoWoS packaging far exceeds the company's supply.

"The company will do its utmost to meet customers' demand for CoWoS advanced packaging capacity. Even if capacity doubles this year and doubles again next year, it will still be far from enough," Wei Zhejia said at the earnings conference. CoWoS advanced packaging capacity is critical to the output of a wide range of AI chips, such as Nvidia's Blackwell AI GPUs. "Almost all AI innovators have partnered with TSMC. The demand for AI is real, and I believe this is just the beginning."

TSMC management expects the company's full-year revenue to increase by nearly 30%, exceeding both analysts' general expectations of 20%-25% and the company's own guidance from the previous quarter. Management also expects revenue related to TSMC's data center artificial intelligence server chips (including Nvidia AI GPUs, Broadcom AI ASICs, and the like) to more than triple this year.

After ASML's earnings shock, some analysts even argued that, unimpressive as ASML's results were, they actually favor the stock prices of "AI sellers" such as Nvidia; after all, ASML's report shows that demand for AI chips in data centers is still booming. Judging from stock price performance, ASML, despite controlling the lifeblood of chip production capacity, has drastically underperformed Nvidia. Quietly, the market has already used real money to answer the question of who is the biggest winner among chip stocks.

Therefore, the results of ASML and TSMC, the two core giants of the chip supply chain, together show that the investment logic behind AI chip stocks is rock solid. The stock price rally of AI chip leaders such as Nvidia is probably far from over. In particular, Nvidia, which dominates the data center AI chip field with an 80%-90% share, may continue to hit record highs, and breaking through the $150 level commonly expected by analysts may only be a matter of time.

The AI boom continues to sweep the world

ASML's latest results show that the fates of global chip companies are clearly diverging: AI applications such as ChatGPT and Sora have driven surging demand for data center server-side AI chips that can handle massive parallel computing and computationally dense matrix operations, such as Nvidia AI GPUs, and that demand has overshadowed the extremely sluggish demand in other segments of the industry.

Wall Street's Jefferies analyst Janardan Menon said in a report on Wednesday: "ASML's earnings report shows that while demand for chips related to artificial intelligence remains very strong, the recovery in other sectors is lagging behind. This trend is likely to continue into 2025."

Global demand for AI chips can also be seen clearly in South Korea's chip inventories and chip export figures. South Korea is home to two of the world's largest memory chip manufacturers, SK Hynix and Samsung.

According to data released by the Korean government, despite slowing growth, semiconductor exports in September still rose a sharp 37% year on year, only slightly weaker than the 38.8% increase in August. Up to one-third of the increase in the continuously growing chip export figures was contributed by HBM memory. HBM is used in conjunction with the core hardware supplied by AI chip leader Nvidia, namely the H100/H200/GB200 AI GPUs, as well as with a wide range of AI ASIC chips (such as Google's TPU), and these AI GPUs are essential for running major artificial intelligence applications such as ChatGPT and Sora. Stronger demand for HBM therefore indicates that demand for AI chips is becoming even more intense.

Demand for AI chips is currently extremely strong, and this will probably remain the case for a long time to come. According to data recently released by the Semiconductor Industry Association (SIA), driven by strong demand for AI chips, global semiconductor sales reached about US$53.1 billion in August 2024, up 20.6% from US$44 billion in August 2023 and up 3.5% month on month from the already strong US$51.3 billion recorded in July.
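As a quick sanity check on the SIA figures cited above, the year-on-year and month-on-month growth rates can be recomputed from the quoted sales totals; the short Python sketch below simply applies the standard percentage-change formula to the article's numbers.

```python
# Quick sanity check of the SIA growth rates cited above,
# using only the sales figures quoted in the article (US$ billions).
aug_2024 = 53.1
aug_2023 = 44.0
jul_2024 = 51.3

def pct_change(new: float, old: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

# ~20.7% year on year (the reported 20.6% reflects rounding of the quoted totals)
print(f"YoY growth: {pct_change(aug_2024, aug_2023):.1f}%")
# ~3.5% month on month, matching the reported figure
print(f"MoM growth: {pct_change(aug_2024, jul_2024):.1f}%")
```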


AMD CEO Lisa Su (Su Zifeng) recently said at a new product launch that demand for data center AI chips, including AI GPUs, has far exceeded expectations, and that the data center AI chip market is expected to reach US$400 billion by 2027 and rise further to US$500 billion in 2028, implying a compound annual growth rate of more than 60% for the global data center AI chip market from 2023 to 2028.
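The article does not state the 2023 base figure behind that compound growth rate; the sketch below assumes a 2023 market size of roughly US$45 billion (a figure AMD has cited elsewhere, used here purely as an illustrative assumption) and works the standard CAGR formula forward to the 2028 forecast.

```python
# Illustrative CAGR check for the data center AI chip forecast above.
# ASSUMPTION: 2023 market size of ~US$45 billion; the article does not give this base.
base_2023 = 45.0    # US$ billions, assumed
fcst_2028 = 500.0   # US$ billions, cited in the article
years = 5           # 2023 -> 2028

cagr = (fcst_2028 / base_2023) ** (1 / years) - 1
# Roughly 62%, consistent with the "more than 60%" claim in the article
print(f"Implied CAGR 2023-2028: {cagr:.1%}")
```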

Bain, a world-renowned strategy consulting firm, predicts that as the rapid spread of artificial intelligence (AI) technology disrupts businesses and the economy, all artificial intelligence-related markets are expanding and will reach US$900 billion by 2027. The consulting firm said in its fifth annual "Global Technology Report," released on Wednesday, that the overall AI market, including artificial intelligence-related services and basic core hardware such as AI GPUs, will grow 40% to 55% each year from last year's US$185 billion, which would translate into revenue of US$780 billion to US$900 billion by 2027.
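For context, compounding the US$185 billion base at the quoted 40%-55% annual rates produces a band whose exact endpoints depend on how many years of growth are assumed; the sketch below works the arithmetic for a four-year horizon (2023 to 2027), purely as an illustration of how such ranges arise, not as a restatement of Bain's own methodology.

```python
# Illustrative compound-growth band for the Bain forecast above.
# ASSUMPTION: four full years of compounding (2023 base -> 2027); the exact
# horizon and rounding used in the report may differ.
base = 185.0                     # US$ billions, 2023 AI market size cited in the article
low_rate, high_rate = 0.40, 0.55
years = 4

low_2027 = base * (1 + low_rate) ** years
high_2027 = base * (1 + high_rate) ** years
# Roughly $711B to $1,068B under these assumptions, bracketing the cited $780B-$900B
print(f"2027 range: ${low_2027:.0f}B to ${high_2027:.0f}B")
```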


Wall Street banking giant Bank of America recently released a research report saying that the global artificial intelligence boom is still in its infancy. Its development path is broadly similar to that of the Internet in the 1990s and can be compared to the "1996 moment" of the Internet boom, meaning that, in the view of the Bank of America analysis team, the AI boom is still at a very early stage.

Nvidia shares take aim at $150

Nvidia, dubbed by Goldman Sachs "the most important stock on Earth," briefly held the title of "the world's most valuable listed company" earlier this year, but in the second half of the year its shares fell back on vague prospects for AI monetization and major swings in global macroeconomic policy, and the stock price slumped for a time.

Recently, when showcasing their business progress, many technology companies around the world have been unable to avoid mentioning Nvidia's most advanced AI GPU server systems. Together with Wall Street firms' optimistic forecasts for data center spending and for Nvidia's stock, this pushed Nvidia's shares past their previous peak in Thursday's US trading to a new all-time high of $140.89, putting the company on track to contend once again for the title of "most valuable listed company."

According to the latest forecasts from Wall Street financial giant Citigroup, data center-related capital expenditure by the four largest US tech giants is expected to increase by at least 40% year over year in 2025. This huge capital expenditure is essentially tied to generative artificial intelligence, meaning that AI applications such as ChatGPT still require enormous computing power. Citigroup said this implies that the giants' spending on data centers is set to expand significantly beyond the already strong 2024 level, and the firm expects this trend to continue to provide a very significant positive catalyst for the stock prices of Nvidia, the undisputed AI GPU hegemon, and of data center interconnect (DCI) technology providers.

The four tech giants mentioned in Citi's research report are the global cloud computing giants Amazon, Google, and Microsoft, plus Meta, the parent company of the social media platforms Facebook and Instagram. In the newly released report, Citi predicts that the four giants' data center capital expenditure will increase by 40% to 50% year on year in 2025. This huge increase in data center spending by the tech giants is expected to keep the stock prices of Nvidia and of data center network technology leaders such as Arista Networks, the "sellers" of AI infrastructure, in favor with international capital.

The Citigroup analysis team said in the latest report that Nvidia's clear lead in total cost of ownership (TCO) and return on investment (ROI) in the AI infrastructure field is a core consideration for data center operators, who value the higher efficiency of running various workloads (including AI training and inference) on Nvidia hardware together with the CUDA accelerated software platform.

The CUDA ecosystem barrier can be described as Nvidia's "strongest moat." Nvidia has been deeply involved in global high-performance computing for many years, and the CUDA computing platform it built has become popular worldwide as the preferred hardware-software stack for high-performance computing workloads such as AI training and inference. The CUDA accelerated computing ecosystem is a parallel computing acceleration platform and programming toolkit developed exclusively by Nvidia; it allows software developers and engineers to use Nvidia GPUs to accelerate general-purpose parallel computing (only Nvidia GPUs are supported; it is not compatible with mainstream GPUs from vendors such as AMD and Intel).
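To make the idea of CUDA-backed acceleration concrete, the short sketch below uses CuPy, an open-source Python library that runs NumPy-style array operations on Nvidia GPUs via CUDA. CuPy is not mentioned in the article and is used here only as one illustrative entry point into the CUDA ecosystem; the example offloads a large matrix multiplication, the kind of dense parallel workload that dominates AI training and inference, to the GPU.

```python
# Minimal sketch of CUDA-accelerated computing from Python via CuPy.
# Requires an Nvidia GPU, the CUDA toolkit, and the `cupy` package;
# CuPy is used purely as an illustration of the CUDA ecosystem, not as
# anything referenced in the article itself.
import cupy as cp

n = 4096
# Allocate two large random matrices directly in GPU memory.
a = cp.random.rand(n, n).astype(cp.float32)
b = cp.random.rand(n, n).astype(cp.float32)

# Dense matrix multiplication runs on the GPU's parallel cores via CUDA.
c = cp.matmul(a, b)

# GPU kernels launch asynchronously; wait for completion before reading results.
cp.cuda.Stream.null.synchronize()

# Copy a single value back to host memory to confirm the computation ran.
print("c[0, 0] =", c[0, 0].item())
```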

The Citigroup analysis team recently reiterated its 12-month price target of as much as $150 for Nvidia, along with a "buy" rating. According to data compiled by TipRanks, the average 12-month price target among 42 Wall Street analysts is $152.86, implying potential upside of close to 10%.
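The implied upside can be backed out from the average price target and the share price around the time of the report; the sketch below assumes a reference price near the $140.89 intraday high mentioned earlier (the exact reference price TipRanks used is not stated in the article).

```python
# Rough check of the ~10% implied upside cited above.
# ASSUMPTION: reference share price near the $140.89 intraday high mentioned
# earlier in the article; TipRanks' exact reference price is not stated.
avg_target = 152.86
ref_price = 140.89

upside = (avg_target - ref_price) / ref_price * 100
# ~8.5% under this assumption; a reference price near $139 would give closer to 10%
print(f"Implied upside: {upside:.1f}%")
```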


Wedbush, a well-known Wall Street investment firm, recently released a research report saying that the top three tech giants in the US stock market, Apple (AAPL.US), Nvidia (NVDA.US), and Microsoft (MSFT.US), are each expected to reach a market value of US$4 trillion within the next six to nine months.

Wedbush analysts led by Daniel Ives have made bold and optimistic predictions about future AI spending and infrastructure. Ives said in a note to investors: "We believe that as next-generation AI infrastructure is built out, the size of the entire artificial intelligence infrastructure market dominated by Nvidia's AI GPUs is likely to grow as much as tenfold between now and 2027. We estimate that capital spending on artificial intelligence will reach $1 trillion over the next three years."