The storage “supercycle” thesis strengthens again! SSD leader SanDisk (SNDK.US) posts an 878% surge in operating profit

Zhitongcaijing · 11/07/2025 00:24

The Zhitong Finance App learned that SanDisk (SNDK.US), the global leader in SSD storage products, announced its results for the first quarter of fiscal year 2026 (ended October 3) on Friday morning Beijing time. The data showed that SanDisk's core Q1 metrics and forward guidance far exceeded the consensus expectations of Wall Street analysts. SanDisk's performance amounts to a comprehensive reinforcement of the so-called “storage industry supercycle” led by the three major memory chip giants (Samsung, SK Hynix, and Micron Technology) together with Western Digital and Seagate, highlighting how the sustained explosion in global AI training/inference compute demand and the consumer electronics recovery driven by the on-device AI boom are both expanding demand for DRAM and NAND storage products. In particular, demand is surging for HBM on the DRAM side and for enterprise-grade SSDs on the NAND side.

In February 2025, storage heavyweight Western Digital (WDC.US) completed the spin-off of its flash memory business, and SanDisk began operating as an independent company focused on NAND flash chips and SSD storage products. An SSD is a solid-state drive that uses NAND flash as its primary storage medium; externally it behaves as a “disk” and can be installed directly in computers, servers, and data centers. The SSD market shows an oligopolistic pattern, with SanDisk's share ranking behind only Samsung, SK Hynix, and Micron. Since completing its spin-off from Western Digital in 2025, SanDisk's stock has soared nearly 500%, thanks to the surge in storage demand driven by an unprecedented AI wave. After the strong results were announced, the stock at one point jumped more than 10% in after-hours trading.

Once AI training/inference systems push “compute devices” (GPUs/HBM) to their performance limits, the real bottleneck becomes feeding those GPUs and HBM with the right data fast enough, and enterprise-grade SSDs are the optimal solution for this “data-feeding” layer. Driven by the AI wave, SanDisk's results for the quarter ended October 3 showed total revenue up 23% year over year and 21% quarter over quarter to US$2.31 billion, better than the Wall Street consensus of US$2.1 billion; adjusted (non-GAAP) earnings per share came in at US$1.22, far above analysts' expectation of US$0.29 and up from US$0.89 in the previous quarter.

On broader metrics, SanDisk's Q1 operating profit surged 878% quarter over quarter, while non-GAAP operating profit of about US$245 million rose a sharp 145% from the prior quarter. Q1 net profit was about US$112 million, versus a loss of US$23 million in the prior quarter, and non-GAAP net profit was about US$181 million, up 331% sequentially. Q1 gross margin was about 29.8%, up from 26.2% in the previous quarter.
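As a sanity check on these sequential figures (illustrative arithmetic only, using the percentages quoted above), the reported quarter-over-quarter growth rates imply the prior quarter's non-GAAP baselines:

```python
# Back out the implied prior-quarter figure from a reported
# quarter-over-quarter percentage change (illustrative only).

def prior_from_growth(current, pct_growth):
    """Solve current = prior * (1 + pct_growth/100) for prior."""
    return current / (1 + pct_growth / 100)

non_gaap_op = 245.0   # Q1 non-GAAP operating profit, US$M (from the text)
non_gaap_net = 181.0  # Q1 non-GAAP net profit, US$M (from the text)

print(round(prior_from_growth(non_gaap_op, 145)))   # implies ~US$100M prior quarter
print(round(prior_from_growth(non_gaap_net, 331)))  # implies ~US$42M prior quarter
```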


SanDisk's data center revenue grew 26% quarter over quarter in Q1, driven mainly by its current NAND supply qualifications with two hyperscale data center operators; the company plans to add a third hyperscaler and a large storage OEM in 2026, and is also in in-depth negotiations with five more large hyperscale data center operators.

As for the guidance that global investors focus on, SanDisk management expects fiscal Q2 2026 revenue of US$2.55 billion to US$2.65 billion, above the Wall Street consensus of about US$2.36 billion, and non-GAAP adjusted earnings per share of US$3.00 to US$3.40, far above the analyst consensus of US$1.82. The better-than-expected outlook underscores that the AI wave is pushing real demand and prices for high-capacity enterprise-grade SSDs into a new upcycle.
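Taking the midpoints of those guidance ranges (illustrative arithmetic only, using the figures quoted above) shows just how far guidance sits above consensus:

```python
# Guidance midpoint vs. Wall Street consensus, per the figures in the text.

def pct_above(midpoint, consensus):
    """Percentage by which a guidance midpoint exceeds consensus."""
    return (midpoint / consensus - 1) * 100

rev_mid = (2.55 + 2.65) / 2   # revenue midpoint, US$B -> 2.60
eps_mid = (3.00 + 3.40) / 2   # EPS midpoint, US$ -> 3.20

print(round(pct_above(rev_mid, 2.36), 1))  # revenue guidance ~10% above consensus
print(round(pct_above(eps_mid, 1.82), 1))  # EPS guidance ~76% above consensus
```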

On Wall Street, bullish sentiment toward the storage giants is heating up. Analysts generally believe the AI bubble is still in its earliest formative stage, far from the frightening “moment when the bubble bursts,” and that until then the DRAM and NAND storage giants will be among the biggest beneficiaries of the AI wave, with their “bull market” far from over. Morgan Stanley recently reaffirmed its “overweight” rating on SanDisk shares while drastically raising its 12-month price target from US$95 to US$230. Bank of America, another major Wall Street bank, also maintained its buy rating and sharply raised its price target from US$125 to US$230. SanDisk closed Thursday's US session at US$207.69.

Enterprise-grade SSDs — one of the biggest beneficiaries of the AI wave

Looking at the product and customer pipeline, SanDisk's push into the AI/data center eSSD (enterprise SSD) segment is becoming more aggressive and has begun translating into actual orders and revenue growth. On the data center side, the company emphasized in its results report that it is advancing multiple hyperscale cloud and storage OEM certifications for its UltraQLC high-capacity enterprise SSDs, including qualification for Nvidia's flagship AI GPU platform, the NVIDIA GB300, along with eSSD qualification tests with a number of hyperscale data center operators.

The 256TB UltraQLC NVMe enterprise SSD that SanDisk unveiled at FMS in August 2025 is officially positioned as purpose-built for data-intensive AI workloads such as data ingestion, preparation, and AI data lakes; aimed at hyperscale cloud and high-capacity scenarios, it can substantially optimize TCO (total cost of ownership).

The core logic behind enterprise SSDs benefiting from hyperscale AI training/inference is that, across the dimensions of throughput, latency, capacity, energy efficiency, and cost, they occupy the sweet spot in the storage tier just below the AI GPU/AI ASIC compute cluster. Above them sits HBM/DRAM, expensive but ultra-fast; below sits HDD/object storage, cheap but too slow. The middle layer must be both large and fast while staying power-efficient, and that burden naturally falls on enterprise-grade NVMe SSDs.

With the best overall balance among core factors such as throughput, latency, energy efficiency, and cost, enterprise-grade NVMe SSDs have naturally become the “immediate storage layer” for AI GPU compute clusters, and the storage tier that AI training/inference and RAG workloads depend on most and that is growing fastest.
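The tiering trade-off above can be sketched as a toy cost-versus-latency selection. The latency and cost-per-GB figures below are my own rough order-of-magnitude assumptions for illustration, not vendor specifications:

```python
# Illustrative storage-tier trade-off: pick the cheapest tier whose
# typical read latency fits a workload's latency budget.
# Figures are rough order-of-magnitude assumptions, not vendor specs.

TIERS = [
    # (name, typical read latency in microseconds, rough $ per GB)
    ("HBM/DRAM", 0.1, 5.0),
    ("enterprise NVMe SSD", 100.0, 0.10),
    ("nearline HDD", 10_000.0, 0.015),
]

def cheapest_tier(latency_budget_us):
    """Cheapest tier whose typical latency fits within the budget."""
    fits = [t for t in TIERS if t[1] <= latency_budget_us]
    return min(fits, key=lambda t: t[2])[0] if fits else None

print(cheapest_tier(50))     # tight budget: only HBM/DRAM fits
print(cheapest_tier(1_000))  # millisecond-scale budget: NVMe SSD wins on cost
```

Under these assumed numbers, the NVMe SSD tier is the cheapest option for any workload that can tolerate sub-millisecond (but not sub-microsecond) latency, which is exactly the “big, fast, and efficient” middle layer the text describes.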

For example, in inference and multi-model serving, enterprise SSDs can sharply cut cold-start and model-switching costs. Many services need to load different models or large language model shards on demand, with SSDs acting as the “model warehouse”: the low-latency, high-concurrency reads of enterprise SSDs significantly reduce cold-boot time and prevent frequent weight loading from dragging down overall QPS. Furthermore, technical white papers and engineers' hands-on experience suggest that hosting a high-dimensional vector library on NVMe SSDs is the best compromise among cost, performance, and capacity; combined with software-defined storage, RAG latency and throughput can be optimized further.
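The “model warehouse” pattern can be sketched as an in-memory LRU cache backed by SSD reads. This is a hypothetical minimal sketch, not any vendor's API: `load_from_ssd` is a stand-in for whatever actually reads weights off the drive, and the point is that a cache miss costs one fast SSD read rather than a slow HDD/object-store fetch:

```python
from collections import OrderedDict

# Hypothetical sketch of an LRU "model warehouse": hot models stay in
# memory; on a miss, weights are (notionally) read from an NVMe SSD,
# which keeps cold-start latency low. load_from_ssd is a stand-in.

class ModelCache:
    def __init__(self, capacity, load_from_ssd):
        self.capacity = capacity
        self.load = load_from_ssd   # callable: model_id -> weights
        self.cache = OrderedDict()  # model_id -> weights, in LRU order

    def get(self, model_id):
        if model_id in self.cache:
            self.cache.move_to_end(model_id)  # mark as recently used
            return self.cache[model_id]
        weights = self.load(model_id)         # SSD read on cold start
        self.cache[model_id] = weights
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used
        return weights
```

In serving terms, the cache capacity is bounded by DRAM, while the SSD holds the full catalog of models, so switching to a rarely used model costs one SSD load instead of a full fetch from cold storage.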

According to a McKinsey research report, demand for high-density NAND enterprise SSDs from generative AI and very-large-parameter model training/inference will drive enterprise SSD bit shipments at a compound annual growth rate (CAGR) of 35%+ from 2024 to 2030. Within that, the bit-growth CAGR for AI inference and RAG scenarios may even exceed 100%, and the CAGR for AI-training-related enterprise SSD bits may reach 62%.
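To put those CAGR figures in cumulative terms (illustrative arithmetic only, assuming six compounding years from 2024 to 2030):

```python
# Cumulative growth implied by a CAGR over 2024-2030 (six compounding
# years): bits_2030 = bits_2024 * (1 + cagr) ** 6.

def cumulative_growth(cagr, years=6):
    """Total growth multiple implied by a constant annual growth rate."""
    return (1 + cagr) ** years

print(round(cumulative_growth(0.35), 1))  # 35% CAGR -> roughly 6x bits by 2030
print(round(cumulative_growth(0.62), 1))  # 62% CAGR -> roughly 18x
print(round(cumulative_growth(1.00), 1))  # 100% CAGR -> 64x
```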

“Storage” can be seen in all AI infrastructure projects, and the storage supercycle has already begun

Consider the US$500 billion “Stargate” AI infrastructure project and the nearly US$1 trillion in cumulative AI compute infrastructure agreements that OpenAI has signed: none of these super AI infrastructure projects can do without Nvidia's AI GPU compute clusters and enterprise-grade high-performance data center storage (centered on HBM storage systems, enterprise SSDs/HDDs, and server-grade DDR5).


In this unprecedented AI investment cycle centered on major AI model upgrades and AI data center expansion and construction, core AI compute component makers such as Nvidia are unquestionably the biggest winners, followed closely by the HBM-led high-end memory suppliers (SK Hynix, Samsung, Micron, etc.) and the enterprise high-performance storage vendors (nearline HDDs and data center SSDs) serving AI data centers. These two links drive the twin-engine “AI compute x storage” investment cycle: HBM storage systems form the first storage tier sitting right beside the AI GPU/AI ASIC compute cluster, while the enterprise HDD/SSD tier behind it, which absorbs the “torrent” of AI data, is another big winner of the AI infrastructure build-out frenzy.

In an unprecedented “AI compute race,” with the world accelerating the build-out of infrastructure tied to AI training/inference, Wall Street giants such as Morgan Stanley have been proclaiming a “storage supercycle,” and surging demand for enterprise storage drives has lifted the data storage giants. The stock prices of Seagate, SanDisk, and Western Digital have all risen by triple-digit percentages this year, dramatically outperforming the US and even global stock markets.

Morgan Stanley said in its research report that amid the unprecedented AI infrastructure frenzy, with large enterprises and government departments worldwide spending heavily to deploy AI, demand for the core memory chips tied to AI training/inference systems remains extremely hot, driving sharp growth in data center storage revenue across HBM storage systems, server-grade DDR5, and enterprise SSDs.

Recently, the news that most excited Wall Street analysts was undoubtedly the “US$500 billion of cumulative 2025-2026 data center revenue visibility” that Nvidia CEO Jensen Huang gave at the GTC conference in Washington at the end of October, that is, data center revenue accumulated over the next five quarters from the Blackwell and upcoming Rubin-architecture AI GPU product lines.

Demand for AI compute continues to explode worldwide, AI infrastructure projects led by the US government keep getting bigger, and tech giants keep pouring huge sums into large-scale data centers. For investors long bullish on Nvidia and the AI compute supply chain, this means that the “AI faith” sweeping the globe, and its super-catalytic effect on the stock prices of leading compute players, is far from over: they are betting that shares across the AI compute chain, led by players such as Nvidia, TSMC, Micron, SK Hynix, Seagate, and Western Digital, will continue to trace out a “bull market curve.”