The Zhitong Finance App learned that CoreWeave (CRWV.US), a leader in cloud AI computing-power leasing often dubbed "Nvidia's son", became a focus of global stock markets on Monday. The "new cloud" company, which specializes in leasing AI computing power, announced that it has recently amended an important credit agreement to relax its liquidity-testing requirements. On the news, the company's shares rose as much as 5% in US pre-market trading. As of last Friday's US close, CoreWeave's market value was about $40 billion.
Summing up the change, the company stated in its latest filing: "This amendment to the DDTL 3.0 credit agreement brings the financing arrangement in line with the delivery schedule described by the parent company on its earnings call for the quarter ended September 30, 2025."
CoreWeave stated in the filing: "The First Amendment amends several financial covenants in the DDTL 3.0 credit agreement, including: (i) reducing the minimum liquidity required on each monthly payment date on and after March 1, 2026 to $100 million; and (ii) deferring the first test date of the debt-service-coverage financial covenant to October 31, 2027, and deferring the first test date of the contract-performance-ratio financial covenant from February 28, 2026 to October 31, 2027. The First Amendment also permits an unlimited number of equity cures for breaches of the debt-service-coverage and contract-performance-ratio covenants through October 28, 2026; thereafter, equity cures for these covenants may be used for at most three months in any period of four consecutive calendar months, and may not be used for more than three consecutive calendar months."
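The post-October-2026 equity-cure limits amount to a simple rule over a monthly calendar. A minimal sketch of that rule, with the function name and the boolean-per-month representation being illustrative assumptions rather than anything from the filing:

```python
def cures_compliant(monthly_cures):
    """Check the equity-cure limits described in the amendment for the
    period after October 28, 2026: at most three cures in any four
    consecutive calendar months, and no more than three consecutive
    months of cures.

    monthly_cures: list of bools, one per calendar month
                   (True = an equity cure was used that month).
    """
    # Rolling four-month window: no more than three cures per window.
    for i in range(len(monthly_cures) - 3):
        if sum(monthly_cures[i:i + 4]) > 3:
            return False
    # No run of four or more consecutive cure months (implied by the
    # window rule, checked explicitly for clarity).
    run = 0
    for used in monthly_cures:
        run = run + 1 if used else 0
        if run > 3:
            return False
    return True
```

For example, three cure months, one clean month, then three more cure months passes both tests, while four cure months in a row fails.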
CoreWeave's amended credit agreement (DDTL 3.0) essentially "buys time for delivery pacing and capital turnover, and reduces the probability of triggering a default in the short term." The company described the adjustments as matching the delivery timing discussed on its Q3 2025 earnings call.
Some Wall Street analysts commented on X that, for CoreWeave's fundamentals and valuation outlook, the move sends a signal of "short-term benefit, long-term differentiation." In the short term, lowering the liquidity threshold and delaying tests of key financial covenants significantly reduces the tail risk of a technical default or forced refinancing driven by repayments and cash burn in early 2026, easing market concerns about liquidity pressure and making it easy for the stock to draw sentiment-driven support over the short to medium term.
On the other hand, such an amendment also amounts to admitting to the market that the "new cloud" company needs looser covenant headroom during a period of heavy capital expenditure and delivery ramp-up amid the AI infrastructure wave. DDTL 3.0 itself finances purchases of AI infrastructure such as AI GPU computing clusters, and repayment becomes a hard cash-flow constraint from April 2026. If deliveries and repayments fall short of market expectations, future equity cures or refinancing could still bring dilution and volatility.
What exactly is CoreWeave, the company dubbed "Nvidia's son"?
As one of the earliest companies to lease out Nvidia AI GPUs from data centers via the cloud, CoreWeave seized the wave of demand for AI computing resources, winning backing from Nvidia's venture arm and even securing priority access to the heavily sought-after H100/H200 and Blackwell-series AI GPUs, which in turn led cloud giants such as Microsoft, Google, and Amazon to rent cloud AI computing power from CoreWeave.
In August 2024, CoreWeave became the first cloud computing provider to deploy the Nvidia H200 Tensor Core GPU, a high-performance AI GPU that lets it offer customers extremely powerful computing capability. Riding the AI wave, and especially since 2023, CoreWeave's standing in the cloud AI GPU computing market has grown rapidly thanks to large-scale procurement of high-end Nvidia AI GPUs (such as the H100/H200) and deep cooperation with Nvidia across the CUDA hardware-software ecosystem.
The hallmark of CoreWeave's AI cloud rental service is its focus on providing high-end AI GPU clusters (especially Nvidia GPUs) at scale, so that users can obtain high-performance GPU computing on demand in the cloud for AI workloads such as machine learning, deep learning, and inference. CoreWeave supports large-scale elastic deployment: users can quickly scale the number of GPUs up or down with project needs, which suits both AI model training (large language models, computer vision systems, and so on) and massive inference workloads requiring real-time processing. Beyond AI, CoreWeave's Nvidia GPU resources also serve traditional HPC scenarios such as scientific computing, molecular simulation, and financial risk analysis.
Global demand for AI computing power continues to explode, which is why the valuations of cloud computing-power leasing leaders such as Fluidstack and CoreWeave have kept expanding this year. Demand tied to AI training and inference has pushed underlying compute infrastructure clusters to their capacity limits; even the large AI data centers that have kept expanding recently cannot satisfy the extraordinarily strong global demand.
After Google launched the Gemini 3 AI application ecosystem in late November, these cutting-edge AI applications quickly caught on worldwide, driving an immediate surge in demand for Google's AI computing power. Upon release, the Gemini 3 series generated enormous AI token-processing volume, forcing Google to sharply curtail free access to Gemini 3 Pro and Nano Banana Pro and to impose temporary limits even on Pro subscribers. Combined with South Korea's recent export data showing persistently strong demand for HBM memory and enterprise-grade SSDs, this further confirms that "the AI boom is still in the early, supply-constrained stage of computing-infrastructure build-out."