The battle for computing power heats up: Google's open-source strategy and Meta's ecosystem take aim at Nvidia's CUDA monopoly

Zhitongcaijing · 3d ago

The Zhitong Finance App learned that, according to reports, Google (GOOGL.US) is trying to weaken Nvidia (NVDA.US)'s advantage built on the CUDA software platform and has received some support from Meta (META.US).

According to people familiar with the matter, the search giant is trying to make its own AI chip, the TPU, run the PyTorch artificial intelligence framework more smoothly. Notably, since PyTorch was officially released in 2016, the framework has been deeply tied to Nvidia's CUDA platform, and Meta is the key creator of this open-source ecosystem.

The source said Google is still weighing whether to open source part of the code to boost customer adoption. Internally, the company has named the project TorchTPU and is putting more resources behind it.

Google has partnered with Meta to enable PyTorch to run natively on Google's TPUs at scale, without performance loss. That would mean developers no longer need to rewrite code to use Google's chips, sharply lowering the barrier to migrating from Nvidia to the Google camp.
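To illustrate why this matters, here is a minimal, hypothetical sketch (not Google's actual TorchTPU code) of how device-agnostic PyTorch works today: the same model code can target Nvidia GPUs via CUDA or TPUs via the separate PyTorch/XLA package, and a native TPU backend would simply become another branch of this selection. The `torch_xla` import path and the helper name `pick_device` are assumptions for illustration.

```python
def pick_device():
    """Return a device identifier for the best available PyTorch backend.

    Model code like `model.to(pick_device())` stays unchanged regardless of
    which branch fires -- that is the "no rewrite needed" property the
    Google/Meta effort aims for.
    """
    try:
        # TPU path: assumes the torch_xla (PyTorch/XLA) package is installed.
        import torch_xla.core.xla_model as xm
        return str(xm.xla_device())
    except ImportError:
        pass
    try:
        # Nvidia path: CUDA-enabled PyTorch build.
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    # Fallback when no accelerator stack is present.
    return "cpu"
```

On a machine with neither `torch_xla` nor a CUDA-enabled GPU, the helper falls back to `"cpu"`; the point is that the calling code never changes.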

It's worth mentioning that Meta, as one of Nvidia's biggest customers, is actively seeking alternatives to address the high price and supply bottlenecks of Nvidia chips.

It was reported last month that Meta is negotiating a multi-billion-dollar deal with Google to rent Google Cloud TPUs starting in 2026, and may purchase the chips directly for its own data centers in 2027 to reduce its dependence on Nvidia GPUs.

In this technical collaboration, Meta supplies the application ecosystem (PyTorch) and Google supplies the underlying compute hardware (TPU); together, the two sides are building a fast lane that bypasses CUDA at the software layer.

On the other hand, Nvidia's lead in AI is hard to shake not only because its GPUs are powerful, but because the CUDA software platform has become the de facto "standard language" of AI development.

Meta and Nvidia did not immediately respond to requests for comment.

Meanwhile, a Google spokesperson confirmed the plan and indicated that the relevant news had already been publicly announced in October.

"Google Cloud is committed to giving customers a full range of choices, from models and accelerators to frameworks and tools," the spokesperson said via email. "PyTorch is extremely popular, and our goal is to make the experience on Google Cloud TPUs seamless. We are seeing significant, accelerating growth in demand for both TPU and GPU infrastructure. Our focus is on giving developers the flexibility and scale they need, whatever hardware they choose."

At Wednesday's close, Google and Nvidia shares had each fallen more than 3%, while Meta shares fell more than 1%.