Thinking Machines Lab, the AI startup founded by former OpenAI executive Mira Murati, has struck a deal worth several billion dollars with Google Cloud for infrastructure to train and deploy AI models. The agreement gives the company access to Google Cloud services, including systems built around Nvidia GB300 accelerators, and it looks like another sign that serious AI training has become a three-way race over model talent, chip access, and the ability to keep the servers fed.

The non-exclusive contract means Thinking Machines can still work with other infrastructure providers. That is the sensible bit. In AI, locking yourself to one cloud too early can be a very expensive way to learn about vendor risk.

Google Cloud gets a high-profile AI customer

Google has been pushing hard to turn its cloud business into the default home for AI teams, bundling compute with storage, Kubernetes, and Spanner. The Thinking Machines Lab deal fits that pitch neatly, and it arrives as cloud providers compete not just on raw capacity but on how quickly they can deliver specialized hardware to the hottest startups.

What Thinking Machines is buying

Google said it can support the startup’s reinforcement learning workloads, a compute-hungry approach that has helped major AI labs improve their frontier models. The company also said Thinking Machines is among the first Google Cloud customers to get access to Nvidia GB300-based systems, which Google says deliver up to a twofold speedup in training versus the previous generation.

  • Multi-billion-dollar Google Cloud agreement
  • Access to Google Cloud infrastructure for training and deployment
  • Systems based on Nvidia GB300 accelerators
  • Non-exclusive arrangement, so other providers remain in play

Murati’s startup keeps moving fast

Murati left OpenAI and launched Thinking Machines in February 2025. The startup later said it had raised $2 billion at a $12 billion valuation while keeping its product plans vague. One exception was Tinker, a tool announced in October that automates the creation of advanced AI models, which suggests the company is aiming for infrastructure-heavy, research-grade work rather than a consumer app with a shiny demo and a prayer.

The broader pattern is familiar: leading AI startups are spending aggressively on compute before they have fully explained the product roadmap. That makes these cloud deals look less like ordinary procurement and more like strategic positioning for the next round of model breakthroughs. Whether the payoff comes from better training, faster iteration, or simply being first to secure scarce hardware, the bill is already enormous.

Hardware access is the real test

For Google, landing Thinking Machines is useful proof that its latest AI stack is competitive. For Murati’s company, the bigger win is predictable access to the kind of infrastructure that can keep up with frontier model training. The open question is whether this becomes a template for more startups – or a reminder that only the best-funded teams can afford to play this game at all.

Source: 3dnews
