Google Cloud is leaning harder on Intel for the unglamorous but necessary side of AI infrastructure: CPUs, inference, and the plumbing that keeps data centers from becoming very expensive space heaters. The two companies said on Thursday that they are extending their multiyear AI chip partnership, with Google Cloud continuing to use Intel's Xeon line, including Xeon 6, while the pair keep co-developing custom infrastructure processing units (IPUs).

That is a practical bet, not a flashy one. GPUs still get the headlines because they train the models, but CPUs do plenty of the day-to-day work once those models have to run at scale. And as AI adoption spreads from demos to production systems, the bottleneck shifts from raw training horsepower to efficient serving and data-center management.

Xeon stays in Google Cloud’s AI stack

Google has used Intel’s Xeon processors for decades, so this is more evolution than romance. The new deal keeps that relationship alive for AI, cloud, and inference tasks, which is a neat way of saying Intel remains embedded in the infrastructure layer where efficiency matters as much as raw speed.

The companies are also broadening work on custom ASIC-based IPUs, which offload data center tasks from CPUs. That kind of chip specialization is exactly what hyperscalers have been chasing for years: every workload moved off a general-purpose processor is a little more margin, a little more capacity, and a little less dependence on the same scarce chips everyone else wants.

Why CPUs are suddenly the hot part of AI

  • GPUs are still the preferred silicon for training AI models.
  • CPUs are essential for running those models and for broader AI infrastructure.
  • Intel says the market needs balanced systems, not just more accelerators.

Intel chief executive Lip-Bu Tan framed the update as a response to a broader industry shift toward balanced systems, not accelerator-only builds. That messaging lines up with a simple reality: if everyone chases the same shiny chips, the old boring ones become strategic again.

The timing is no accident. Intel's push lands as the industry talks more openly about CPU shortages, and Arm recently introduced its own AI-focused CPU, a sign that even the company best known for licensing cores wants a bigger slice of the AI hardware story. Google and Intel are not trying to win the hype cycle here; they're trying to make sure the data center keeps running when the hype dies down.

Google Cloud and Intel extend their AI chip partnership

Intel declined to share pricing for the expanded arrangement, which is standard corporate theater and also a reminder that these infrastructure deals are as much about leverage as technology. The interesting question now is whether more cloud providers follow Google’s lead and lock in deeper CPU partnerships, or whether the current shortage just makes everyone hedge with more vendors and more custom silicon.

Source: TechCrunch
