Microsoft won’t stop buying AI chips from Nvidia, AMD, even after launching its own, Nadella says

AI Summary

TL;DR

Microsoft launched its own AI chip, Maia 200, but CEO Satya Nadella says the company will continue buying chips from Nvidia and AMD. The chip will be used by Microsoft's Superintelligence team for developing frontier AI models and will also support OpenAI's models on Azure.

Key Takeaways

  • Microsoft deployed its first homegrown AI chip, Maia 200, designed as an 'AI inference powerhouse' that outperforms competitors' chips.
  • Despite having its own chip, Microsoft will continue purchasing AI chips from Nvidia and AMD, maintaining partnerships with these vendors.
  • The Maia 200 chip will be used by Microsoft's Superintelligence team to develop the company's own frontier AI models, potentially reducing reliance on external model makers like OpenAI.
  • Cloud giants are developing their own AI chips due to supply constraints and high costs of obtaining the latest chips from Nvidia.
  • Microsoft's approach combines vertical integration with external partnerships, as stated by Nadella: 'Because we can vertically integrate doesn't mean we just only vertically integrate.'

Microsoft this week deployed the first crop of its homegrown AI chips in one of its data centers, with plans to roll out more in the coming months, the company says.

The chip, named the Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s optimized for the compute-intensive work of running AI models in production. The company released some impressive processing-speed specs for Maia, saying it outperforms Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).

All of the cloud giants are turning to their own AI chip designs in part because of the difficulty, and expense, of obtaining the latest and greatest from Nvidia — a supply crunch that shows no signs of abating.  

But even with its own state-of-the-art, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still be buying chips made by others. 

“We have a great partnership with Nvidia, with AMD. They are innovating. We are innovating,” he explained. “I think a lot of folks just talk about who’s ahead. Just remember, you have to be ahead for all time to come.” 

He added: “Because we can vertically integrate doesn’t mean we just only vertically integrate,” referring to building its own systems from top to bottom without using hardware from other vendors.

That said, Maia 200 will be used by Microsoft’s own so-called Superintelligence team, the AI specialists building the software giant’s own frontier models. That’s according to Mustafa Suleyman, the former Google DeepMind co-founder who now leads the team. Microsoft is working on its own models to perhaps one day lessen its reliance on OpenAI, Anthropic, and other model makers.


The Maia 200 chip will also support OpenAI’s models running on Microsoft’s Azure cloud platform, the company says. But, by all accounts, securing access to the most advanced AI hardware is still a challenge for everyone, paying customers and internal teams alike.

So in a post on X, Suleyman clearly relished sharing the news that his team gets first dibs. “It’s a big day,” he wrote when the chip launched. “Our Superintelligence team will be the first to use Maia 200 as we develop our frontier AI models.”
