Microsoft’s Maia 200 targets cheaper AI inference

Jan 26, 2026

4:00pm UTC

Microsoft is throwing its hat into the ring to help deal with the AI chip shortage.

On Monday, the tech giant announced the Maia 200, its next-generation AI inference accelerator. Microsoft said it is the most efficient inference system the company has deployed yet, achieving 30% better performance per dollar than its existing systems.

Microsoft noted that this chip is three times faster than Amazon’s third-generation Trainium chip across common AI workloads and beats Google’s latest TPU in certain performance metrics. The system, which is currently live in Azure’s US Central region data centers, is powering models across Microsoft’s AI work, including its Foundry projects, Copilot suite, and superintelligence team.

Keep in mind that Microsoft designs its Maia AI chips but doesn't actually manufacture them. Production is still handled by specialized semiconductor partners.

This new Maia chip comes as the crunch for AI chips becomes more pronounced. Microsoft isn’t the only Big Tech firm working on its own silicon.

  • At Re:Invent in Las Vegas this past December, Amazon unveiled Trainium 3, an upgrade to its Trainium AI chips, claiming it is four times faster and 40% more efficient than the previous generation. It also teased Trainium 4, promising it would be twice as energy-efficient.
  • Google, meanwhile, is seeking to share the love with its own TPUs, discussing deals with Anthropic and Meta after seeing massive success with the chips in training its successful Gemini 3 models.

With AI chip costs soaring and chipmakers struggling to keep up with demand, these moves suggest that companies not traditionally known as chip producers, like Microsoft, are looking to fill the void.

Our Deeper View

While Nvidia currently sits at the top of the AI chip food chain, raking in billions each quarter and holding its place as the world’s most valuable company, one company can’t do it all. Anyone selling AI chips right now is racing to meet the moment. By putting their own chips on the market, Microsoft, Google, and Amazon could help fill some of the gaps while also allowing AI developers to reduce their reliance on Nvidia.