OpenAI now at 2x Anthropic ARR, still bets on scaling

By Nat Rubio-Licht

Jan 19, 2026

9:00pm UTC

OpenAI just unleashed new data to dispel rumors of a teetering AI bubble.

On Sunday, the company said that its annual recurring revenue grew more than threefold in 2025, hitting $20 billion, up from $6 billion in 2024. Its compute, meanwhile, followed a similar curve, more than tripling from 0.6 gigawatts in 2024 to roughly 1.9 gigawatts in 2025.

In a blog post, OpenAI CFO Sarah Friar called these gains “never-before-seen growth at such scale,” noting that more compute would have opened the door for “faster customer adoption and monetization.”

  • Friar said in the blog post that the company’s 2026 goal was “practical adoption,” aiming to enable people to actually leverage all of the possible benefits of AI, particularly with “large and immediate” use cases in health, science, and enterprise.
  • Doing so is only possible with more compute, she said, and revenue is what “funds the next leap.”

Based on these numbers, OpenAI’s revenue is more than double that of rival startup Anthropic, which estimated in October that it would hit $9 billion in annualized revenue by the end of 2025. However, Anthropic’s growth trajectory is still steeper: its revenue grew ninefold from $1 billion in 2024, compared with OpenAI’s roughly 3.3x growth.
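For reference, the growth multiples behind that comparison follow directly from the figures reported above (Anthropic’s 2025 number is its own October estimate), as a quick arithmetic check shows:

```python
# Growth multiples implied by the reported revenue figures (in dollars).
openai_rev = {2024: 6e9, 2025: 20e9}      # OpenAI ARR
anthropic_rev = {2024: 1e9, 2025: 9e9}    # Anthropic annualized revenue (Oct. estimate)

openai_growth = openai_rev[2025] / openai_rev[2024]          # ~3.3x
anthropic_growth = anthropic_rev[2025] / anthropic_rev[2024] # 9.0x
revenue_ratio = openai_rev[2025] / anthropic_rev[2025]       # ~2.2x, i.e. "more than double"

print(f"OpenAI growth: {openai_growth:.1f}x")
print(f"Anthropic growth: {anthropic_growth:.1f}x")
print(f"OpenAI revenue vs. Anthropic: {revenue_ratio:.1f}x")
```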

The difference is that OpenAI plans to spend far more on building out AI infrastructure than Anthropic does. OpenAI has a total of $1.4 trillion in commitments over the next eight years to build AI data centers across dozens of deals with the likes of SoftBank, Oracle, Nvidia, Amazon, and more. Anthropic, meanwhile, primarily uses Amazon as an infrastructure partner and has a $50 billion deal with Fluidstack.

Though Anthropic may reach profitability first, OpenAI seems to believe it’s worth the risk.

“Securing world-class compute requires commitments made years in advance, and growth does not move in a perfectly smooth line,” Friar wrote. “At times, capacity leads usage. At other times, usage leads capacity.”

Our Deeper View

OpenAI’s returns on these eye-popping investments hinge entirely on adoption growth. But a big open question in the AI space right now is how much more developers and users can get out of ever-larger language models before returns start to diminish. That’s why some of AI’s most prominent figures, including Yann LeCun, Fei-Fei Li and Ilya Sutskever, are questioning the idea that the future of AI rests solely on scaling large language models, with several choosing to bet on world models for the next breakthrough.

Make no mistake: OpenAI is pairing revenue and compute in this report for a reason. By drawing a connection between the two, the CFO is signaling that the company still believes in scaling laws, and that more compute equals more revenue. Only time will tell who’s right.