OpenAI raised $122 billion this week. According to internal projections, that capital buys them roughly 18 months of runway. Let that sink in for a moment. The most well-funded startup in history can burn through $122B in a year and a half and still need to raise again 🤯.
That tells you something important about the economics of frontier AI. The smartest models in the world are also becoming extraordinarily expensive to build and run. And that economic reality may end up creating one of the biggest opportunities for startups in the entire AI ecosystem.
Meanwhile, Anthropic just told users the flat-rate subscription era is over. Third-party tools routing through Claude Pro and Max subscriptions will now require separate pay-as-you-go billing. The all-you-can-eat buffet is closed.
This is what I’ve been saying for months: frontier intelligence is becoming too expensive to offer at a flat rate. And paradoxically, that may accelerate the entire AI ecosystem.
Because when frontier models get expensive, three things happen at once.
Open-weight models improve rapidly (still need to get a lot better!).
Enterprises start diversifying across providers.
And the infrastructure around models becomes far more valuable.
That’s the real opportunity.
A 27B-parameter Qwen distill trained on Opus reasoning traces is now beating Claude Sonnet on SWE-bench while running locally on a $600 Mac Mini. That would have sounded absurd a year ago. That’s how fast the economics of AI are changing.
Hugging Face CEO Clement Delangue put it well: comparing open models to closed APIs is like comparing an engine to a full car. But if you put the scaffolding work in, open systems can outperform what the benchmarks suggest, or at least get close.
The constellation of models isn’t optional anymore. It’s economic survival. No enterprise is going to sit on a single provider’s pricing whims when open-weight alternatives are closing the gap this fast. Sure, enterprises will hit the easy button and use Claude and OpenAI, especially where SOTA is needed. But they will also (I know they already are) use other models for less mission-critical use cases where open-weight/open-source is pretty darn good.
And once enterprises run multiple models, a whole new layer becomes necessary: more scaffolding, more infrastructure, and more startups to make it all work.
The foundation model companies are building the engines. But the largest companies in this new stack won’t necessarily be the ones building the engines. They’ll be the ones building everything around them.
Orchestration. Routing. Security.
The tooling that turns raw model capability into enterprise-grade products.
And the infrastructure that lets enterprises run models wherever it makes the most sense.
That’s where the opportunity to build and invest will be…
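To make the routing layer concrete, here's a minimal sketch of what that infrastructure does: pick a model per request based on task criticality and cost. The model names, prices, and quality scores below are illustrative assumptions, not real quotes from any provider.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD, assumed for illustration
    quality: int               # rough capability tier, higher is better

# Hypothetical catalog: one frontier API model, one open-weight model.
CATALOG = [
    Model("frontier-sota", cost_per_1k_tokens=0.015, quality=10),
    Model("open-weight-27b", cost_per_1k_tokens=0.001, quality=7),
]

def route(task_criticality: int, min_quality: int = 5) -> Model:
    """Pick the cheapest model that clears the quality bar.

    Mission-critical tasks (criticality >= 8) always get the best
    model; everything else goes to the cheapest adequate one.
    """
    if task_criticality >= 8:
        return max(CATALOG, key=lambda m: m.quality)
    adequate = [m for m in CATALOG if m.quality >= min_quality]
    return min(adequate, key=lambda m: m.cost_per_1k_tokens)

# Mission-critical work goes frontier; routine summarization goes open weight.
print(route(9).name)  # frontier-sota
print(route(3).name)  # open-weight-27b
```

Real routing layers weigh latency, data residency, and per-tenant budgets too, which is exactly why this is a product category and not a weekend script.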
Switching gears…
🎙️Podcast alert - listen in!
VC: Investing at Inception in the Age of AI Agents on GTMnow
30 years of venture investing and I've never seen a market move this fast with so much uncertainty - which means huge opportunity! I sat down with Max Altschuler on the GTMnow podcast and we went deep on what I'm seeing across the portfolio and the market right now.
Going to point Chewbarka, my OpenClaw bot, at all this content to update the markdown files it keeps on me!
A few things we covered:
The 5 P's - my framework for evaluating founders at inception.
The 3 CH's - how I think about working with founders after the check clears.
The AI jet stream - there are only two kinds of companies in this world: those in the AI jet stream and those that aren’t.
The Clay story - $600K in year one, $4.6M in year two, then $30M, then $100M+. Years of patient iteration, low burn, and listening closely to where users were finding value. Shoutout to my boldstart ventures partner Eliot Durbin who's been there since early days and is on track to get a Clay tattoo???
Agent-native is the must-have bar - one of the first questions I ask founders now: how much of your company's code is written by agents?
The old playbooks are dead. Keep adapting. The opportunity has never been bigger.
Happy Easter 🐣 and Passover to those who celebrate!
As always, 🙏🏼 for reading and please share with your friends and colleagues!
Thanks for reading What's Hot 🔥 in Enterprise IT/VC! This post is public so feel free to share it.
#turns out that writing this newsletter for 7 years in a row is a treasure trove for my OpenClaw bot Chewbarka to write several markdown files on me - will tell you more in the future!
🤯Physical AI just leveled up. 500,000 hours of physical data training the foundation. Then 1 hour to master any new task. That’s not a robot learning - that’s a robot that already understands the physical world.
GEN-1 is the moment physical AI goes from demo to intelligence. The team hails from DeepMind, Boston Dynamics, and OpenAI. Super pumped to have backed this team from Inception.
#not digging into security this week (all I wrote last week is coming to fruition) but wow, the first couple of days were insane - net net, North Koreans created a fake company to compromise an open-source maintainer of a popular package 👇🏻
#🤔 he’s right - folks with GPUs and massive cash balances can subsidize token costs, and that creates a compounding advantage via the data flywheel
#🏈 fumbled “Only 3.3% of Microsoft 365 users who try Copilot pay for it, AI infrastructure costs are exploding, and some analysts think customers may eventually skip Microsoft entirely and go straight to OpenAI or Anthropic.”