Are There No Moats in AI?
This week, I tackled a tricky topic in business AI: moats.
I’ve called it “Part One” because it’s dedicated to the applications side (the so-called “wrappers”), to highlight how, now more than ever, it’s possible to gain momentum and build something quite valuable out of nowhere, with very low entry barriers.
Yet, that must be translated into vertical infrastructure, branding, and distribution to become a moat.
I’ve mapped it all out in the AI Competitive Moat Acid Test.

Back to the frontier models.
Or, put simply, the players like OpenAI, Anthropic, Meta, and Google, who need to keep pushing these AI models to the next level to maintain a competitive edge, and with it, a wide share of the AI market.
Many people are surprised when tech gets commoditized quickly.
But that’s a fact.
Yet that doesn’t mean you can’t build a competitive moat out of it; it just won’t rest on the core tech alone. Rather, it’s about how that core tech translates into the accelerated growth, distribution, and branding that let you lock in a market.
In “Non-Linear Competition,” I explained the axioms of competition in a tech-driven world.


That isn’t that different from other domains, yet it’s much faster in tech. So, we often witness these cycles in the industry over a short period.
Also, as the market matures, tech progress will hit a plateau, and when that happens, market lock-ins and winner-take-all effects will come in.
But in AI, it’ll take another decade to get there. And yet, if you are a frontier model player, that’s the game you picked.
In addition, when you look specifically at the frontier model market, you have to take a multi-faceted perspective and see it as the intertwining of three core elements of the whole ecosystem: hardware, cloud infrastructure, and the models themselves.

That’s also why, across the “frontier stack,” you will see fierce competition, massive capital expenditure, human capital poaching, and much more.
In short, outside pure GPU manufacturing, which would imply building up a fab, if you’re a frontier model player, you’ll need core strategic partnerships on the hardware side while working your way into the cloud infrastructure layer of the “AI foundational stack.”
This isn’t only a race to grab the most valuable part of the market in the future; without infrastructure, how do you serve a model in the first place, whether to billions of people or to enable trillions of the world’s tasks?
So that’s the key point.
For instance, in 2024, OpenAI’s business model revolved around three core pillars:
Consumer and B2B tools (ChatGPT free and premium).
API platform.
And enterprise integrations.
Yet, as OpenAI moves ahead, it must also extend into the infrastructure side.

Indeed, when you look at the “foundational ecosystem stack,” it has many moving parts.
Thus, it’s not just model development; that is only one facet of it. Indeed, when it comes to GPU counts, people often confuse the number of AI chips needed to train a model with the total number a frontier AI player must keep available for the “frontier pipeline” at any given time.
And even post-DeepSeek, this need isn’t shrinking; if anything, it’s putting even more pressure on these frontier AI companies to gather more, more advanced, and better-networked GPUs, as even a 1% improvement in margins can translate into a massive outcome for companies betting on AGI.
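To make this concrete, here’s a minimal back-of-envelope sketch in Python. All the figures are hypothetical placeholders (they don’t come from OpenAI or any other player); the point is only to illustrate why training chips are a fraction of the total fleet, and why a 1% margin gain matters at this scale.

```python
# Back-of-envelope sketch with purely hypothetical numbers: the GPUs tied up
# training one frontier model are only a slice of the fleet a frontier player
# must keep available for research and, above all, for serving users.

training_gpus = 25_000      # hypothetical: chips training the next frontier model
research_gpus = 15_000      # hypothetical: experiments, ablations, post-training
inference_gpus = 300_000    # hypothetical: serving hundreds of millions of users

total_fleet = training_gpus + research_gpus + inference_gpus
print(f"Training share of total fleet: {training_gpus / total_fleet:.1%}")  # ~7%

# Why a 1% margin improvement is a big deal, on a hypothetical $10B annual compute bill:
annual_compute_spend = 10_000_000_000
print(f"Value of a 1% efficiency gain: ${annual_compute_spend * 0.01:,.0f} per year")  # $100M
```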
Thus, here are three core points to touch upon:
The full “AI innovation pipeline.”
Another cost that is often overlooked, yet it’s the heaviest part of the whole thing for now: talent!
And, last but quite important, if you believe it’s only a few years away, the Value of AGI.