Amazon said it will invest another $2.75 billion into Anthropic to bring its total investment to $4 billion. The deal highlights the urgency of the generative AI arms race as hyperscalers create spheres of large language model influence.

Under the partnership, Anthropic uses AWS as its primary cloud provider and runs on AWS Trainium and Inferentia chips; in return, Anthropic gains distribution and AWS's heft behind its Claude models. AWS and Anthropic outlined their initial partnership in September, and AWS has now exercised its option to invest more in Anthropic.


Today, it's clear that enterprise cloud and software giants are teaming up with LLM specialists as fast as possible. It's an arms race, and it's why you'll need a chief AI officer to sort out your LLM strategy.

But the larger question is what happens when LLMs become commoditized. Of course, no one is thinking about that possibility yet, since the party is just starting. These foundation models will lose importance as the game shifts to customization with company-specific data.

Constellation Research's take

Dion Hinchcliffe:

"Ultimately, it's all about the data. If AI offerings can entangle themselves in their customers' data in a way that is beneficial for the customer, yet hard to leave, then it's a win. Commodity offerings won't matter as much when switching costs are high. Such switching costs involve data gravity, product skill switching, lost training time (weeks/months to train the new model on enterprise data), and especially a track record — or a lack thereof — of trust/privacy. AI is likely the new lock-in. Yes, this implies private LLMs are where the big money is, and that is likely where we'll end up. Commodity AI gets the public model market, hyperscale offerings get the enterprise data market. Use of public models with enterprise data is also another avenue for non-commodity offerings."