This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

Nvidia's ability to stay ahead of the curve is fascinating. The company has dominated generative AI with GPUs that until recently were the only game in town for accelerated computing, but should a new buzzword turn up, CEO Jensen Huang will be riding that wave too.

The company reported fourth quarter revenue of $22.1 billion, up 265% from a year ago, with earnings of $4.93 a share. Non-GAAP earnings for the fourth quarter were $5.16 a share. Wall Street was looking for Nvidia to report fourth-quarter non-GAAP earnings of $4.64 a share on revenue of $20.62 billion. For a bit of perspective, Nvidia's quarterly revenue is approaching what it used to put up in a year: in fiscal 2023, Nvidia's revenue was $26.97 billion.

But it's worth noting how well Nvidia's previous bets have paid off--even the ones you probably already forgot about, like the company's networking business, which is now on a $13 billion annual revenue run rate. I started thinking about Nvidia's bets after checking out Arista Networks' results. Arista reported a strong quarter (as it usually does), and CEO Jayshree Ullal noted that generative AI will change networking infrastructure.

It's worth quoting Ullal at length just to outline the moving network parts with AI workloads. She said (emphasis mine):

"AI at scale needs Ethernet at scale. AI workloads cannot tolerate the delays in the network, because the job can only be completed after all flows are successfully delivered to the GPU clusters. All it takes is one culprit of worst-case link to throttle an entire AI workload.

Three improvements are being pioneered by Arista and the founding members of the Ultra Ethernet Consortium to improve job completion time. Number one, packet spraying. AI network topology needs packet spraying to allow every flow to simultaneously access all parts of the destination. Arista's developing multiple forms of load balancing dynamically with our customers.

Two is flexible ordering. Key to an AI job completion is the rapid and reliable bulk transfer with flexible ordering using Ethernet links to optimally balance AI-intensive operations, unlike the rigid ordering of InfiniBand. Arista is working closely with its leading vendors to achieve this.

Finally, network congestion. In AI networks, there's a common congestion problem whereby multiple uncoordinated senders can send traffic to the receivers simultaneously."
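
To make Ullal's first point concrete, here is a minimal back-of-the-envelope sketch. It is my own illustration, not Arista's or the Ultra Ethernet Consortium's actual algorithms, and every flow count, flow size and link speed in it is a made-up number. It just shows why job completion time is gated by the slowest flow, and why spraying packets across the fabric instead of pinning each flow to one link blunts the damage from a single bad link.

```python
# Toy model of Ullal's point (my numbers, not Arista's): a training step
# completes only when the SLOWEST flow finishes, so job completion time
# is the max over flows -- one bad link throttles the whole workload.

NUM_FLOWS = 64            # e.g. gradient exchanges feeding a GPU cluster
NUM_LINKS = 8             # parallel fabric links
FLOW_BYTES = 1e9          # 1 GB per flow (illustrative)
LINK_GBPS = [100.0] * NUM_LINKS
LINK_GBPS[3] = 10.0       # the one "culprit" worst-case link


def job_completion_time(bytes_per_link):
    """Seconds until the last link finishes draining its share."""
    return max(b * 8 / (gbps * 1e9) for b, gbps in zip(bytes_per_link, LINK_GBPS))


# ECMP-style hashing: each flow is pinned to a single link for its lifetime.
pinned = [0.0] * NUM_LINKS
for flow_id in range(NUM_FLOWS):
    pinned[flow_id % NUM_LINKS] += FLOW_BYTES

# Packet spraying with dynamic load balancing: every flow's packets are
# spread over all links in proportion to how fast each link can drain them.
total_gbps = sum(LINK_GBPS)
sprayed = [NUM_FLOWS * FLOW_BYTES * gbps / total_gbps for gbps in LINK_GBPS]

print(f"flows pinned to one link: job finishes in {job_completion_time(pinned):5.2f} s")
print(f"packets sprayed over all: job finishes in {job_completion_time(sprayed):5.2f} s")
```

Run it and the hash-pinned case is gated by the one degraded 10 Gbps link, while the sprayed case finishes roughly an order of magnitude sooner. That gap is the job-completion-time argument in miniature.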

Cisco CEO Chuck Robbins was also optimistic that AI will drive networking demand but noted that "we're still in the early stages of AI workloads."

Simply put, the AI networking pilots running today will be in production in 2025, and the network will be revamped in the near future. Companies like Cisco, Arista and Juniper Networks should benefit—unless Nvidia’s networking business gets a piece of the AI pie.

For instance, Arista said it won a handful of big deals, but one customer decided to stick with InfiniBand.

Remember InfiniBand, which was championed by Mellanox? Well, Mellanox was acquired by Nvidia in 2020 for $7 billion. The deal was announced in 2019. Here's what Huang said when the Mellanox deal closed: "With Mellanox, the new NVIDIA has end-to-end technologies from AI computing to networking, full-stack offerings from processors to software, and significant scale to advance next-generation data centers."

Beyond its InfiniBand routers, gateways and switches, Nvidia also offers Ethernet gear, DPUs (data processing units) under its BlueField brand, and a broader set of networking hardware and software aimed at AI workloads.

Here's the big question: Will customers take the path of least resistance with AI-driven networking upgrades and an Nvidia bundle? Or will they stick with networking specialists?

Today, Nvidia's Mellanox purchase probably wouldn't be approved by regulators; the argument would be that it hands Nvidia a full-stack AI offering. Can you imagine if Nvidia had bought Arm like it wanted to?

Nvidia's networking was also a hot topic on Cisco's earnings call. Cisco and Nvidia have partnered on integrated AI systems, but analysts wondered whether Cisco's networking technology would actually be used in those systems. Robbins said the deal "would include our Ethernet technology with their GPUs" when connecting multiple clusters, rather than Nvidia’s networking stack.

Robbins added that Nvidia is hoping to gain from Cisco's channel and go-to-market scale. Analysts were clearly worried about Cisco being threatened by Nvidia's Spectrum-X networking platform. Nvidia's networking business appears to be in co-opetition with established networking vendors. Chalk it up as another bet that paid off for Nvidia.

Huang talked a bit about Nvidia's networking business, notably Spectrum-X. Spectrum-X Ethernet has adaptive routing, congestion control, noise isolation and traffic isolation. Huang said Spectrum-X Ethernet will be an AI-optimized system and InfiniBand will be an AI-dedicated networking option. Simply put, Nvidia is going to get some of the AI-optimized networking pie too. Huang said on Nvidia's conference call:

"We're ramping Spectrum-X. We're doing incredibly well with Spectrum-X. It's our brand-new product into the world of ethernet. InfiniBand is the standard for AI-dedicated systems. Ethernet with Spectrum-X --Ethernet is just not a very good scale-out system.

But with Spectrum-X, we've augmented, layered on top of ethernet, fundamental new capabilities like adaptive routing, congestion control, noise isolation or traffic isolation, so that we could optimize ethernet for AI. And so InfiniBand will be our AI-dedicated infrastructure."
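
The congestion-control piece Huang mentions is the easiest to picture with a toy incast scenario. The sketch below is purely illustrative and assumes nothing about how Spectrum-X actually paces traffic; it just shows why many uncoordinated senders bursting into one receiver overflow a switch buffer, while the same traffic paced to a fair share of the drain rate gets through cleanly. All of the sender counts, buffer depths and rates are invented for the example.

```python
# Toy incast model (illustrative only -- not Spectrum-X's actual mechanism):
# many uncoordinated senders burst into one receiver link at the same time.
# Without pacing the switch buffer overflows; with per-sender pacing the
# same traffic drains cleanly.

SENDERS = 32
BURST_PKTS = 100           # packets each sender wants to deliver
QUEUE_LIMIT = 1000         # switch buffer depth, in packets
DRAIN_PER_TICK = 400       # packets the receiver link drains per tick


def run(paced: bool) -> int:
    """Simulate the burst and return how many packets get dropped."""
    queue, dropped = 0, 0
    # Unpaced: every sender dumps its whole burst immediately.
    # Paced: every sender sends only its fair share of the drain rate per tick.
    per_tick = DRAIN_PER_TICK // SENDERS if paced else BURST_PKTS
    remaining = [BURST_PKTS] * SENDERS
    while any(remaining) or queue:
        for i in range(SENDERS):
            send = min(per_tick, remaining[i])
            remaining[i] -= send
            space = QUEUE_LIMIT - queue
            queue += min(send, space)
            dropped += max(0, send - space)   # buffer overflow -> packet loss
        queue = max(0, queue - DRAIN_PER_TICK)
    return dropped


print(f"uncoordinated burst: {run(paced=False)} packets dropped")
print(f"paced senders      : {run(paced=True)} packets dropped")
```

In the unpaced run most of the burst is dropped and, in a real fabric, would have to be retransmitted, stretching job completion time; the paced run delivers everything. Adaptive routing, congestion control and isolation are different levers aimed at that same tail.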

Nvidia is too often in the right place at the right time for it to be coincidence. Is it luck? A crystal ball? Time travel? Who knows, but consider:

  • Nvidia was a blockchain darling for a bit as demand for GPUs surged for cryptocurrency mining.
  • The metaverse--which was huge, then dead and likely to be huge again--is also powered by Nvidia.
  • Digital twins? Yep, Nvidia.
  • Growth of the video game industry. Nvidia again.
  • Robotics. Nvidia.
  • Automobiles that will ultimately be autonomous. Huang can riff on the auto industry for days.

I could go on, but you get the idea. What's next? Quantum computing. Nvidia is building a hybrid quantum computing stack anchored by its accelerated computing platform. Nvidia sees a world where GPUs and QPUs (quantum processing units) work together. Rest assured Nvidia will ride the QPU wave too.