Dell Technologies expanded its AI factory portfolio and deepened its ecosystem with a host of new servers, networking gear and integrated systems, along with tighter partnerships with Nvidia, AMD and large language model companies.

The upshot: Dell is prepping AI factories for more than just cloud deployments in a bet that on-premises and air-gapped implementations will be just as important.

The announcements, delivered at Dell Technologies World, are the next phase of the company's strategy to bring AI factories to enterprises with infrastructure, an open ecosystem and services, even as it also sells to hyperscalers. Dell said it has more than 3,000 AI factory customers following a big push in 2024.

Varun Chhabra, Senior Vice President of Infrastructure Solutions Group (ISG), said Dell's AI factory strategy and its various parts reflect the need to boost compute performance while lowering energy costs. "Our industry is facing a big challenge. GPU demand is skyrocketing, but energy capacity is struggling to keep pace," said Chhabra. "What we hear from customers most often as they talk about retrofitting their data centers is how do they get the latest GPUs and get help with cooling and energy bottlenecks."


Here's the lineup:

  • Dell PowerEdge XE9785 and XE9785L, servers featuring AMD Instinct MI350 Series AI GPUs with 8-way AMD Infinity Fabric interconnects, 288GB of HBM3E memory per GPU and optional liquid cooling. The systems deliver up to 35 times the performance of their MI300X-based predecessors, and Dell said it is supporting AMD's AI software stack.
  • Dell AI Factory built on Nvidia's stack, which couples Dell hardware, Nvidia AI Enterprise and managed services.
  • PowerEdge servers purpose-built for model training and fine-tuning. Dell PowerEdge XE9780/85/80L/85L servers can feature Intel or AMD CPUs, 8-way Nvidia HGX B300 GPUs, higher throughput and options for liquid or air cooling.
  • Dell PowerEdge XE7745 with RTX Pro 6000, available in July, which features up to eight Nvidia RTX Pro 6000 Blackwell Server Edition PCIe GPUs. The servers are also optimized for inferencing, acceleration and cluster networking.
  • Dell PowerEdge XE9712, which features Nvidia GB300 NVL72 and will add support for Nvidia Vera Rubin NVL144 and NVL576. This rack system is aimed at hyperscalers.


  • PowerEdge servers will run Google Gemini models as part of Google Distributed Cloud.
  • Dell AI Platform with Intel will include Gaudi 3 AI accelerators coupled with an open-source software stack.
  • Dell PowerCool Enclosed Rear Door Heat Exchanger with Dell Integrated Rack Controller. The company said the system can lower cooling energy by 60% and enable customers to deploy 16% more racks with the same power (a back-of-the-envelope check appears after this list). For maintenance, Dell offers hot-swappable fans, centralized monitoring and real-time insights.
  • A Dell AI Data Platform designed to speed up throughput with its Project Lightning parallel file system. The platform automates Iceberg table management, brings LLMs into SQL and streamlines data products and managed services. A version of the platform rides on Nvidia's models and software.
  • Dell AI Networking with low-power transceivers optimized for PowerEdge and PowerSwitch hardware.
  • Dell NativeEdge, which couples Nvidia GPUs with Dell's NativeEdge operating system for servers and endpoints. Dell offers low-power AI accelerators on its NativeEdge gateways and endpoints. The company also includes NativeEdge Blueprints for Nvidia, GE Digital and Palo Alto Networks, as well as discounted Nvidia AI Enterprise licenses.
  • Dell PowerSwitch SN5600 and SN2201 Ethernet switches, along with Nvidia Quantum InfiniBand switches.
  • Partnerships with software vendors and LLM players for on-premises AI factories. Models from Cohere, Google Cloud, Meta and Mistral AI are available, as is Red Hat's software stack.
  • Dell is also positioning its PCs as edge inferencing devices. To that end, the company launched new Dell Pro Max AI PCs, which feature neural processors from Qualcomm. The highest-end model can run inference on a 70-billion-parameter model (a rough sizing sketch follows this list).
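
To put that 70-billion-parameter claim in context, here is a minimal sizing sketch of the memory the model weights alone would require at different precisions. The parameter count comes from Dell's claim; the precision levels and the assumption that the model is quantized are illustrative, not Dell or Qualcomm specifications.

```python
# Rough sizing sketch: illustrative assumptions, not Dell or Qualcomm specs.
# Estimates memory for model weights only; KV cache and activations add more on top.

PARAMS_BILLION = 70  # parameter count cited for the top-end Dell Pro Max AI PC

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight storage in gigabytes at a given numeric precision."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1024**3

for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{weight_memory_gb(PARAMS_BILLION, bits):.0f} GB")

# 16-bit: ~130 GB, 8-bit: ~65 GB, 4-bit: ~33 GB. Quantizing weights to 4 bits (or
# lower) is what makes a 70B model plausible on a single high-memory edge device.
```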
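
The PowerCool numbers are similarly easy to sanity-check. The sketch below assumes a facility that starts at a power usage effectiveness (PUE) of about 1.3 with a fixed power envelope; both values are assumptions for illustration rather than Dell's published methodology, but with them a 60% cut in cooling energy frees roughly 16% more capacity for IT racks.

```python
# Back-of-the-envelope check of the PowerCool claim: assumed inputs, not Dell's methodology.

FACILITY_BUDGET_KW = 1300.0   # assumption: fixed facility power envelope
PUE_BEFORE = 1.3              # assumption: 0.3 W of cooling per 1 W of IT load
COOLING_CUT = 0.60            # Dell's stated reduction in cooling energy

it_before = FACILITY_BUDGET_KW / PUE_BEFORE                # IT load today: 1,000 kW
cooling_ratio_before = PUE_BEFORE - 1                      # 0.30 W of cooling per IT watt
cooling_ratio_after = cooling_ratio_before * (1 - COOLING_CUT)  # 0.12 after the upgrade

it_after = FACILITY_BUDGET_KW / (1 + cooling_ratio_after)  # IT load in the same envelope
print(f"Extra rack capacity: {it_after / it_before - 1:.0%}")  # -> 16%
```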