Dell Technologies Fuels Enterprise AI Innovation with Infrastructure, Solutions and Services

In This Article:

  • Advancements to the Dell AI Factory — from industry-first AI PCs to edge and data center enhancements — simplify and speed AI deployments for organizations of any size

  • Powerful AI infrastructure and solutions, backed by a broad partner ecosystem and global services, empower organizations to run applications ranging from building foundation models to deploying agentic AI

LAS VEGAS, May 19, 2025--(BUSINESS WIRE)--DELL TECHNOLOGIES WORLD-- Dell Technologies (NYSE: DELL), the world’s No. 1 provider of AI infrastructure,1 announces Dell AI Factory advancements, including powerful and energy-efficient AI infrastructure, integrated partner ecosystem solutions and professional services to drive simpler and faster AI deployments.

Why it matters

AI is now essential for businesses, with 75% of organizations saying AI is key to their strategy2 and 65% successfully moving AI projects into production.3 However, challenges like data quality, security concerns and high costs can slow progress.

The Dell AI Factory approach can be up to 62% more cost-effective for inferencing LLMs on-premises than the public cloud4 and helps organizations securely and easily deploy enterprise AI workloads at any scale. Dell offers the industry’s most comprehensive AI portfolio, designed for deployments across client devices, data centers, edge locations and clouds.5 More than 3,000 global customers across industries are accelerating their AI initiatives with the Dell AI Factory.6

Dell infrastructure advancements help organizations deploy and manage AI at any scale

Dell introduces end-to-end AI infrastructure to support everything from edge inferencing on an AI PC to managing massive enterprise AI workloads in the data center.

Dell Pro Max AI PC delivers industry’s first enterprise-grade discrete NPU in a mobile form factor7

The Dell Pro Max Plus laptop with the Qualcomm® AI 100 PC Inference Card is the world’s first mobile workstation with an enterprise-grade discrete NPU.8 It offers fast, secure on-device inferencing at the edge for large AI models that typically run in the cloud, such as today’s 109-billion-parameter models.

The Qualcomm AI 100 PC Inference Card features 32 AI cores and 64 GB of memory, delivering the performance AI engineers and data scientists need to deploy large models for edge inferencing.

Dell redefines AI cooling with innovations that reduce cooling energy costs by up to 60%9

The industry-first Dell PowerCool Enclosed Rear Door Heat Exchanger (eRDHx) is a Dell-engineered alternative to standard rear door heat exchangers. With a self-contained airflow system designed to capture 100% of the heat IT equipment generates, the eRDHx can reduce cooling energy costs by up to 60%10 compared with currently available solutions.