NVIDIA has announced that Dell Technologies, Hewlett Packard Enterprise (HPE) and Lenovo will be the first to integrate NVIDIA Spectrum-X Ethernet networking technologies for AI into their server line-ups to help enterprise customers speed up Generative AI workloads.
Purpose-built for Generative AI, Spectrum-X offers enterprises a new class of Ethernet networking that can achieve 1.6x higher networking performance for AI communication versus traditional Ethernet offerings.
The new systems from three of the top system makers bring together Spectrum-X with NVIDIA Tensor Core GPUs, NVIDIA AI Enterprise software and NVIDIA AI Workbench software to provide enterprises with the building blocks to transform their businesses with Generative AI.
“Generative AI and accelerated computing are driving a generational transition as enterprises upgrade their data centres to serve these workloads,” said Jensen Huang, Founder and CEO of NVIDIA. “Accelerated networking is the catalyst for a new wave of systems from NVIDIA’s leading server manufacturer partners to speed the shift to the era of Generative AI.”
Antonio Neri, President and CEO, HPE, added: “Generative AI will undoubtedly drive innovation across multiple industries. These powerful new applications will require a fundamentally different architecture to support a variety of dynamic workloads. To enable customers to realise the full potential of Generative AI, HPE is partnering with NVIDIA to build systems with the required power, efficiency and scalability to support these applications.”
Yuanqing Yang, Chairman and CEO, Lenovo, said: “Generative AI can power unprecedented transformation but places unprecedented demands on enterprise infrastructure. Working closely with NVIDIA, Lenovo is building efficient, accelerated systems with the networking, computing and software needed to power modern AI applications.”
Networking purpose-built to accelerate AI
For peak AI workload efficiency, Spectrum-X combines the extreme performance of the Spectrum-4 Ethernet switch, the NVIDIA BlueField-3 SuperNIC (a new class of network accelerator for supercharging hyperscale AI workloads) and acceleration software. Spectrum-X complements BlueField-3 DPUs, one of the world's most advanced infrastructure computing platforms.
Spectrum-4 is the world’s first 51Tb/s Ethernet switch for AI, providing highly effective data throughput at scale and under load while minimising network congestion for multi-tenant AI cloud workloads. Its intelligent, fine-tuned routing technology enables maximum utilisation of network infrastructure at all times.
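As a back-of-envelope illustration of what that aggregate bandwidth implies, the sketch below divides switch capacity across common Ethernet port speeds. This is a simplification for illustration only: it assumes a 51.2Tb/s aggregate figure (the article rounds to 51Tb/s), and actual port counts depend on the specific switch configuration.

```python
# Assumed Spectrum-4-class aggregate switching bandwidth in Tb/s
# (the article cites ~51Tb/s; 51.2 is used here for round numbers).
AGG_TBPS = 51.2

def max_ports(port_gbps: float) -> int:
    """Upper bound on port count at a given speed, ignoring
    breakout, oversubscription and management overhead."""
    return int(AGG_TBPS * 1000 // port_gbps)

print(max_ports(400))  # 128 ports at 400Gb/s
print(max_ports(800))  # 64 ports at 800Gb/s
```

The same aggregate capacity can thus be carved into fewer, faster uplinks or more, slower server-facing links.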
BlueField-3 SuperNICs are designed for network-intensive, massively parallel computing, offering up to 400Gb/s RDMA over Converged Ethernet (RoCE) network connectivity between GPU servers and boosting performance for AI training and inference traffic on the east-west network inside the cluster. They also enable secure, multi-tenant data centre environments, ensuring deterministic and isolated performance between tenant jobs. Boasting a power-efficient, half-height, half-length PCIe form factor, BlueField-3 SuperNICs are ideal for enterprise-class servers.
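To give a feel for why 400Gb/s east-west links matter for training traffic, the sketch below estimates the time for a bandwidth-optimal ring all-reduce of gradient data. The payload size and server count are hypothetical, and the model deliberately ignores latency, protocol overhead and congestion; it is a rough illustration, not a performance claim.

```python
def allreduce_time_s(payload_bytes: float, n_nodes: int,
                     link_gbps: float = 400.0) -> float:
    """Idealised ring all-reduce: each node sends and receives
    2*(n-1)/n of the payload over its link."""
    bytes_on_wire = 2 * (n_nodes - 1) / n_nodes * payload_bytes
    link_bytes_per_s = link_gbps * 1e9 / 8  # Gb/s -> bytes/s
    return bytes_on_wire / link_bytes_per_s

# Hypothetical: 10 GB of gradients across 8 servers on 400Gb/s links
print(f"{allreduce_time_s(10e9, 8):.3f} s")  # 0.350 s
```

Halving the link speed in this model doubles the communication time, which is why inference and training clusters are sensitive to east-west bandwidth.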
NVIDIA AI Enterprise provides frameworks, pre-trained models and development tools for secure, stable and supported production AI. NVIDIA AI Workbench allows developers to quickly create, test and customise pre-trained Generative AI models on a PC or workstation – then scale them to virtually any data centre or cloud.