Downcodes editor reports: At the International Conference for High Performance Computing, Networking, Storage and Analysis (SC24), recently held in the United States, Dell unveiled a series of new products and services designed to help enterprises overcome the challenges of applying generative AI and large language models (LLMs). The offerings tackle the data quality, cost, energy consumption and sustainability issues enterprises face in AI projects, with the aim of helping them stay ahead in a highly competitive digital environment. Spanning hardware, software and services, the announcements reflect Dell's continued investment and innovation in AI infrastructure.
Analyst Dave Vellante pointed out that enterprises need to use their data effectively to keep an advantage in this competitive digital environment. Dell's new AI infrastructure, solutions and services are intended to meet that need, helping enterprises better process data, extract insights, automate tasks and improve processes.
At the conference, Dell introduced three AI-focused offerings: the PowerEdge XE7740 and PowerEdge XE9685L servers, along with upgrades to the Integrated Rack 5000 (IR5000) series. The PowerEdge XE7740 runs on Intel Xeon 6 processors, targets inference and fine-tuning, and provides eight front-facing PCIe slots for high-speed GPU interconnection; Chhabra described it as the best platform for enterprises beginning to adopt AI.
The PowerEdge XE9685L, meanwhile, supports up to 96 NVIDIA Tensor Core H200 or B200 GPUs per rack for high-density computing, making it especially well suited to AI training. Dell also launched a new plug-and-play offering, the Integrated Rack 5000 series, which supports both liquid and air cooling, can accommodate 96 GPUs, and will be compatible with the upcoming NVIDIA GB200 Grace Blackwell NVL4 Superchip.
In addition, the latest version of the Dell Data Lakehouse supports unified access control between the Trino query engine and the Apache Spark processing system, making data processing more efficient. Dell is also deepening its collaboration with NVIDIA on AI infrastructure, with plans to support NVIDIA Tensor Core GPUs, including the H200 SXM GPU, by the end of the year.
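For context, "unified access control" here means both engines see the same governed tables under the same permissions, so a workload can run in Trino or Spark without duplicating policies. The minimal sketch below illustrates that idea using the open-source Trino Python client and PySpark against a shared catalog; the host name, catalog, schema and table names are hypothetical and do not reflect Dell's actual product API.

```python
# Illustrative sketch only: reading the same governed lakehouse table from
# both Trino and Spark. Endpoint, catalog, schema and table names are made up.
import trino
from pyspark.sql import SparkSession

# Query the shared table through the Trino engine (interactive SQL analytics).
conn = trino.dbapi.connect(
    host="lakehouse.example.com",  # hypothetical lakehouse endpoint
    port=8080,
    user="analyst",
    catalog="lakehouse",
    schema="sales",
)
cur = conn.cursor()
cur.execute("SELECT region, SUM(revenue) FROM orders GROUP BY region")
print(cur.fetchall())

# Read the same table through Spark (batch processing or ML pipelines).
# Access decisions come from the shared catalog, not from either engine.
spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()
orders = spark.read.table("lakehouse.sales.orders")
orders.groupBy("region").sum("revenue").show()
```

The point of this pattern is that governance lives with the data rather than being configured separately in each engine, which is what makes cross-engine processing more efficient.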
Finally, Dell rolled out a set of new services, including data management services, sustainable data center services, AI networking design services, and ServiceNow Assist implementation services, aimed at helping enterprises achieve more efficient AI adoption and data utilization.
Overall, the new products and services Dell launched at SC24 demonstrate its strength in AI and its deep understanding of enterprise customers' needs. These innovations should help enterprises accelerate their AI transformation and gain a competitive edge in the wave of digitalization. We look forward to Dell delivering further breakthroughs in AI in the future.