At the SC24 international high-performance computing conference, held recently in Atlanta, Georgia, Dell released a series of new products and services for AI applications. The move aims to help enterprises overcome challenges such as data quality, cost, energy consumption and sustainability encountered when adopting generative AI and large language models (LLMs), thereby improving their competitiveness in the digital economy. The new products and services span servers, a data lakehouse and a range of supporting services, giving enterprises a comprehensive AI infrastructure offering.
At the International Conference for High Performance Computing, Networking, Storage and Analysis (SC24), held from November 17 to 22, Dell officially launched a series of new products and services designed to help enterprises overcome the challenges they encounter when adopting generative AI and large language models (LLMs). Varun Chhabra, senior vice president of infrastructure and telecom product marketing at Dell, said the main issues enterprises currently face when applying AI are data quality, cost, energy consumption and sustainability.
Analyst Dave Vellante pointed out that enterprises need to use their data effectively to maintain an advantage in a highly competitive digital environment. Dell's new AI infrastructure, solutions and services are intended to meet that need, helping enterprises better process data, extract insights, automate tasks and improve processes.
At the conference, Dell released new servers focused on AI workloads, the PowerEdge XE7740 and PowerEdge XE9685L, along with upgrades to the Integrated Rack 5000 (IR5000) series. The PowerEdge XE7740 is equipped with Intel Xeon 6 processors, is suited to inference and fine-tuning, and offers eight front-facing PCIe slots for high-speed GPU interconnect. Chhabra described it as the best platform for enterprises starting to adopt AI.
The PowerEdge XE9685L, by contrast, supports up to 96 NVIDIA H200 or B200 Tensor Core GPUs per rack to meet high-density computing needs, making it especially suitable for AI training. Dell also introduced a new plug-and-play offering, the Integrated Rack 5000 series, which supports both liquid and air cooling, can accommodate 96 GPUs, and will be compatible with the upcoming NVIDIA GB200 Grace Blackwell NVL4 Superchip.
In addition, the latest version of the Dell Data Lakehouse supports unified access control across the Trino query engine and the Apache Spark processing engine, making data processing more efficient. Dell is also deepening its collaboration with NVIDIA on AI infrastructure and plans to support NVIDIA Tensor Core GPUs, including the H200 SXM GPU, by the end of the year.
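In practical terms, unified access control means the same governed table can be reached from both engines under one set of permissions. The sketch below only illustrates that idea and is not Dell's documented API: the host, port, user, catalog, schema and table names are hypothetical placeholders, and the Spark side assumes the lakehouse catalog has already been registered with the Spark session.

```python
# Illustrative sketch: querying one governed lakehouse table from Trino and Spark.
# All endpoints and names below are hypothetical, not Dell-documented values.
import trino                            # pip install trino
from pyspark.sql import SparkSession    # pip install pyspark

LAKEHOUSE_HOST = "lakehouse.example.internal"            # hypothetical endpoint
CATALOG, SCHEMA, TABLE = "lakehouse", "sales", "orders"  # hypothetical names

# 1) Interactive SQL through the Trino query engine.
conn = trino.dbapi.connect(
    host=LAKEHOUSE_HOST, port=8080, user="analyst",
    catalog=CATALOG, schema=SCHEMA,
)
cur = conn.cursor()
cur.execute(f"SELECT region, sum(total) FROM {TABLE} GROUP BY region")
print(cur.fetchall())

# 2) Batch processing of the same table through Apache Spark,
#    assuming the lakehouse catalog is configured for this Spark session.
spark = SparkSession.builder.appName("lakehouse-batch").getOrCreate()
df = spark.table(f"{CATALOG}.{SCHEMA}.{TABLE}")
df.groupBy("region").sum("total").show()
```

Under a unified access-control layer, the permissions granted to the "analyst" user would apply consistently to both paths, rather than being configured separately per engine.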
Finally, Dell launched a series of new services, including data management services, sustainable data center services, AI networking design services, and implementation services for ServiceNow Now Assist, all designed to help enterprises apply AI and use their data more efficiently.
Highlights:
Dell launched new AI infrastructure at the SC24 conference to help enterprises address AI adoption challenges.
The new PowerEdge servers deliver powerful compute capabilities focused on the needs of AI workloads.
Dell introduced a range of services to support enterprises in achieving sustainable development and intelligent transformation.
All in all, the portfolio of new products and services Dell released demonstrates its determination and capability in helping enterprises with AI transformation, and provides strong support for applying AI efficiently and sustainably. These innovations should help companies better cope with the challenges of the AI era and stay ahead in fierce market competition.