Red Hat has released version 1.3 of RHEL AI, its enterprise Linux artificial intelligence platform. The update adds support for the IBM Granite large language model (LLM) and technology preview support for the Intel Gaudi 3 accelerator. The new version is designed to simplify the deployment and integration of AI models, reduce costs, and allow flexible deployment across hybrid cloud environments, expanding opportunities for Red Hat's service partners and system integrators to help enterprises explore and adopt artificial intelligence applications.
Joe Fernandez, vice president and general manager of Red Hat's AI business unit, said service partners and system integrators play an important role in helping customers realize different use cases. He noted that the growing sophistication of generative AI allows these partners to provide more cost-effective solutions. Red Hat aims to reduce costs by using smaller models, to simplify the work of integrating models with customer data and use cases, and to give customers the flexibility to deploy these models in hybrid environments.
RHEL AI 1.3 is designed for enterprises to develop, test, and run generative AI models. The update includes support for IBM's open-source-licensed Granite large language model and leverages open source technology for data preparation. Through the InstructLab model alignment project, co-developed with IBM, users can combine these components into packaged, bootable RHEL images for deployment on individual servers across the hybrid cloud.
The new version supports the Granite 3.0 8B model for English-language use cases and provides a developer preview through which users can try the model's non-English, code generation, and function calling capabilities; full support for these capabilities is planned for subsequent releases. In addition, RHEL AI now supports IBM Research's Docling open source project, which converts common document formats into Markdown, JSON, and other formats for use in generative AI applications and training. The new version also features context-aware chunking, which takes the structural and semantic elements of a document into account to improve the quality of generative AI responses.
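To illustrate how Docling fits into such a pipeline, the following is a minimal sketch using the upstream Docling Python package's documented quickstart API; the input file name and downstream use are assumptions for illustration, not details from the RHEL AI release.

```python
# Minimal sketch: convert a PDF into Markdown with the upstream Docling
# Python package so the text can feed a RAG or model-tuning pipeline.
# The input path is hypothetical; install with `pip install docling`.
from docling.document_converter import DocumentConverter

converter = DocumentConverter()
result = converter.convert("quarterly_report.pdf")  # hypothetical input file

# Export the parsed document as Markdown (a JSON representation is also available).
markdown_text = result.document.export_to_markdown()
with open("quarterly_report.md", "w", encoding="utf-8") as f:
    f.write(markdown_text)
```

The Markdown or JSON output can then be chunked and indexed for retrieval or used as training material, which is the workflow the Docling integration is meant to streamline.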
This release also adds technology preview support for the Intel Gaudi 3 accelerator, allowing users to process multiple inference requests in parallel in real time and to dynamically adjust LLM parameters during serving. Future RHEL AI versions will support more Docling document formats, integrate retrieval-augmented generation (RAG) pipelines, and add InstructLab knowledge tuning functions.
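To make the parallel-request idea concrete, the sketch below issues several chat completions concurrently against an OpenAI-compatible inference endpoint, such as the vLLM-based serving that ships with RHEL AI. The endpoint URL, model name, and prompts are placeholders, not values taken from the release notes.

```python
# Minimal sketch: send several chat requests concurrently to an
# OpenAI-compatible endpoint (for example, a locally served model).
# The URL, model name, and prompts below are placeholders.
from concurrent.futures import ThreadPoolExecutor

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # hypothetical local endpoint
    api_key="not-needed-for-local-serving",
)

prompts = [
    "Summarize the benefits of hybrid cloud deployment.",
    "List three uses of retrieval-augmented generation.",
    "Explain what a bootable OS image is.",
]

def ask(prompt: str) -> str:
    # One chat completion per prompt; the server handles batching internally.
    response = client.chat.completions.create(
        model="granite-3.0-8b-instruct",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Issue the requests in parallel from the client side.
with ThreadPoolExecutor(max_workers=len(prompts)) as pool:
    for answer in pool.map(ask, prompts):
        print(answer)
```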
All in all, the release of RHEL AI 1.3 marks an important step for Red Hat in enterprise AI. Its support for large language models and hardware acceleration gives enterprise users more powerful AI application capabilities and more flexible deployment choices, and the features planned for future versions will further consolidate its competitive position in the market.