IBM has released version 3.1 of its open source Granite language models, bringing notable improvements in architecture and performance. Granite 3.1 handles text and code more capably than previous versions, supports a context window of up to 128,000 tokens, and covers 12 natural languages and 116 programming languages. The models perform well on tasks such as information extraction, document summarization, and question answering grounded in external data, giving developers more powerful tools for a wide range of applications.
IBM recently announced the release of Granite 3.1, an update that brings a number of important improvements. The models have been redesigned with a denser architecture and can process up to 128,000 tokens at a time, significantly strengthening Granite's ability to handle long, complex text and tasks.
The Granite 3.1 models were trained on data spanning 12 natural languages and 116 programming languages, amounting to 12 trillion tokens in total. This broader training improves language understanding and generation across a wider range of use cases. IBM says the new models are particularly strong at answering questions grounded in external data (retrieval-augmented generation, or RAG), extracting information from unstructured text, and summarizing documents.
Developers can now access these models through the Hugging Face platform. Granite was originally launched in May 2024, and this update marks IBM's continued progress in open source artificial intelligence.
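As a rough sketch of how a developer might try one of these models with the Hugging Face transformers library: the exact model id is an assumption here (a variant such as "ibm-granite/granite-3.1-8b-instruct" is taken as an example; the collection linked below lists the actual available models).

```python
# Minimal sketch: loading a Granite 3.1 instruct model from Hugging Face
# and asking it to summarize a document. The model id below is assumed;
# check the ibm-granite collection for the exact variant names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3.1-8b-instruct"  # assumed example id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a chat-style prompt; instruct models expect the chat template.
document = open("report.txt").read()  # hypothetical local file
messages = [
    {"role": "user", "content": "Summarize the following document:\n\n" + document},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a summary and strip the prompt tokens from the output.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same pattern applies to the RAG-style question answering IBM highlights: retrieved passages would simply be placed in the prompt ahead of the question.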
Granite is not only a technical improvement; it also gives developers and enterprises more flexible and capable tools for processing and analyzing data efficiently. As artificial intelligence technology continues to develop, the Granite models are positioned to play an important role in helping industries pursue digital transformation.
Through this series of improvements, IBM hopes to attract more developers to its open source community and jointly advance artificial intelligence technology. The release of Granite 3.1 is both a technological step forward and a contribution to future language model research.
Project page: https://huggingface.co/collections/ibm-granite/granite-31-language-models-6751dbbf2f3389bec5c6f02d
Highlights:
The redesigned Granite 3.1 models can handle up to 128,000 tokens at a time.
Training data covers 12 natural languages and 116 programming languages, totaling 12 trillion tokens.
Developers can access these powerful open source language models through the Hugging Face platform.
The release of Granite 3.1 demonstrates IBM's continued investment and technical strength in open source artificial intelligence. Its strong performance and broad language support give developers and enterprises worldwide more convenient and capable AI tools, and help advance the application of artificial intelligence technology. We look forward to further innovations and breakthroughs from the Granite models.