Mistral AI has been making frequent moves recently. At the Cerebral Valley hackathon it announced a major update to its open-source base model, Mistral 7B v0.2. The updated model supports a 32K context window, drops sliding window attention, and raises the rope_theta parameter to 1e6. More notably, Mistral AI has partnered with Microsoft, which invested roughly US$16 million in the company, and released Mistral Large, a large language model that benchmarks directly against GPT-4, demonstrating both its competitiveness and its ambitions in the large language model field. All of this points to even fiercer competition in AI ahead.
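For reference, the three v0.2 changes surface directly in the model's configuration. Below is a minimal sketch using Hugging Face transformers; the repo ID is an assumption (the v0.2 base weights were initially distributed as a raw download, so substitute whichever Hub path mirrors them):

```python
# Minimal sketch: inspect the Mistral 7B v0.2 config with Hugging Face
# transformers. The repo ID below is an assumption (a community mirror);
# substitute whichever path actually holds the v0.2 base weights.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("mistral-community/Mistral-7B-v0.2")

print(config.max_position_embeddings)  # 32768 -- the new 32K context window
print(config.rope_theta)               # 1000000.0 -- RoPE base raised to 1e6
print(config.sliding_window)           # None -- sliding window attention removed
```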
Taken together, these moves show that Mistral AI is actively expanding its market presence and aiming for a leading position in the field of large language models. We will continue to follow Mistral AI's development and its performance in competition with other large language models.