Sonar, an ultra-fast AI search model created by Cerebras Systems and Perplexity AI, has been officially unveiled, challenging the dominance of traditional search engines with a processing speed of 1,200 tokens per second. The Sonar model is built on Meta's Llama 3.3 70B and runs on Cerebras' dedicated AI chips, achieving unprecedented search speed and efficiency. The partnership marks a new era of AI-first search experiences and signals that competition in high-performance AI inference hardware will only intensify.
Cerebras Systems and Perplexity AI recently announced a partnership to launch a new ultra-fast AI search model, Sonar, aiming to challenge the dominance of traditional search engines. At the core of the collaboration is the Sonar model itself, which runs on Cerebras' dedicated AI chips at 1,200 tokens per second, making it one of the fastest AI search systems on the market.
The Sonar model is based on Meta's Llama 3.3 70B and marks a new AI-first search experience, with both companies pinning high hopes on its speed. "The collaboration with Cerebras is crucial to bringing Sonar to market. Cerebras' cutting-edge AI inference infrastructure has enabled us to achieve unprecedented speed and efficiency," said Denis Yarats, CTO of Perplexity.
The timing of the collaboration is notable, coming amid the buzz around Cerebras' recent DeepSeek deployment, which demonstrated inference speeds 57 times faster than traditional GPU solutions. Cerebras is rapidly emerging as a preferred provider of high-speed AI inference.
According to Perplexity's internal test results, Sonar significantly outperformed GPT-4o mini and Claude 3.5 Haiku on user satisfaction metrics and is comparable in accuracy to the more expensive Claude 3.5 Sonnet. Sonar scored 85.1 on factual accuracy, while GPT-4o scored 83.9 and Claude 3.5 Sonnet scored 75.8.
Cerebras CEO Andrew Feldman pointed out that dedicated hardware is becoming a new battleground as AI companies compete for market share. He believes technological progress will expand the market rather than shrink it. Industry analysts likewise suggest the collaboration could force traditional search providers and other AI companies to rethink their hardware strategies.
However, whether dedicated AI chips can match traditional GPU solutions in scalability and cost-effectiveness remains an open question. Although Cerebras has demonstrated clear speed advantages, convincing customers that the performance gains justify the potentially high prices remains a challenge.
For Perplexity, the collaboration with Cerebras helps it build competitiveness in the enterprise search market. Sonar will be available to Pro users first and will then expand to a wider user base. Although the two companies did not disclose the financial terms of the partnership, the move will undoubtedly intensify competition in the AI search field.
Link: https://sonar.perplexity.ai/
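For developers who want to experiment, the sketch below shows one way the Sonar model might be queried programmatically, assuming Perplexity exposes it through an OpenAI-compatible chat-completions API. The endpoint URL, model identifier, and environment variable name here are illustrative assumptions, not details confirmed in the announcement.

```python
# Minimal sketch: querying the Sonar model over an assumed
# OpenAI-compatible chat-completions endpoint from Perplexity.
# The endpoint URL, the "sonar" model name, and PERPLEXITY_API_KEY
# are assumptions for illustration only.
import os
import requests

API_URL = "https://api.perplexity.ai/chat/completions"  # assumed endpoint
API_KEY = os.environ["PERPLEXITY_API_KEY"]               # assumed env var

payload = {
    "model": "sonar",  # assumed model identifier
    "messages": [
        {"role": "system", "content": "Answer concisely and cite sources."},
        {"role": "user", "content": "What hardware does Cerebras build for AI inference?"},
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

# Print the model's answer from the first choice in the response.
print(response.json()["choices"][0]["message"]["content"])
```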
Key points:
Cerebras and Perplexity have jointly launched the Sonar model, which runs at 1,200 tokens per second and challenges traditional search engines.
Sonar surpasses several well-known AI models in user satisfaction and accuracy, demonstrating strong competitiveness.
The collaboration may push traditional search providers to rethink their hardware strategies and drive change in the enterprise search market.
The arrival of Sonar not only delivers a significant boost in search speed but also brings a new competitive landscape to the AI search field. Going forward, the use of dedicated AI chips in the search engine market will deserve continued attention, as their development may reshape the entire industry ecosystem.