Anthropic co-founder Jack Clark recently argued in his newsletter "Import AI" that the development of artificial intelligence is not slowing down but accelerating. Using OpenAI's o3 model as an example, he noted that AI progress no longer relies solely on scaling up model size; it is shifting toward a new approach that combines reinforcement learning with greater computing power, allowing models to "think while running." This marks a new stage in AI development, and the integration of traditional methods with emerging techniques is set to become a defining trend.
The latest news drawing attention in the field of artificial intelligence comes from Jack Clark, co-founder of Anthropic. In his newsletter, Import AI, he pushes back on claims that AI progress is slowing down, arguing that development is actually accelerating. Clark pointed to OpenAI's recently launched o3 model as evidence that AI still has substantial room for growth, but that the way this growth is achieved is changing.
Unlike traditional approaches, the o3 model does not drive progress simply by increasing model size; instead, it relies on reinforcement learning and more computing power applied while the model runs. This ability to "think while running" opens up new possibilities for further scaling. Clark predicts that this trend of combining traditional methods with emerging techniques will accelerate in 2025, as more companies pair large foundation models with new ways of applying compute to push AI forward.
However, Clark also pointed out a challenge behind this rapid development that cannot be ignored: computing cost. He noted that the high-compute version of o3 requires 170 times the computing power of the base version, that the base version already consumes more resources than o1, and that o1 itself uses more energy than GPT-4o. Because the resource requirements of these new systems vary from task to task, cost prediction is becoming increasingly complex. In the past, model overhead was determined mainly by model size and output length, but o3's flexibility makes costs much harder to predict.
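To make that unpredictability concrete, here is a minimal sketch in Python. The per-task base cost and the workload numbers are hypothetical placeholders; only the 170x ratio between the high-compute and base settings comes from Clark's figures. It shows how the same number of tasks can differ in total cost by orders of magnitude depending on how often the expensive mode is used.

```python
# Hypothetical base cost per task for the standard setting (arbitrary unit;
# actual pricing is not given in the source).
BASE_COST_PER_TASK = 1.0

# Ratio reported by Clark: the high-compute o3 setting uses ~170x the base compute.
HIGH_COMPUTE_MULTIPLIER = 170

def estimate_cost(num_tasks: int, high_compute_fraction: float) -> float:
    """Estimate total cost when a fraction of tasks runs in the high-compute mode."""
    high_tasks = num_tasks * high_compute_fraction
    base_tasks = num_tasks - high_tasks
    return (base_tasks * BASE_COST_PER_TASK
            + high_tasks * BASE_COST_PER_TASK * HIGH_COMPUTE_MULTIPLIER)

# The same 1,000-task workload swings from 1,000 to 17,900 cost units
# when just 10% of tasks trigger the 170x setting.
print(estimate_cost(1000, 0.0))   # 1000.0
print(estimate_cost(1000, 0.1))   # 17900.0
```

This is why cost estimates that only account for model size and output length break down once per-task compute becomes a variable the model itself effectively decides.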
Nonetheless, Clark firmly believes that the combination of traditional scaling and these new methods will make AI progress in 2025 more significant than before. His prediction also sparked discussion about Anthropic's own plans. Anthropic has not yet released a "reasoning" or "test-time" model to compete with OpenAI's o-series or Google's Gemini Flash Thinking. The company's flagship model, Opus 3.5, was temporarily shelved because its performance gains were insufficient. That work was not wasted, however: it played an important role in training the new Sonnet 3.5 model, which has become one of the most popular language models on the market.
Highlights:
AI development is not slowing down but accelerating, especially for models that combine traditional scaling with emerging compute methods.
The o3 model's ability to "think while running" opens up new possibilities for scaling and fuels the future development of AI.
Despite rapid progress, uncertainty about computing costs remains a challenge for future development.
All in all, Clark's analysis points to a new direction for AI development: while pursuing higher performance, the industry also needs to keep computing costs under control. Future AI development will place greater emphasis on balancing efficiency and cost, which will become one of the key factors driving progress in AI technology. Challenges remain, but the pace of innovation in the field shows no sign of stopping.