Amazon recently announced that it will spend US$110 million on artificial intelligence research, aiming to reduce its dependence on Nvidia and accelerate development of its in-house Trainium chips. The editor of Downcodes learned that the program, called "Build on Trainium", will focus on supporting university research into generative AI, giving researchers the opportunity to use Trainium chips to develop new AI architectures and machine learning libraries and to optimize the performance of large-scale distributed AWS Trainium UltraClusters.
The project spans algorithm innovation, AI accelerator performance improvement, and large-scale distributed systems research. Amazon has built a research UltraCluster containing up to 40,000 Trainium chips and has pledged to release the project's results as open source. In addition, Amazon plans to provide funding for new research and student education and to run multiple rounds of research awards.
Image source note: The image was generated by AI; image licensing provider: Midjourney.
AWS Trainium is a custom machine learning chip designed for deep learning training and inference. Amazon said this investment is another major move following its US$4 billion investment in Anthropic in August, demonstrating its determination to keep investing in AI.
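For readers curious what working with Trainium looks like in practice, below is a minimal sketch of a single training step targeting a Trainium NeuronCore through PyTorch/XLA, which is the interface the AWS Neuron SDK exposes for Trn1 instances. The model and data here are toy placeholders chosen for illustration, not anything from the Build on Trainium program, and the snippet assumes the torch-neuronx/torch-xla packages are installed on a Trainium instance.

```python
# Minimal sketch: one training step on a Trainium device via PyTorch/XLA
# (the path exposed by the AWS Neuron SDK on Trn1 instances).
# The model and batch are hypothetical placeholders for illustration.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

device = xm.xla_device()  # resolves to a NeuronCore when run on Trainium

model = nn.Linear(512, 10).to(device)          # toy model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(32, 512).to(device)       # synthetic batch
targets = torch.randint(0, 10, (32,)).to(device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
xm.optimizer_step(optimizer)  # steps the optimizer and triggers XLA graph execution
print(loss.item())
```

The same script runs unchanged on other XLA backends, which hints at why researchers can port existing PyTorch workloads to Trainium without rewriting their models from scratch.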
The "Build on Trainium" project has secured participation from Carnegie Mellon University's Catalyst research group. Todd C. Mowry, professor of computer science at the school, spoke highly of this and believed that the plan will significantly promote its efforts in tensor program compilation, machine learning parallelization, and language model services and tuning. research.
The project aims to promote the development of AI technology and reduce dependence on external chip suppliers. By open-sourcing research results and providing financial support, Amazon is committed to building a prosperous AI ecosystem.
This move not only gives Amazon's own AI development strong technical support, but also brings new opportunities and resources to the global AI research community, and further breakthroughs from Amazon in the AI field are worth watching. The editor of Downcodes will continue to follow the project's progress.