The rapid development of artificial intelligence has created unprecedented computing demand and, with it, an enormous energy consumption challenge. Thomas Graham, co-founder of the optical computing company Lightmatter, predicts that by 2026, global artificial intelligence data centers will consume a staggering 40 gigawatts of power, equivalent to the consumption of eight New York Cities. This prediction has raised industry concerns about balancing artificial intelligence development against energy sustainability. This article examines the issue and analyzes the optical computing solution proposed by Lightmatter.
Recently, the sharp rise in artificial intelligence computing demand has attracted widespread attention in the industry. Thomas Graham, co-founder of the optical computing startup Lightmatter, said in an interview with Bloomberg that he expects multiple power-hungry artificial intelligence data centers to be built around the world by 2026, with combined electricity consumption as much as eight times that of New York City.
Image credit: AI-generated image, authorized image provider Midjourney
In the interview, Graham noted that technology companies such as Nvidia are expanding large computing facilities around the world to meet the demand for training large artificial intelligence models like GPT-4. As more AI models move into production, demand for computing power will keep rising. He pointed out that as artificial intelligence shifts from research and development to deployment, demand for large-scale computers will climb significantly, and he emphasized that demand for inference computing in particular is growing at an exponential rate.
Graham also discussed Lightmatter's innovative technology. The company focuses on developing optical computing chips: a technology that connects multiple processors on a single semiconductor die and replaces traditional network links with optical interconnects. These optical interconnects transmit data at lower energy cost and higher speed, making data center network architectures more efficient and economical.
He pointed out that at least a dozen new AI data centers are currently under construction or planned, each requiring about one gigawatt of power. By comparison, New York City's electricity demand is roughly five gigawatts, while global artificial intelligence data centers are expected to require 40 gigawatts in the future, equivalent to the consumption of eight New York Cities.
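The figures quoted above are internally consistent, which a quick back-of-the-envelope check confirms (the variable names below are illustrative, and "a dozen" is taken as 12):

```python
# Back-of-the-envelope check of the power figures cited in the article.
NYC_DEMAND_GW = 5        # approximate NYC electricity demand (article's figure)
PROJECTED_AI_GW = 40     # projected global AI data center demand
PER_CENTER_GW = 1        # power per new AI data center (article's figure)
CENTERS_ANNOUNCED = 12   # "at least a dozen" under construction or planned

nyc_equivalents = PROJECTED_AI_GW / NYC_DEMAND_GW        # 8.0, matching "eight New York Cities"
centers_for_total = PROJECTED_AI_GW / PER_CENTER_GW      # 40 one-gigawatt centers in total
announced_share = CENTERS_ANNOUNCED * PER_CENTER_GW      # 12 GW already accounted for

print(f"{nyc_equivalents:.0f} NYC equivalents; "
      f"{centers_for_total:.0f} centers needed; "
      f"{announced_share} GW announced so far")
```

The arithmetic also shows that the dozen announced centers cover only about 12 of the projected 40 gigawatts, implying substantially more construction than has been disclosed.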
Lightmatter recently raised $400 million in venture capital, valuing the company at $4.4 billion. Graham said the company will enter production within the next few years. He is confident about expanding artificial intelligence computing infrastructure, though he acknowledged that if new, more efficient algorithms for artificial intelligence computation emerge, they could affect the industry's investment in computing power.
The tension between the rapid development of artificial intelligence and its energy consumption has become increasingly pronounced. The energy-efficient computing technologies being explored by companies such as Lightmatter offer a promising approach to this problem. Going forward, efficient, energy-saving AI computing technology will be a key driver of industry development, and meeting this challenge will require the joint efforts of the entire industry.