A recent study by the research nonprofit Epoch AI reevaluated ChatGPT's energy consumption and found it to be far lower than previous estimates. The finding corrects a common public misconception about ChatGPT's power draw and offers a fresh perspective for the AI industry: with current average consumption well below earlier figures, concerns about AI's electricity use look less urgent, which may ease the path to wider adoption of the technology.
According to the study, ChatGPT, OpenAI's chatbot platform, consumes much less energy than earlier estimates suggested. A widely cited earlier figure put the cost of answering a question at about 3 watt-hours of electricity, which Epoch AI's analysis considers an overestimate. The study finds that with OpenAI's latest default model, GPT-4o, the average query consumes only about 0.3 watt-hours, less than many household appliances use.
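To put the revised figure in perspective, here is a quick back-of-the-envelope comparison. The 3 Wh and 0.3 Wh per-query figures come from the article; the 20-queries-per-day usage pattern is an illustrative assumption, not something the study reports.

```python
# Compare annual energy use under the old and revised per-query estimates.
# Per-query figures are from the article; the usage pattern is assumed.

OLD_ESTIMATE_WH = 3.0   # earlier, widely cited estimate (Wh per query)
NEW_ESTIMATE_WH = 0.3   # Epoch AI's revised estimate for GPT-4o (Wh per query)
QUERIES_PER_DAY = 20    # hypothetical heavy-user usage pattern

def annual_kwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual energy in kWh for a given per-query cost and daily usage."""
    return wh_per_query * queries_per_day * 365 / 1000

print(f"Old estimate:     {annual_kwh(OLD_ESTIMATE_WH, QUERIES_PER_DAY):.1f} kWh/year")
print(f"Revised estimate: {annual_kwh(NEW_ESTIMATE_WH, QUERIES_PER_DAY):.1f} kWh/year")
```

Under these assumptions, even the old estimate works out to roughly 22 kWh per year for a heavy user, and the revised figure cuts that by a factor of ten, which is why the study argues the per-query cost is modest compared with everyday appliances.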
Joshua You, a data analyst at Epoch AI, said that long-standing concerns about AI energy consumption no longer accurately reflect the current situation. The earlier 3-watt-hour estimate rested largely on outdated research and on the assumption that OpenAI was running less efficient chips. You added that while the public has reasonable concerns about AI's future energy demands, there is little clear understanding of its present consumption.
However, You also admits that Epoch's 0.3-watt-hour figure remains an approximation, because OpenAI does not disclose the detailed data needed for a precise calculation. The analysis also excludes the energy cost of additional features such as image generation and input processing. Longer input queries, such as those with large attached files, may consume considerably more energy, You notes.
Despite today's low figures, You expects energy consumption to rise. As AI technology advances, training these models may require more energy, and future systems may take on more complex tasks and therefore draw more power.
Meanwhile, AI infrastructure is expanding rapidly, which will drive enormous electricity demand. A report from RAND, for example, projects that AI data centers may require nearly all of California's 2022 electricity capacity within the next two years, and that by 2030 training a single cutting-edge model could demand power equivalent to the output of eight nuclear reactors.
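To make the eight-reactor comparison concrete, a rough scale sketch follows. The eight-reactor figure comes from the cited report; the ~1 GW per-reactor output and the 90-day training run length are illustrative assumptions.

```python
# Rough scale check for the projections above. The eight-reactor
# comparison is from the cited report; the per-reactor output (~1 GW)
# and the 90-day run length are assumptions for illustration only.

TYPICAL_REACTOR_GW = 1.0   # assumed output of one large nuclear reactor
REACTORS = 8
TRAINING_DAYS = 90         # hypothetical length of a frontier training run

training_power_gw = TYPICAL_REACTOR_GW * REACTORS
training_energy_twh = training_power_gw * TRAINING_DAYS * 24 / 1000

print(f"Implied power demand: ~{training_power_gw:.0f} GW")
print(f"Energy over {TRAINING_DAYS} days: ~{training_energy_twh:.1f} TWh")
```

Under these assumptions, a single training run would draw on the order of 8 GW continuously and consume on the order of 17 TWh, which illustrates why the report treats frontier training as a grid-scale demand rather than a data-center-scale one.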
OpenAI and its investment partners plan to spend billions of dollars on new AI data center projects in the coming years. As the technology develops, industry attention has also shifted toward reasoning models, which are more capable at handling tasks but require more computing power and therefore more electricity.
For those concerned about their AI energy footprint, You suggests using the chatbot less frequently or choosing a model with lower computing requirements.
Key points:
ChatGPT's average energy consumption is about 0.3 watt-hours per query, far below the earlier estimate of 3 watt-hours.
Future growth in AI energy consumption is expected to come from technological advances and from models taking on more complex tasks.
OpenAI plans to invest heavily in expanding AI data centers in the coming years to meet growing electricity demand.
In short, the Epoch AI study gives us a clearer picture of ChatGPT's current energy consumption while reminding us that AI's energy demands may grow substantially in the future. As we benefit from advances in AI, we also need to pursue more energy-efficient AI technologies and infrastructure to achieve sustainable development.