A new tool developed by Hugging Face engineer Julien Delavande is shedding light on how much electricity is used every time someone interacts with an AI model. Each AI prompt, even something as simple as sending a thank you message, requires the model to run on powerful hardware that uses significant energy. As the use of AI grows, experts expect that the electricity needed to support these technologies will also rise sharply.
Delavande’s tool aims to make users more aware of the environmental impact of AI. Some companies have started adopting energy-hungry approaches to keep up with growing AI demand, raising concerns about sustainability. By providing real-time estimates of energy consumption, Delavande hopes his tool will encourage more responsible use of AI.
The tool works with Chat UI, an open-source interface compatible with models like Meta’s Llama 3.3 70B and Google’s Gemma 3. It shows the energy used for each message in watt-hours or joules and offers comparisons with household appliances such as microwaves and LED lights. For example, generating a standard email with Llama 3.3 70B uses about 0.1841 watt-hours, similar to running a microwave for a fraction of a second.
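The unit conversion and appliance comparison behind that figure are straightforward to reproduce. The sketch below uses the article's 0.1841 Wh figure; the 800 W microwave rating is an assumed typical value, not one given by the tool.

```python
# Convert a per-message energy readout between watt-hours and joules,
# and express it as equivalent appliance runtime.

WH_TO_JOULES = 3600.0  # 1 Wh = 3600 J by definition

def wh_to_joules(wh: float) -> float:
    return wh * WH_TO_JOULES

def equivalent_runtime_seconds(energy_wh: float, appliance_watts: float) -> float:
    # time (s) = energy (J) / power (W)
    return wh_to_joules(energy_wh) / appliance_watts

email_wh = 0.1841          # Llama 3.3 70B generating a standard email (from the article)
MICROWAVE_WATTS = 800.0    # assumed typical microwave power draw

print(f"{email_wh} Wh = {wh_to_joules(email_wh):.1f} J")
print(f"Microwave equivalent: {equivalent_runtime_seconds(email_wh, MICROWAVE_WATTS):.2f} s")
```

Running this shows the message's energy comes to roughly two thirds of a kilojoule, which an 800 W microwave would consume in under a second.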
While the energy numbers provided are estimates and not exact measurements, they help remind users that there is a cost to every AI interaction. Delavande and his co-creators believe that transparency about energy use is important for the open-source community. They hope that, in the future, energy scores for AI models might be displayed as clearly as nutrition labels on food, giving everyone the chance to make informed choices about their digital footprint.