AI, and in particular OpenAI's popular chatbot ChatGPT, reportedly consumes a significant amount of electricity. The New Yorker estimates that ChatGPT uses more than half a million kilowatt-hours of electricity each day to respond to roughly 200 million requests. To put this into perspective, the average US household consumes about 29 kilowatt-hours per day, which means ChatGPT uses more than 17,000 times as much electricity in a day as a typical household.
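As a sanity check, that comparison is simple division of the two reported figures. The short sketch below uses only the rounded numbers quoted in this article, not any underlying source data.

```python
# Back-of-envelope check of the figures above, using the rounded numbers
# quoted in this article rather than any underlying source data.
chatgpt_kwh_per_day = 500_000   # "over half a million kilowatt-hours" per day
household_kwh_per_day = 29      # average US household consumption per day

ratio = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT uses roughly {ratio:,.0f} times a household's daily electricity")
# prints roughly 17,241, consistent with the "over 17,000 times" figure
```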
If generative AI technology becomes more widely adopted, electricity consumption could climb even higher. For instance, if Google were to integrate generative AI into every search, the feature could consume up to 29 billion kilowatt-hours a year, more electricity than countries such as Kenya, Guatemala, and Croatia each use annually.
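For a sense of what that worst-case figure would mean per query, the sketch below back-calculates the implied energy per search. The assumed volume of 9 billion searches a day is an illustrative figure, not one reported in this article.

```python
# Hypothetical back-calculation of what the 29 billion kWh figure implies per query.
# The 9 billion searches per day is an assumed volume for illustration only;
# it does not come from this article.
annual_kwh = 29_000_000_000        # reported worst-case annual consumption
searches_per_day = 9_000_000_000   # assumed daily Google search volume
searches_per_year = searches_per_day * 365

wh_per_search = annual_kwh / searches_per_year * 1_000  # kWh -> Wh
print(f"Implied energy per AI-assisted search: about {wh_per_search:.1f} Wh")
# prints about 8.8 Wh per query under these assumptions
```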
According to Alex de Vries, a data scientist at the Dutch National Bank, "AI is just very energy-intensive. Every single one of these AI servers can already consume as much power as more than a dozen UK households combined. So the numbers add up really quickly." However, estimating the AI industry's exact electricity consumption is difficult because different AI models operate in very different ways and because of the lack of transparency from the Big Tech companies that drive the industry.
In a paper, de Vries estimated that by 2027 the entire AI industry could consume between 85 and 134 terawatt-hours annually. The calculation was based on figures released by Nvidia, which is often referred to as "the Cisco of the AI boom" and reportedly holds about 95 percent of the market for AI graphics processors. "You're talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027," de Vries said, a figure he considers significant.
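That half-percent claim can be roughly checked against an assumed global total. The sketch below uses a ballpark figure of about 27,000 terawatt-hours a year for world electricity consumption, which is an assumption rather than a number from this article.

```python
# Rough check of the "half a percent of global electricity consumption" claim.
# The global total is an assumed figure of about 27,000 TWh a year; it is not
# taken from this article.
ai_low_twh, ai_high_twh = 85, 134   # de Vries's projected 2027 range
global_twh = 27_000                 # assumed annual global electricity consumption

for ai_twh in (ai_low_twh, ai_high_twh):
    share = 100 * ai_twh / global_twh
    print(f"{ai_twh} TWh is about {share:.2f}% of an assumed {global_twh:,} TWh")
# prints roughly 0.31% and 0.50%, in line with the quote's upper end
```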
Some of the world's most electricity-hungry businesses, such as Samsung, pale in comparison. According to BI's calculations based on a report from Consumer Energy Solutions, Samsung uses almost 23 terawatt-hours a year, Google a little over 12, and Microsoft a bit more than 10 to run data centers, networks, and user devices. OpenAI has not yet responded to BI's request for comment.
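For scale, a quick sketch comparing de Vries's projected 2027 range with the company figures above, all treated as annual totals:

```python
# Puts de Vries's projected 2027 range next to the company figures cited above
# (all values in terawatt-hours per year, as reported in this article).
companies = {"Samsung": 23, "Google": 12, "Microsoft": 10}
ai_range_twh = (85, 134)

for name, twh in companies.items():
    low, high = (ai / twh for ai in ai_range_twh)
    print(f"Projected AI use is roughly {low:.0f}x to {high:.0f}x {name}'s {twh} TWh")
```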