A new study warned that artificial intelligence technology could cause a significant surge in electricity consumption.

The paper, published in the journal Joule, examines the potential future energy consumption of AI systems, noting that generative AI technology relies on powerful servers and that increased use could drive a spike in demand for electricity.

The authors point to tech giant Google as one example, noting that AI accounted for only 10%-15% of the company's total electricity consumption in 2021.

But as AI technology continues to expand, Google's energy consumption could approach that of a small country.

Google headquarters in Mountain View, California. (Marlena Sloss/Bloomberg via Getty Images)

"The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption," the authors wrote.

They cautioned that such an example "assumes full-scale AI adoption utilizing current hardware and software, which is unlikely to happen rapidly."
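
For a rough sense of scale, here is a back-of-the-envelope sketch in Python of the arithmetic those figures imply. It assumes Google's total 2021 electricity use was roughly 18.3 TWh, a figure reported outside this article, and treats the 10%-15% AI share and the 29.3 TWh worst case as given.

# Rough back-of-the-envelope sketch of the scale implied by the study's figures.
# ASSUMPTION: the ~18.3 TWh total for Google in 2021 is not stated in this
# article and is used here only to illustrate the arithmetic.
google_total_2021_twh = 18.3               # assumed company-wide consumption, TWh
ai_share_low, ai_share_high = 0.10, 0.15   # AI's share of that total, per the study
worst_case_ai_twh = 29.3                   # worst-case AI-only use (about Ireland's)
ai_2021_low = google_total_2021_twh * ai_share_low     # roughly 1.8 TWh
ai_2021_high = google_total_2021_twh * ai_share_high   # roughly 2.7 TWh
print(f"Estimated 2021 AI-related use: {ai_2021_low:.1f}-{ai_2021_high:.1f} TWh")
print(f"Worst case is {worst_case_ai_twh / ai_2021_high:.0f}x to "
      f"{worst_case_ai_twh / ai_2021_low:.0f}x that estimate")

Even on these rough assumptions, the worst case implies an increase of roughly an order of magnitude over Google's estimated 2021 AI-related consumption.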

Christopher Alexander, the chief analytics officer of Pioneer Development Group, told Fox News Digital that the energy demands will be similar to those seen in the early days of Bitcoin mining, arguing developers will have to get creative about how they use resources.

"AI is very similar to Bitcoin mining. In both cases, processing power is used at very high intensity to solve problems. You cannot lessen the energy consumption, but you can mitigate it," Alexander said. "For example, alternative energy, like natural gas from oil drilling that is burned off rather than used is a major untapped energy source along with biogas from landfills."

Alexander likened the solution to when "kerosene was developed from waste," arguing that this is "another opportunity to develop cheap energy from flare gas and landfills that powers the future and makes the most of resources that would otherwise become pollutants."

This illustration shows the AI smartphone app ChatGPT surrounded by other AI apps. (Olivier Morin/AFP via Getty Images)

Phil Siegel, the founder of the Center for Advanced Preparedness and Threat Response Simulation (CAPTRS), told Fox News Digital that similar concerns are a feature of any growing technology, though he argued that improvements will likely come to help make energy consumption more efficient.

"Multiplayer gaming, social media and cryptocurrency have all gone through these phases. Early on, the technologies tend to be inefficient as the chips and algorithms are not optimized," Siegel said. 

"People extrapolate these inefficiencies to a larger scale. The bad news is that energy usage does increase somewhat. The good news is that as the new uses scale, the chips get better, the algorithms improve, the technology gets more creative, and it eventually lowers the amount of energy usage far below panic levels."

While the paper acknowledges that some of its scenarios are extreme and unlikely, it argues that it is important to temper "overly optimistic and overly pessimistic expectations" for the future, noting that "it is probably too optimistic to expect that improvements in hardware and software efficiencies will fully offset any long-term changes in AI-related electricity consumption."

Wind turbines at Duke Energy's Top of the World energy site are seen in Rolling Hills, Wyoming, on April 23, 2013. (AP Photo/Matthew Brown)

"These advancements can trigger a rebound effect whereby increasing efficiency leads to increased demand for AI, escalating rather than reducing total resource use," the paper's conclusion states. 

"The AI enthusiasm of 2022 and 2023 could be part of such a rebound effect, and this enthusiasm has put the AI server supply chain on track to deliver a more significant contribution to worldwide data center electricity consumption in the coming years."