Weekly Newsletter

18 March 2024

How “energy-hungry AI” is straining grids and spreading climate disinfo

Touted as a climate change solution, AI has been criticised for its vast energy demand, and its history of “turbocharging disinformation”.

Eve Thomas, 14 March 2024

Artificial Intelligence (AI) has previously been heralded as a potentially invaluable tool in the fight against climate change, but a new report by several green groups has found that it in fact poses “two significant and immediate dangers” – its vast and increasing energy requirements, and its propensity for “turbocharging disinformation”.

Industry excitement about AI has previously given rise to suggestions that AI tools could form part of the climate solution, for example by predicting energy generation and demand, optimising maintenance and enabling better management of assets across the industry.

However, speaking to Power Technology, Charlie Cray, a senior research specialist at Greenpeace, which was involved in the report, says: “While the potential for AI to contribute to climate solutions is just beginning, the challenges we outline are already happening and need to be addressed quickly.”

AI's insatiable energy demand will continue to grow

In particular, the report highlights AI’s “massive” energy requirements. It notes that a single AI search query requires 10 times the energy of a Google search, and points out that the International Energy Agency (IEA) estimates that electricity consumption by data centres – including those powering AI – could double by 2026 to more than 1,000TWh, roughly equivalent to Japan’s annual electricity consumption.
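
As a rough illustration of how these figures relate, the sketch below puts the report’s 10-times-per-query multiplier and the IEA’s 1,000TWh projection side by side. The ~0.3Wh estimate for a conventional Google search and the ~9 billion searches per day are outside assumptions used purely for illustration; they do not come from the report.

```python
# Back-of-envelope sketch of the energy figures cited above. The 10x
# per-query multiplier and the >1,000 TWh / Japan comparison come from
# the article; the ~0.3 Wh per conventional search and the ~9 billion
# searches per day are assumptions for illustration only.

WH_PER_SEARCH = 0.3        # assumed energy per conventional Google search (Wh)
AI_MULTIPLIER = 10         # report: an AI query needs ~10x the energy
SEARCHES_PER_DAY = 9e9     # assumed global daily search volume
IEA_2026_TWH = 1000        # IEA projection for data centre demand, cited above

def annual_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Convert a per-query energy figure into annual terawatt-hours."""
    return wh_per_query * queries_per_day * 365 / 1e12  # 1 TWh = 1e12 Wh

search_twh = annual_twh(WH_PER_SEARCH, SEARCHES_PER_DAY)
ai_twh = annual_twh(WH_PER_SEARCH * AI_MULTIPLIER, SEARCHES_PER_DAY)

print(f"Conventional search workload: ~{search_twh:.0f} TWh/year")
print(f"Same volume served as AI queries: ~{ai_twh:.0f} TWh/year")
print(f"Projected total data centre demand by 2026: >{IEA_2026_TWH} TWh "
      "(roughly Japan's annual electricity consumption)")
```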

These requirements will strain energy supplies, potentially inhibiting the industry’s attempts to transition away from traditional, polluting energy sources. Concerns have already been raised about grid strain in the US. In Kansas, for example, where Meta is building a data centre, Evergy announced that it would delay the retirement of its coal plant by five years, while in Northern Virginia, home of ‘data centre alley’, Dominion Energy paused new data centre connections in 2022.

Cray tells Power Technology: “Energy-hungry AI could also put more strain on electrical grids and contribute to rising energy prices for everyday ratepayers which places the greatest burden on low-income people.”

Expanding on this, GlobalData analyst Josep Bori adds: “AI is very energy intensive when it comes to the training of the underlying neural network models, which require significant computing power to run what essentially are multivariate linear regressions on massive amounts of data. For instance, a generative AI model like OpenAI's GPT-3 had 175 billion parameters or nodes, whereas the more recent GPT-4 has 1.76 trillion and Google's Gemini Ultra has 1.56 trillion.

“To train them we require data centres full of graphics processing units, from companies like Nvidia or AMD, which consume a lot of power. It takes approximately 25,000 Nvidia A100 GPUs to train GPT-4. To give you an easier comparison, it was estimated that, in 2022 when GPT-3 was launched, training and running it emitted almost 500 times as much carbon as a passenger on a New York to San Francisco round-trip flight.”
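
To put the 25,000-GPU figure in context, the rough sketch below converts it into an energy estimate. The GPU count comes from Bori’s quote; the ~400W per-GPU power draw and the ~90-day training run are assumptions made purely for illustration, and the result excludes cooling and other data centre overheads.

```python
# Rough, illustrative estimate of training energy based on the GPU count
# Bori cites (~25,000 Nvidia A100s for GPT-4). The ~400 W per-GPU draw and
# the ~90-day run length are assumptions for this sketch, not figures from
# the article.

NUM_GPUS = 25_000          # from the quote above
WATTS_PER_GPU = 400        # assumption: A100 board power under load
TRAINING_DAYS = 90         # assumption: order-of-magnitude run length

gpu_hours = NUM_GPUS * TRAINING_DAYS * 24
energy_gwh = gpu_hours * WATTS_PER_GPU / 1e9   # W·h -> GWh

print(f"GPU-hours: {gpu_hours:,}")
print(f"GPU energy alone: ~{energy_gwh:.0f} GWh "
      "(before cooling and other data centre overheads)")
```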

AI's long-term potential

However, supporters of AI are optimistic that its energy requirements are justified by its potential in the industry. Indeed, in GlobalData’s Tech Sentiment Polls Q4 2023, a survey of 386 people across its network of B2B websites, 92% of respondents said that AI would either live up to all of its promise or that, while hyped, they could still see a use for it.

Maya Sherman, AI literacy lead at GPAI, tells Power Technology that she expects “to see an increased usage of AI in climate domains to facilitate communication and access in rural areas and offer greater analytics to climate data points in real time. The ability to provide those insights and smart recommendations for agriculture, food and water supply, and waste management will revolutionise the work of informal climate workers.”

Other emerging use cases for AI include the monitoring of supply chains and assets to identify maintenance needs and maximise efficiency. Algorithmic models could further streamline efforts to clean up the energy sector by predicting generation and demand, reducing unnecessary emissions and balancing out its own significant demand on power grids.

GlobalData analyst Christopher Papadopoullos comments: “Over the longer term, there may be some ways AI helps to reduce emissions. It can be used to develop new battery chemistries and run machinery and plants more efficiently. And if we see more tech companies investing more in their own utilities, we might see some innovation there we wouldn’t have seen from the traditional power companies.”

Bori concurs, pointing out that AI can be powered renewably, creating demand for a growing market.

“Efforts are underway to make AI more energy-efficient and environmentally sustainable, including using renewable energy sources like solar and wind power ... arguably, if demand for renewable energy increases, it will drive prices up which in turn will attract more investment in renewable energy infrastructure and production assets, ultimately accelerating the green transition,” he says.

The disinformation danger

The report raises further concerns about AI’s role as a tool to “spread climate disinformation”, citing the World Economic Forum’s Global Risks Report 2024, which found that “misinformation and disinformation are [the] biggest short-term risks, while extreme weather and critical change to Earth systems are greatest long-term concern.”

The report points out that these two highlighted risks are inextricable, as one perpetuates the other.

Cray explains: “Beyond the direct climate impacts of AI, it also has the potential to accelerate climate disinformation, making it harder to build political and social support for the actions needed to address the climate crisis.”

Examples of disinformation mentioned in the report include the false claims perpetuated in 2023 that several whale deaths on the east coast of the US were attributable to offshore wind projects. The claims drew significant attention, with Donald Trump claiming that the “windmills” made the whales “batty”.

Despite stating in August that the deaths were not related to the wind farms, Ørsted had scrapped plans to build offshore wind farms off the coast of New Jersey by October. The decision followed delays after two community groups filed lawsuits over the whale deaths.

The report further noted that climate disinformation tripled on platforms such as X in 2022. It suggested: “There is little incentive for tech companies to stop disinformation, as reports show companies like Google/YouTube make an estimated $13.4m per year from climate denier accounts.”

Sherman acknowledges AI's capacity for good, noting that most climate disinformation is not the fault of the technology but the malicious intent of its users.

“The problem is that we do not want to censor autonomous users and their ability to create online content since not all use cases are negative,” she says.

“The question of climate disinformation has substantial socio-political roots, exceeding the role of AI as a power multiplier. Notably, AI is involved in the spread and generation of disinformation as it enables automated content creation. As a tool designed by human developers, malicious actors can use it for nefarious purposes, as seen in the Covid-19 pandemic and different conflicts.”

Can AI ever become a climate ally?

For AI to become a positive cog in the transitioning energy machine, Sherman suggests it must exist in a more “structured” role.

She explains: “The fact that there are more cases of AI-driven disinformation does not necessarily mean that AI is the core problem, but that we cannot allow this regulatory void, and have a more structured presence of online activities in the genAI era.”

The report agrees but suggests that major technology companies (specifically Meta, Google and OpenAI) should not be trusted to develop the market safely, arguing that they “have shown their focus to be profit over safety time and again”.

Considering energy demand, it notes a recent poll from Data for Progress which found that 69% of respondents believe AI companies should be required to report energy usage. It calls for three priorities in the sector moving forward: transparency, safety and accountability.

Of Greenpeace’s priorities on this, Cray tells Power Technology: “We would like to see regular, public reporting of energy and resource use, disinformation threats, as well as environmental and social justice impacts of AI technology. We want to see risks to the public identified and for the industry to be accountable for developing a proactive plan to mitigate climate impacts, and for regulators to uphold reporting and operating standards.”
