Data centers consume around 2% of the world’s electricity, which translates to about 1% of energy-related greenhouse gas emissions.
Streaming Netflix, storing things in the cloud, and meeting on Zoom are just a few online activities that feed machines’ appetite for energy. But perhaps the biggest culprit of all is artificial intelligence.
AI models require immense computing power to train and run, particularly for machine learning and deep learning tasks. As a result, the International Energy Agency (IEA) expects data center energy use to double by 2026.
No matter how you look at it, data center energy use is a looming climate problem. So what can we do about it?
Immersion cooling
Big tech companies like Nvidia are investing in more energy-efficient hardware, such as specialized AI chips designed to reduce power consumption for specific tasks. But hardware is only part of the picture: cooling also has to be addressed.
Server cooling alone accounts for about 40% of data center power consumption. Startups like Asperitas from the Netherlands, Submer from Spain, and Iceotope from the UK think they have the answer: submerged servers.
Technically, the servers aren’t submerged in water at all, but in a non-conductive dielectric fluid that absorbs server heat far better than air does. The heat is then transferred from the liquid to the cooling system, keeping the servers cool without fans or air conditioners.
Immersion cooling, as the technique is known, can halve a data center’s cooling energy consumption, according to a study by the University of Groningen. It also allows servers to be packed closer together, reducing space requirements by up to two-thirds.
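To see why that matters for a facility’s overall draw, here is a rough back-of-envelope sketch in Python. The 40% cooling share and the halving come from the figures above; the 10 MW facility size is an assumed, purely illustrative number, not one from the article.

```python
# Back-of-envelope estimate of what immersion cooling could mean for one
# facility. Cooling share (~40%) and the halving of cooling energy are the
# figures quoted above; the 10 MW facility size is an arbitrary example.

facility_power_mw = 10.0   # assumed total draw of a hypothetical data center
cooling_share = 0.40       # cooling ~40% of consumption (per the article)
cooling_reduction = 0.50   # immersion cooling can halve cooling energy (Groningen study)

cooling_power_mw = facility_power_mw * cooling_share
saved_mw = cooling_power_mw * cooling_reduction
new_total_mw = facility_power_mw - saved_mw

print(f"Cooling load:      {cooling_power_mw:.1f} MW")
print(f"Energy saved:      {saved_mw:.1f} MW ({saved_mw / facility_power_mw:.0%} of total)")
print(f"New facility draw: {new_total_mw:.1f} MW")
# Halving a 40% cooling share trims overall consumption by roughly a fifth.
```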
Barcelona-based Submer has raised over $50 million in funding and claims its technology can cut data center cooling costs by up to 99%. The startup and many others are looking to tap into this emerging market, which, by one estimate, could grow 35-fold over the next 12 years.
While startups like Submer want to cut the energy data centers spend on keeping servers cool, other companies are looking for ways to harness the heat those servers give off.
Repurposing data center heat
What do AI and swimming pools have in common? They both suck up a tremendous amount of energy, of course. London-based startup Deep Green has found a clever way to combine the two.
Deep Green installs small data centers at energy-hungry sites such as swimming pools and leisure venues. Its system converts the waste heat from the computers into hot water for the facility, while cold water from the site is used to cool the servers. The host gets free heat from Deep Green’s servers as they process data, and the servers get free cooling in return.
German startup WINDCores is also rethinking where data centers belong. But instead of swimming pools, it places mini data centers inside wind turbines, where the servers run almost exclusively on wind power and transmit data over fiber optic cables.
In Norway, a trout farm is powered by waste heat from a nearby data center, and in Stockholm, Sweden, some 10,000 apartments are heated with waste heat from data center operator DigiPlex.
These weird and wonderful solutions will need to scale quickly if data centers’ electricity needs are to be reduced. In January, Deep Green raised a whopping £200m to heat 100 to 150 swimming pools across the UK.
But ultimately, we also need to start using computing power more intelligently.
“I find it particularly disappointing that generative AI is being used to do internet searches,” Sasha Luccioni, a world-renowned computer scientist, recently told AFP. According to her research, generative AI consumes 30 times more energy than a traditional search engine.
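To get a feel for what a 30-fold multiplier means at scale, here is a hedged illustration in Python. Only the 30x ratio comes from Luccioni’s figure; the baseline energy per conventional search and the daily query volume are assumed placeholder values for the sake of the example, not data from her research.

```python
# Illustration of the 30x figure. Only the ratio comes from the article;
# the per-search baseline and the daily query volume are assumed placeholders.

baseline_wh_per_search = 0.3      # assumed energy of one traditional search, in watt-hours
genai_multiplier = 30             # generative AI uses ~30x more energy (per the article)
queries_per_day = 1_000_000_000   # assumed daily query volume, for illustration only

conventional_mwh = baseline_wh_per_search * queries_per_day / 1e6
generative_mwh = conventional_mwh * genai_multiplier

print(f"Conventional search: {conventional_mwh:,.0f} MWh per day")
print(f"Generative AI:       {generative_mwh:,.0f} MWh per day")
print(f"Additional demand:   {generative_mwh - conventional_mwh:,.0f} MWh per day")
```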
Striking a balance
In August, Dublin rejected Google’s offer to build a new data center, citing insufficient grid capacity and a lack of significant renewable energy sources to power the facility. Last year, data centers consumed 21% of Ireland’s electricity.
Dublin’s decision “is likely to be the first of many” that will eventually force new data centers to generate more clean energy, either on-site or nearby, said Gary Barton, research director at data analytics firm GlobalData.
Scrutiny of the potential costs of AI-driven data center growth comes amid growing criticism that tech giants are exaggerating their progress on climate change.
A recent Guardian investigation found that between 2020 and 2022, actual emissions from data centers owned by Google, Microsoft, Meta, and Apple were around 662% higher than officially reported.
Barton said that balancing data center energy needs with climate and infrastructure constraints will be “crucial” for governments and service providers in the future.
Further out, placing data centers in space or powering them with fusion energy could solve some of these problems. For now, though, making sure data centers (actually) run on clean energy and operate more efficiently is critical to striking that balance.