
Google's €1B Finnish datacenter expansion to heat the local community

AI might take your job, but you'll be toasty warm while you starve


Google plans to invest €1 billion to expand its datacenter campus in Finland – a move that will both bolster its AI compute capacity and reclaim the thermal energy generated by power-hungry accelerators to heat local homes.

The Nordic region has several advantages for datacenter construction. Its cool climate makes techniques like "free cooling" possible for much of the year, improving efficiency while also reducing water consumption.

The area also benefits from an abundance of hydroelectric and wind power – an attractive prospect for cloud providers looking to scale their AI compute capabilities without compromising on their carbon emission reduction commitments. Then, of course, there are the often-significant tax breaks aimed at attracting investment in the region.

Google has operated a datacenter campus in Hamina, on the Gulf of Finland, since 2009. If the Chocolate Factory is to be believed, 97 percent of the facility's power comes from renewable sources like wind and hydro. Until now, heat generated by the facility has been captured and used to warm adjacent Google office buildings and facilities.

Beginning next year, Google plans to offload the thermal energy generated by the facility to neighboring homes through a partnership with local energy company Haminan Energia. Much of this heat will come from a €1 billion ($1.1 billion) expansion of the datacenter complex designed to bolster Google's AI compute capacity, the company revealed on Monday.

The site's expansion is expected to create 500 jobs – 400 contractor roles and 100 full-time positions – once the new bit barns are operational.

Waste heat from Google's datacenter campus in Hamina, Finland, will provide 80 percent of the city's district heating demand, according to the Chocolate Factory

"Finland plays a critical role in building Europe's digital economy, and the strong tradition for innovation in Hamina and the Kymenlaakso region has been crucial to this," Joe Kava, VP of Google datacenters, gushed in a canned statement.

"Our continued investment in our datacenter in Hamina is a testimony to Finland's role as a digital frontrunner and will help to further unlock the potential of AI among companies in the Kymenlaakso region, across the country and in Europe."

AI inferencing generally benefits from lower latencies – meaning datacenters that run it can usefully be located near large population centers. Training new models isn't latency-sensitive, leading many to suggest that training clusters be built at higher latitudes, in locations with cooler climates, an abundance of clean energy, and the potential for heat reuse.

For example, at the Supercomputing conference in Denver last year, HPE's Nicolas Dubé posited that training GPT-3 once could provide enough warm water to heat 4.6 greenhouses and produce just over a million tomatoes.

In the case of Google's datacenters outside Hamina, the search giant claims they'll supply 80 percent of residents' annual heating demands. This heat will be provided to the local community at no cost, Google declared in a blog post on Monday.

Waste heat from the project is expected to begin flowing into Hamina's district grid in late 2025.

Google will not be the first to repurpose datacenter waste heat to warm a district. LUMI, Europe's largest supercomputer as of this northern spring, is located in Kajaani, Finland, and provides 20 percent of that city's heating demand, for example.

However, as we've previously discussed, datacenter-fed district heating schemes aren't without challenges. In February, TechUK published a report on the concept that highlighted numerous hurdles to embracing the technology.

Chief among the challenges is that many existing heating networks may need to be modernized to accept the waste heat generated by datacenters. Another issue is that when datacenters first open, they often operate at a fraction of their design capacity – meaning it could take years before a facility can contribute meaningfully to district heating needs. ®
