Over the past couple of years, the copper industry has been increasingly enthusiastic about data centers – the computer server hubs that collect, store and process large amounts of information – as a new source of demand growth.
That’s because copper is used in the extensive network infrastructure and electric power supply that data centers require. The metal is found in power cables, busbars, electrical connectors and power distribution strips, and is a thermal conductor in the heat exchangers needed to cool the servers.
As a result, the amount of copper used in data centers is far from trivial. Back in 2009, industry pundits estimated that Microsoft used 2,177 tonnes of copper to build its $500 million data center in Chicago, equivalent to 27 tonnes per megawatt of applied power.
Given the number of data centers springing up around the globe and the additional power requirements that AI-ready server racks have, the amount of copper used has been growing.
A recent report by Macquarie estimates that between 330,000 tonnes and 420,000 tonnes of copper will be used in data centers by 2030, with a mid-point of 375,000 tonnes.
Macquarie said that its figure takes into account the various data center announcements made by Microsoft and Meta in January, as well as the massive $500 billion Stargate Project to build OpenAI infrastructure in the US.
The figure is also based on a forecast increase in required power capacity from 77 gigawatts in 2023 to 334 GW in 2030, plus some continued use of copper for data transfer within the data center itself, even though fiber optics are now the preferred connection choice.
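As a very rough illustration of how a forecast like this can hang together (this is not Macquarie's methodology, which the report does not fully disclose), estimates of this kind typically multiply a copper-intensity assumption, in tonnes per megawatt, by the capacity being built each year. The intensity and annual build figures in the sketch below are assumptions chosen purely for illustration.

```python
# Back-of-envelope sketch (not Macquarie's model): data-center copper demand is
# often approximated as copper intensity (tonnes per MW) times new power capacity.
# Both figures below are illustrative assumptions, not sourced numbers.

copper_intensity_t_per_mw = 10.0  # hypothetical tonnes of copper per MW of data-center power

# Hypothetical annual capacity additions (MW), spreading the growth from
# 77 GW in 2023 to 334 GW in 2030 evenly over seven years.
annual_capacity_additions_mw = (334_000 - 77_000) / 7  # roughly 36,700 MW per year

annual_copper_demand_t = copper_intensity_t_per_mw * annual_capacity_additions_mw
print(f"Illustrative annual copper demand: {annual_copper_demand_t:,.0f} tonnes")
# ~367,000 tonnes with these assumptions, in the same ballpark as the quoted
# 330,000-420,000 tonne range - but only because the intensity was chosen for illustration.
```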
But it didn’t make any adjustments for the new DeepSeek world, and this is where the debate over copper begins.
It all comes down to whether you believe the efficiencies of DeepSeek’s AI methodology mean it needs significantly less compute power to run, as the company has said. It also depends on whether you think DeepSeek’s chain-of-thought, or reasoning, capabilities require considerably more compute power for inference – the process it uses to draw conclusions.
In other words, it comes down to whether you accept that DeepSeek is a budget version of AI whose training is admittedly more efficient but whose inference is not.
If you’re one of the Western tech companies that have been developing AI, then you are firmly in the believers’ camp. Not only that, but you probably also argue that greater efficiency in AI does not guarantee reduced compute demand (and less copper) either.
That’s exactly what Satya Nadella, Microsoft’s chief executive officer, did last week when confronted with the prospect of DeepSeek.
He cited the Jevons Paradox, the concept that technological progress making a resource cheaper or more efficient to use often leads to an increase in demand for that resource.
“Jevons paradox strikes again!” Nadella posted on X (formerly Twitter). “As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can’t get enough of,” he added.
Copper bulls had better hope that he’s right.
In Hotter Commodities, special correspondent Andrea Hotter covers some of the biggest stories impacting the natural resources sector. Read more coverage on our dedicated Hotter Commodities page here.