4.3GW is about as much as Denmark consumes*.
“This growth is two to three times that of overall data center power demand CAGR of 11%,” according to the company. “One key insight is that inference loads will increase over time as more newly trained models are transitioned to production. Actual energy demand will heavily depend on technology factors including successive generations of servers, more efficient instruction sets, improved chip performance and continued AI research.”
| Schneider Electric estimates | Year 2023 | Year 2028 |
| --- | --- | --- |
| Total data center workload | 54GW | 90GW |
| AI workload | 4.3GW | 13.5-20GW |
| AI compared with total | 8% | 15-20% |
| Training vs inference workload | 20% training, 80% inference | 15% training, 85% inference |
| Central vs edge workload | 95% central, 5% edge | 50% central, 50% edge |
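As a quick back-of-envelope check of the growth-rate claim (my arithmetic, not Schneider's): 54GW growing to 90GW over five years works out at roughly 11% a year, while 4.3GW growing to 13.5-20GW works out at roughly 26-36% a year, which is indeed around two to three times the overall rate. The short Python sketch below reproduces the sums from the table's figures.

```python
# Back-of-envelope cross-check of the CAGR claim using the table's figures
# (my own arithmetic based on the published numbers, not from the report).

def cagr(start_gw: float, end_gw: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end_gw / start_gw) ** (1 / years) - 1

years = 2028 - 2023  # five-year horizon used in the table

total = cagr(54, 90, years)        # ~0.108 -> roughly 11% overall
ai_low = cagr(4.3, 13.5, years)    # ~0.26  -> low end of AI estimate
ai_high = cagr(4.3, 20, years)     # ~0.36  -> high end of AI estimate

print(f"Total data center CAGR: {total:.1%}")
print(f"AI workload CAGR: {ai_low:.1%} to {ai_high:.1%}")
```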
The figures in the table come from a report in which Schneider explains relevant attributes and trends of AI workloads, and predicts resulting data center challenges.
“Guidance to address these challenges is provided for each physical infrastructure category including power, cooling, racks, and software management,” it said.
The report, ‘AI disruption: Challenges and guidance for data center design’, can be downloaded free of charge.
*un-guaranteed Electronics Weekly calculation based on 2021 figures via Wikipedia 🙂
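For the curious, a sketch of what that comparison might look like is below. The Danish consumption figure is an assumption on my part (Denmark's 2021 electricity use is roughly 33-35TWh per Wikipedia); treat it as approximate rather than as the exact figure behind the footnote.

```python
# Sketch of the Denmark comparison. DENMARK_TWH_2021 is an assumed,
# approximate value, not necessarily the figure used in the article.

AI_LOAD_GW = 4.3          # Schneider's 2023 AI workload estimate
HOURS_PER_YEAR = 8760
DENMARK_TWH_2021 = 34.0   # assumed: Denmark's annual electricity use, ~33-35TWh

# 4.3GW drawn continuously for a year, converted from GWh to TWh
ai_twh_per_year = AI_LOAD_GW * HOURS_PER_YEAR / 1000

print(f"4.3GW running all year ≈ {ai_twh_per_year:.1f}TWh")
print(f"Ratio to Denmark (2021): {ai_twh_per_year / DENMARK_TWH_2021:.2f}x")
```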