It's behind a paywall, but the gist of it is that so many bad power supplies are being connected to the grid that the sine waves are getting distorted. It's based on this research from Whisker Labs[1].
Better power supplies can help with the distortion, but it really shouldn't matter much except in extreme cases where the grid is already at 99% of its limits, say on a hot day.
By the way, you can learn interesting things from analysis of the power grid. Long ago I remember a Slashdot comment about using it, together with covertly gathered recordings, to estimate the effective yield of nuclear enrichment operations in the Middle East. I've been trying to find that thread, but Google isn't what it used to be.
It's not useless to humanity; there are people who literally can't get bank accounts because of the industry they work in or the country they were born in.
> Is the problem essentially that too much of the load on the grid is AC->DC converters that are drawing power only at the peak of each cycle?
No, large power supplies used for AC/DC conversion don't do this. They already perform power factor correction on the AC side.
The issue is that power factor can only be corrected so far (0.95 is a typical best case), so if you keep adding more and more loads of the same type, it leads to excessive capacitive loading on the grid.
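For a concrete sense of the difference, here's a minimal numpy sketch (my own illustration, not from the article, with made-up load figures): it compares the power factor of a supply that draws current only near the voltage peaks against a PFC-corrected supply that still has some residual harmonic distortion.

```python
import numpy as np

t = np.linspace(0, 1 / 60, 10_000, endpoint=False)  # one 60 Hz cycle
v = np.sqrt(2) * 120 * np.sin(2 * np.pi * 60 * t)   # 120 V RMS mains voltage

# Uncorrected rectifier front end: current flows only near the voltage peaks.
i_rect = np.where(np.abs(v) > 0.9 * v.max(), np.sign(v) * 10.0, 0.0)

# PFC-corrected supply: near-sinusoidal current with residual 3rd-harmonic
# distortion (amplitude chosen so the result lands near the 0.95 figure above).
i_pfc = 10.0 * np.sin(2 * np.pi * 60 * t) + 3.3 * np.sin(2 * np.pi * 180 * t)

def power_factor(v, i):
    real = np.mean(v * i)                                           # real power, W
    apparent = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))  # V_rms * I_rms
    return real / apparent

print(f"peak-drawing rectifier PF: {power_factor(v, i_rect):.2f}")  # ~0.73
print(f"PFC-corrected supply PF:   {power_factor(v, i_pfc):.2f}")   # ~0.95
```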
In comparison, industrial equipment (things that use lots of motors) presents an inductive load to the grid. Basically, if you pair inductive and capacitive loads equally, you can cancel out the "imbalance". Industrial plants already do this with large capacitor banks.
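To put numbers on that cancellation, here's a minimal sketch of the standard capacitor-bank sizing calculation (my own illustration; the load figures are assumptions, not from the thread):

```python
import math

p_kw = 500.0        # real power drawn by a motor load (assumed figure)
pf_initial = 0.80   # lagging power factor before correction (assumed)
pf_target = 0.95    # power factor the utility requires (assumed)

# Reactive power for a given real power and power factor: Q = P * tan(arccos(PF))
q_initial = p_kw * math.tan(math.acos(pf_initial))  # kvar before correction
q_target = p_kw * math.tan(math.acos(pf_target))    # kvar allowed after

# The capacitor bank supplies the difference, cancelling the inductive kvar.
q_capacitor = q_initial - q_target
print(f"capacitor bank size: {q_capacitor:.0f} kvar")  # ~211 kvar for this load
```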
Essentially, all this means is that power companies probably need to start imposing power factor requirements on data center customers. They're already billed for it, but if it's causing grid issues, utilities can make it a hard requirement (or implicitly dictate it through pricing structures).
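As a toy illustration of the pricing angle (a hypothetical tariff I made up, not any real utility's), one common structure scales the demand charge up whenever metered power factor falls below a floor:

```python
def demand_charge(peak_kw: float, power_factor: float,
                  rate_per_kw: float = 15.0, pf_floor: float = 0.95) -> float:
    """Billable demand = peak kW scaled by (pf_floor / actual PF)
    whenever PF is below the floor; no penalty at or above it."""
    billable_kw = peak_kw * max(1.0, pf_floor / power_factor)
    return billable_kw * rate_per_kw

print(demand_charge(10_000, 0.99))  # at/above the floor: 150000.0
print(demand_charge(10_000, 0.85))  # below the floor:   ~167647 (a ~12% penalty)
```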
Require that new data centers be powered by their own renewables/batteries instead of being attached to the grid. Problem solved. I'm surprised anyone building a data center is not doing this already.
Running data centers on the grid is the most economical and environmentally friendly option.
In the unhappy case of needing to generate power from fossil fuels, grid-scale equipment like combined-cycle gas turbines will beat the efficiency and emissions of any typical data-center-class generation capacity.
I read that solar is cheaper than fossil fuel generation. Also, building a solar+battery power source that doesn't require a grid connection means you don't have to wait 5-10 years for an interconnect to be approved, and uptime for the data center may improve as well. But even if it costs more, so what? Make it a requirement for new construction anyway, so it won't cause problems with the grid that everyone else uses.
[1] https://www.whiskerlabs.com/analysis-of-total-harmonic-disto...