
The interesting macro view on what's happening is to compare a mature data center operation (specifically a commoditized one) with the utility business. The margins there, and in similar industries with big infra build-out costs (e.g. rail), are quite small. Historically these businesses have not done well; I can't really imagine what happens when tech companies who've only ever known huge, juicy margins experience low-single-digit returns on billions of investment.




Worse, a lot of these people are acting like Moore's law isn't still in effect. People conflate clock speeds on beefy hardware with Moore's law and declare it dead, when transistor density keeps rising and cost per transistor keeps falling at rates similar to what they always have. That means the people racing to build out infrastructure today might be better off parking that money in an interest-bearing account and waiting six months. That was a valid strategy for animation studios in the late 90s (waiting was not only cheaper, the finished renders were done sooner), and I'd be surprised if it's not a valid strategy today for LLMs. The amount of silicon that is going to be produced specialized for this type of processing is going to be mind-boggling.
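
To make the waiting argument concrete, here's a back-of-the-envelope sketch (all numbers hypothetical): a fixed-size compute job, and hardware whose price/performance improves by a constant factor per year. Waiting pushes the start date out but shortens the run:

    # "Buy now vs. wait" arithmetic with hypothetical numbers: a fixed
    # compute job, and hardware price/performance that improves by a
    # constant factor per year.
    def finish_time(wait_months, job_hw_years_today, speedup_per_year=2.0):
        """Months until the job completes if we wait `wait_months` to buy.

        `job_hw_years_today` is the job's runtime on hardware bought today;
        hardware bought later runs `speedup_per_year ** years_waited` faster.
        """
        speedup = speedup_per_year ** (wait_months / 12)
        return wait_months + job_hw_years_today * 12 / speedup

    for wait in (0, 6, 12, 18):
        print(f"wait {wait:2d} mo -> job done in {finish_time(wait, 3.0):.1f} mo")
    # wait  0 mo -> job done in 36.0 mo
    # wait  6 mo -> job done in 31.5 mo
    # wait 12 mo -> job done in 30.0 mo
    # wait 18 mo -> job done in 30.7 mo

With a hypothetical 2x/year improvement and a 3-hardware-year job, waiting a year both costs less and finishes sooner, which is exactly the late-90s render-farm situation. At slower improvement rates the trade flips back toward buying now.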

Cost per transistor is increasing, or flat if you stay on a legacy node. They've pretty much squeezed all the cost out of 28nm that can be had, and it's the cheapest per transistor.

“based on the graph presented by Milind Shah from Google at the industry tradeshow IEDM, the cost of 100 million transistors normalized to 28nm is actually flat or even increasing.”

https://www.tomshardware.com/tech-industry/manufacturing/chi...


Yep. Moore's law ended at or shortly before the 28nm era.

That's the main reason people stopped upgrading their PCs. And it's probably one of the main reasons everybody is hyped about RISC-V and the RP2040. If Moore's law were still in effect, none of that would be happening.

That may also be a large cause of the failure of Intel.


> Moore's law ended at or shortly before the 28nm era.

Moore's law isn't about cost or clock speed, it's about transistor density. While the pace of transistor density increases has slowed, it's still pretty impressive. If we want to be really strict and say densities absolutely have to double every 2 years, Moore's Law hasn't actually been true since 1983 or so. But it's been close, so 2x/2yr is a decent rubric.

The fall-off from the 2x/2yr line started getting pronounced in the mid-90s. Over the past 5-6 years we've probably been at a doubling in density every 4-ish years. Which, yes, is half the rate Moore observed, but is still pretty impressive given how mature the technology is at this point.
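
For anyone who wants to check that kind of claim against data, the implied doubling time falls straight out of two density measurements (the numbers below are hypothetical placeholders, not real node figures):

    import math

    def doubling_time_years(density_old, density_new, years_elapsed):
        """Doubling time implied by two transistor-density data points."""
        return years_elapsed * math.log(2) / math.log(density_new / density_old)

    # Hypothetical data points, in millions of transistors per mm^2:
    print(doubling_time_years(density_old=50, density_new=140, years_elapsed=6))
    # ~4.0 years, i.e. the "2x every 4-ish years" pace described above.
    # Moore's 2x/2yr would instead require density_new ~ 50 * 2**(6/2) = 400.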


If you want to be pedantic, the original (and revised) law is definitely about cost. The original formulation was that the number of components (i.e. transistors) on an integrated circuit doubled every year (later revised to every two years) for the best-priced chips, i.e. at the smallest cost per component.

https://web.archive.org/web/20220911094433/https://newsroom....

> The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page).

And it is formulated in a section aptly titled "Costs and curves". This law has always been an economic law first, and only secondarily some kind of roadmap for fabs to follow. But that roadmap drove almost-exponential investment costs as well.

I concede that density still rises, especially if you count "advanced packaging". But the densest and most recent node is no longer the cheapest.
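
A toy model of why that happens, with purely illustrative numbers (real wafer prices and yields are closely guarded): cost per transistor is roughly wafer cost divided by good area times density, and on leading-edge nodes the wafer cost and yield loss can outgrow the density gain, especially since SRAM and analog blocks barely shrink anymore:

    # Toy cost-per-transistor comparison. All numbers are illustrative
    # placeholders, not real foundry pricing.
    def usd_per_mtransistor(wafer_cost_usd, good_mm2_per_wafer, mtx_per_mm2):
        return wafer_cost_usd / (good_mm2_per_wafer * mtx_per_mm2)

    nodes = {
        #         wafer $  good mm^2  effective Mtx/mm^2
        "28nm": (  3_000,    55_000,    12),
        "5nm":  ( 17_000,    45_000,    65),
    }
    for name, (wafer, good_area, density) in nodes.items():
        print(f"{name}: ${usd_per_mtransistor(wafer, good_area, density):.5f} per Mtx")
    # 28nm: $0.00455 per Mtx
    # 5nm: $0.00581 per Mtx (denser, yet pricier per transistor)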



A lot of it is propped up by the fact that with GPUs and modern server CPUs, the die area just got bigger.

Does AWS count as a commoditized data center? Because that is extremely profitable.

Or are you talking about things like Hetzner and OVH?


The cloud hyperscalers have done very well for themselves. As with all products, the question is differentiation. If models can differentiate and lock in users, they can have decent margins. If models get commoditized, the current cloud providers will eat the AI labs' lunch.


