
Economies of scale often get mentioned as if they were some magical property that makes things cheaper. Reality is often far more complicated. Amazon, for example, hasn't really seen its fulfillment costs improve over time; they've gotten more expensive as the company pushed ideas that grow the business: faster shipping, more products, etc. Resource constraints also tend to have overwhelming impacts: extracting oil from the earth kept getting cheaper thanks to economies of scale right up to the point where producers had to go after more expensive sources to meet demand. Economists have a name for this effect: diseconomies of scale.

It is pretty safe to say that economies of scale have gotten us to where we are now with batteries. But that does not necessarily mean the trajectory will continue.



But the postal system already had immense economy of scale. If Amazon had to bootstrap its own fulfillment from scratch, the per package cost absolutely would have been slashed as volume increased.

To put some numbers on it, imagine you start off delivering one package a week by hand, literally by hiring someone to drive it across the country. This might cost $2000 a week in fuel, salary, depreciation, etc.

Now say we're delivering a billion packages a week. The rule quoted would put it at $4 per package.

There's wiggle room to argue each end, but considering it's a rule of thumb operating over nine orders of magnitude, that it's not outright nonsensical is actually pretty good.
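A quick sanity check of that arithmetic, assuming the rule being quoted is the common heuristic that per-unit cost halves for every 10x increase in volume (my assumption about which rule is meant):

    import math

    # Sanity check, assuming the rule of thumb is "per-unit cost halves
    # for every 10x increase in volume" (an assumption about which rule
    # is being quoted).
    start_cost = 2000.0          # $ per package at one package a week
    start_volume = 1
    end_volume = 1_000_000_000   # a billion packages a week

    tenfold_steps = math.log10(end_volume / start_volume)   # 9 orders of magnitude
    cost_at_scale = start_cost * 0.5 ** tenfold_steps       # 2000 / 2^9

    print(f"{tenfold_steps:.0f} orders of magnitude")
    print(f"${cost_at_scale:.2f} per package")               # ~$3.91, i.e. roughly $4

Nine halvings take $2000 down to about $3.91, which is where the ~$4 figure lands.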

I agree it's not some iron law, and it's important to know when and where it can be applied.


There's also Jevons Paradox[1]: the more efficiently a resource is used, the more demand there is for it. So if you increase the efficiency of electric vehicles, there will be more demand for battery packs, because each pack's use value for transportation goes up. For example, if a technological improvement let Teslas get 50% more range from the same weight of battery pack, demand for those packs would rise sharply because each one would be worth that much more (rough sketch of the feedback loop below).

[1]https://en.wikipedia.org/wiki/Jevons_paradox
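A toy way to see the mechanism (all numbers are made-up illustrations, not real EV data): model miles driven as elastic in the cost per mile, and battery demand as whatever it takes to supply those miles. If demand is elastic enough, a 50% efficiency gain raises total battery demand instead of lowering it:

    # Toy Jevons-paradox sketch; elasticity, prices, and scale factors
    # are illustrative assumptions, not real EV market data.

    def miles_demanded(cost_per_mile, elasticity=1.5, k=10_000.0):
        """Constant-elasticity demand: cheaper miles -> more miles driven."""
        return k * cost_per_mile ** -elasticity

    def battery_kwh_demanded(miles_per_kwh, electricity_price=0.13):
        cost_per_mile = electricity_price / miles_per_kwh
        miles = miles_demanded(cost_per_mile)
        return miles / miles_per_kwh   # kWh drawn from batteries, a proxy for pack demand

    baseline = battery_kwh_demanded(miles_per_kwh=3.0)
    improved = battery_kwh_demanded(miles_per_kwh=4.5)   # 50% more range per kWh

    print(f"baseline: {baseline:,.0f} kWh")
    print(f"after 50% efficiency gain: {improved:,.0f} kWh")   # higher, not lower

With an elasticity below 1 the same arithmetic runs the other way, which is the usual caveat on Jevons-style arguments.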


Ah, I've been looking for a name for this.

So this applies to software systems as well. We had a system where, every time we optimized the underlying database, it became slower. This puzzled us for a while. It turns out people had gotten used to the slowness; when they found out the system had become faster, they started entering more transactions. As more people noticed, the system got overwhelmed again, until it finally reached a point where it could handle the load.

The variable we failed to see was that people were now leaving work earlier. Even though the system's metrics appeared to degrade, that human metric had markedly improved: they used to work longer hours to get all their work done.


> We had a system where, every time we optimized the underlying database, it became slower.

One thing I've noticed is that the amount of time I have to wait when performing operations has remained more or less constant over the past few decades, even as processor instruction retirement rates have skyrocketed and memory/storage/network latencies have plummeted. I often find myself trying to pull up a site to do something like track a project or buy a ticket, only to wait 5 to 10 seconds before I can read the information and/or perform the operation I came for.

I always ask myself, "Why on earth am I having to wait at all for this?" I assume it's the same concept as "induced demand" for roadways. Under that theory, in densely populated areas traffic will always be roughly as bad as commuters will tolerate, regardless of the number of lanes. Add more lanes, and commute times drop. Then people who didn't tolerate the longer commute times before jump in and start commuting, increasing the load on the roads. This quickly raises the wait time to just below the previous wait time, with all the people in the "margin of toleration" who weren't commuting before now commuting.

In the case of my company's project-tracking systems, I just assume that people build on dependencies in a way that trades latency for the convenience of using a particular API/service with an SLO. The system with the SLO costs money to maintain, so there's a quota system based on priority. The project-tracking software comes along and says, "Well, we could write the data into a Postgres instance with a 10 millisecond response time, but then we've got to worry about backup/restore, availability, and all that stuff. Or we can be the lowest-priority traffic for this nifty distributed storage backend that has a support team and an SLO with a 7 second response time." The seven seconds of each person's life every time they use the tool is an externality, and it's something few people are ever going to complain about. So, 7 seconds it is.
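Putting rough numbers on that externality (headcount and usage here are hypothetical, just to show the shape of the trade):

    # Rough externality arithmetic; every number below is a hypothetical
    # illustration, not data from any real deployment.
    wait_seconds = 7
    users = 2_000
    lookups_per_user_per_day = 10
    workdays_per_year = 230

    wasted_hours = wait_seconds * users * lookups_per_user_per_day * workdays_per_year / 3600
    print(f"{wasted_hours:,.0f} person-hours per year")   # ~8,944 hours

None of that shows up on the service owner's bill, which is exactly what makes it an externality nobody complains about.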

I'm resigned to waiting an average of 5 or so seconds for anything I do on a computer for the rest of my life, no matter how cheap cycles, storage, or bandwidth get.


This dynamic has some very interesting implications.

E.g., more lanes translate to higher automaker profits at the expense of lower overall population happiness.

Faster computers translate to a higher overall rate of software use at the expense of more overall time spent waiting.


This is precisely what happens with mobile networks. The faster the network gets, the more people use it, which slows things back down until the performance gain yields essentially zero return.

Ultimately, though, both economies of scale and the Jevons paradox run into the law of diminishing returns: adding more network capacity or making DB queries faster eventually stops yielding any meaningful improvement.


For a more direct example with future actors:

Cell phone battery life has stayed mostly stable, even as battery capacities have gone up and processor energy per instruction has gone down. This is because mobile device performance is mostly power-constrained: manufacturers target roughly a day of battery life as the minimum consumers will put up with, and whenever more energy is available they'll turn up performance or add features until battery life goes back down to that minimum.


I think you have to choose a fairly limited perspective to see a diseconomy of scale in batteries.

CPUs, RAM, energy, and energy storage have each reached the point of scale where innovation is heavily rewarded.

Single-core CPUs, static DRAM, oil, and lithium batteries should each fail at some point soon.


Your examples are services, but the argument is about economies of scale for manufacturing.

As for resource constraints, don't forget substitution.



