
I've always wondered, when do you round up (or down) for orders of magnitude? For instance, is 6 close enough to 10 to say it's an order of magnitude larger, or does it need to be closer (say 8), or hit a hard floor of being at least 10 times larger before we colloquially say it's increased by an order of magnitude?

Completely ignoring the time variable, I don't think it's crazy to say something 6x bigger is an order of magnitude bigger. I'm curious what other people consider a cutoff for this though.



If we go by log10 of the number, 4 would be enough to round up to "10" as an order of magnitude.

That just doesn't seem right to me, though. My non-rigorous intuition would put the cutoff at 7.
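(Not from the thread itself, just a throwaway Python sketch of that "round the log10" rule: the implicit cutoff is the square root of ten, about 3.16, which is why 4 rounds up to 10 but 3 rounds down to 1.)

    # Sketch of rounding a number to its nearest power of ten via log10.
    # Anything above sqrt(10) ~ 3.16 rounds up to the next power of ten.
    import math

    def nearest_order_of_magnitude(x):
        return 10 ** round(math.log10(x))

    for x in (3, 4, 6, 7, 9):
        print(x, "->", nearest_order_of_magnitude(x))
    # prints: 3 -> 1, 4 -> 10, 6 -> 10, 7 -> 10, 9 -> 10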


An order of magnitude is a factor of 10. So half an order of magnitude is a factor of the square root of ten, or about 3.16, because two half orders of magnitude make a full order of magnitude.

[Edit:] A nice consequence of that cutoff is that 3.16 and 3.14 are almost the same number, so pi sits right on the border between one order of magnitude and the next; you can approximate it as either one or ten, depending on what makes your estimate look better.
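(Quick check of that arithmetic in Python, not something from the comment above: the halfway point is the geometric mean of 1 and 10, and pi lands just barely on the "rounds down" side of it.)

    # Half an order of magnitude is the geometric mean of 1 and 10:
    # sqrt(10) * sqrt(10) == 10, so the boundary sits at 10**0.5.
    import math

    print(math.sqrt(10))          # 3.1622776601683795
    print(math.pi)                # 3.141592653589793  (just below the boundary)
    print(math.log10(math.pi))    # 0.497... -> rounds down, so pi ~ 1 by this rule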


I've always thought of "order of magnitude" as adding a zero, so anything less than 10 sits in a lower order of magnitude than 10 itself.

It gets weird when you think about 9 being both one less than 10 and an order of magnitude below it, while 99 is 89 more than 10 yet the same order of magnitude, but it's supposed to be a rough approximation anyway.
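(If you take the "adding a zero" view literally, it's just counting digits, i.e. floor(log10(n)) + 1 for integers. A minimal Python sketch, with a made-up helper name:)

    # Digit-count view: order of magnitude = number of digits (for n >= 1),
    # which is floor(log10(n)) + 1.  9 and 10 land in different buckets,
    # while 10 and 99 land in the same one.
    import math

    def digit_order(n):
        return math.floor(math.log10(n)) + 1

    print(digit_order(9), digit_order(10), digit_order(99))  # 1 2 2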



