
I agree this is an interesting take and fun to think about. But if clocks rounded to the nearest minute instead of truncating, the average error would still be greater than 0, not 0 as the author claims. Assuming a sufficiently large number of measurements taken at random times, I think the average error would be 15 seconds.


The author meant "average error" in the systematic error sense, not the standard error sense. The minute display would be well calibrated, not precise to less than a minute.
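A quick simulation makes the two readings of "average error" concrete (this is just a sketch of my own, not from the article; the sample size and variable names are made up): with rounding, the signed error averages out to roughly zero (well calibrated), while the average absolute error is still about 15 seconds, compared with a systematic error of about 30 seconds for truncation.

    import random

    # Sketch, not from the article: sample random true times within an hour
    # and compare a truncating minute display against a rounding one.
    N = 1_000_000
    signed_trunc = signed_round = abs_round = 0.0

    for _ in range(N):
        t = random.uniform(0, 3600)        # true time, in seconds past the hour
        shown_trunc = (t // 60) * 60       # display truncates to the minute
        shown_round = round(t / 60) * 60   # display rounds to the nearest minute
        signed_trunc += t - shown_trunc
        signed_round += t - shown_round
        abs_round += abs(t - shown_round)

    print(f"truncating, mean signed error:   {signed_trunc / N:.1f} s")  # ~30 s (biased)
    print(f"rounding,   mean signed error:   {signed_round / N:.1f} s")  # ~0 s (well calibrated)
    print(f"rounding,   mean absolute error: {abs_round / N:.1f} s")     # ~15 s (not more precise)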



