aquadrop's comments | Hacker News

They literally killed this vaccine, so in terms of the result it's anti-vax by definition.


To be fair, that chart only starts in 1900, so you can't see the 18th century here. And the Wikipedia page for this chart says "By 1900, bowhead, gray, northern humpback and right whales were nearly extinct, and whaling had declined". Also, the original phrase "peak of the whaling industry" speaks not just to the number of whales killed but to the size of the industry as a whole, the number of people involved. With new tech, I suppose those massive killings in the 1960s required far fewer people.


Yeah, the real winners here seem to be the lawyers.


When Texas removed its mask mandate in March, there were such claims as well ("it will be so bad, Texas is doomed"), but nothing happened and cases continued to go down, as in the rest of the country.


Cases don't matter; hospital admissions also show signs of the wave going down. And that doesn't depend on people's mood.


They are both real.


What would be the competitive angle? What percentage of users do you think would care whether your app transfers 445kb or 345kb (sincere question)? I've just always been skeptical about "reducing sizes" for its own sake, rather than in the frame of improving performance or noticeably reducing latency for the user (which may or may not be related to size). Can't argue with the interesting part :)


Good question, and of course the difference between 445kb and 345kb may not be noticeable, but it would be if you compared it against 1/2/10MB, etc.

The competitive angle is really delivering a fast/snappy/responsive experience on mobile or desktop. That in itself isn't going to save you, but it is an advantage if part of the sell is a high quality experience.

Re: the interesting part, for instance I just started experimenting with inlining all CSS/JS into index.html (so far the JS part is problematic) and minifying class names, and I'm curious whether it'd be possible to minify JS module names as well. A sketch of the CSS part is below.
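
To make the inlining idea concrete, here's a minimal sketch of what the CSS part could look like as a tiny Node/TypeScript build step. The file names (index.html, styles.css, dist/index.html) are just placeholders for this example, not anything from the actual project:

    // inline-css.ts: read index.html, swap the stylesheet <link> for an inline <style> block
    // (placeholder file names; adapt to the real build layout)
    import { readFileSync, writeFileSync, mkdirSync } from "fs";

    const html = readFileSync("index.html", "utf8");
    const css = readFileSync("styles.css", "utf8");

    // Replace the <link> tag that points at styles.css with the file's contents inlined
    const inlined = html.replace(
      /<link[^>]*href="styles\.css"[^>]*>/,
      `<style>${css}</style>`
    );

    mkdirSync("dist", { recursive: true });
    writeFileSync("dist/index.html", inlined);
    console.log("wrote dist/index.html with styles.css inlined");

That saves a request per page load, at the cost of the CSS no longer being cacheable separately, which is the usual trade-off with inlining.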


If you plan for such a model beforehand, you might already be lowering overall efficiency. You know you can't land the whole thing you want as one piece, so you plan "one third now, another third in a couple of months, and the last third a couple of months after that". Overall you might end up with twice the effort, since you had to account for that split, but in the context of the kernel it still makes sense because of all the complexity and the many people working on it simultaneously. And of course it also depends on how splittable the thing you're working on is.


It's actually more efficient to do it this way. When you develop in a fork, you have to keep rebasing on mainline, and then on submission you might find that large parts of the code are the wrong approach or don't meet upstream standards and need to be rewritten.

By planning for incremental merges, you ensure that your foundation is solid and acceptable and avoid wasted work.


Their method of estimation is not far from guessing, and they even say themselves that the accuracy is "low".


YouTube has had very good auto-CC recently (or I just noticed it); their speech recognition even works in noisier videos.

