
This seems clearly false.

Let's say the page with the best information about dogs took an infinite amount of time to load (never loaded). I don't think you'd want that page to be ranked highly.

What about if that page took a year to load?

What about if that page took an hour?

A minute?

What's your cutoff? Let's say, generously, that you're willing to wait 30 minutes for the page to load. Why such a sharp cutoff? Why do pages that take 30 minutes and 1 second to load get penalized, but pages that take 30 minutes are treated as fine? The user experience is basically just as acceptable (by your standards) / just as bad (by my standards).

Perhaps, instead of a sharp cutoff, we need some sort of sliding scale, where pages that are only slightly slower than optimal are penalized only slightly, and pages that are much slower than optimal are penalized more.

Gosh, that sounds familiar.
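A minimal sketch of what such a sliding scale could look like, in Python. The 1.5 s target and the 2 s half-life are made-up illustration values, not parameters any real search engine publishes:

    def load_time_penalty(load_seconds: float,
                          target: float = 1.5,
                          half_life: float = 2.0) -> float:
        """Smooth penalty multiplier in (0, 1].

        Pages at or below `target` seconds are not penalized at all;
        beyond that the multiplier decays exponentially, halving for
        every `half_life` extra seconds. Both constants are
        illustrative assumptions, not real ranking parameters.
        """
        excess = max(0.0, load_seconds - target)
        return 0.5 ** (excess / half_life)

    # A slightly slow page loses a little; a very slow page loses a lot.
    print(load_time_penalty(1.5))   # 1.0 (no penalty)
    print(load_time_penalty(2.5))   # ~0.71
    print(load_time_penalty(1800))  # ~0.0 (the 30-minute page)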



There's already a cutoff built into the stack in the form of timeouts: browsers have network request timeouts, and servers have settings like Apache's KeepAliveTimeout. A longer load time can have a valid reason; that's not for search engines to decide.

Also, your example of a year is absurd. It's like saying you should drive a BMW because you would get killed in a Tesla if you crashed at a million mph.

What Google's page-speed factor actually does is differentiate between a 1.5-second and a 2.5-second load time. That has nothing to do with the quality of a page.
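To make that concrete: if speed is folded into the ranking as a multiplier (a hypothetical scoring scheme, not Google's published formula), a faster but less relevant page can come out on top:

    def ranking_score(relevance: float, load_seconds: float) -> float:
        # Hypothetical blend: relevance scaled by an exponential speed
        # penalty (same made-up 1.5 s target / 2 s half-life as above).
        excess = max(0.0, load_seconds - 1.5)
        return relevance * 0.5 ** (excess / 2.0)

    # The more relevant page (relevance 0.9) loads in 2.5 s;
    # a thinner page (relevance 0.7) loads in 1.5 s.
    print(ranking_score(0.9, 2.5))  # ~0.64
    print(ranking_score(0.7, 1.5))  # 0.70 -> the faster, worse page wins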


You're assuming users can't handle the time factor themselves. Users try somewhere else when a page doesn't load, and in my opinion that's fine. Granted, it's only my opinion; perhaps most users don't agree, and here we are.

If it is truly the best information, it is what I want, and I will wait. If I trusted that Google could give me the best information, I would be willing to wait. I give up easily now because I can't trust Google to give me quality results. I will not wait for an ad-bloated news aggregator.


When a user does a search, clicks on a result, finds that it doesn't have what they want, hits back, and then clicks on another result, that is a failure of the search engine. "Users can decide the site is worthless after they click" is an opportunity for somebody to come up with a better system.


I guess I simply don't agree. The search engine should give me the best information, period. It does not, and one of the reasons is that it is busy prioritizing things that aren't relevant to the content.


That example doesn't reflect real-world differences in page load time, however. IMO the load-time arbitration doesn't belong in the "search" engine. If your search engine fetches 10 fast-loading pages that lack the depth of information you need, it loses utility.

The tricky part is also that Google does not inform the user that it is prioritising faster-loading pages over the pages with the most accurate match. That is a little dishonest, too.



