A further problem is that Wikipedia is chock-full of nonsense, with a large proportion of articles that were never fact-checked by an expert, and many that were written to promote various biased points of view, that inadvertently repeat claims from slanted sources without scrutiny, or that mischaracterize claims made in good sources. Many if not most articles emphasize the wrong subtopics, omit important basics, and contain routine factual errors. (This problem is not unique to Wikipedia by any means, and despite its flaws Wikipedia is an amazing achievement.)

A critical human reader can go as deep as they like in examining claims there: look at the source listed for a claim, often click through to read the claim in the source itself, examine the talk page and article history, search through the research literature to figure out where the claim came from or how it mutated in passing from source to source, and so on. But an AI "reader" is a predictive statistical model, not a critical consumer of information.



Just the other day, I clicked through to a Wikipedia reference (a news article) and discovered that the citing sentence grossly misrepresented the source. Probably not accidental, since it was about a politically charged subject.


> many that were written to promote various biased points of view, that inadvertently repeat claims from slanted sources without scrutiny, or that mischaracterize claims made in good sources.

Yep.

Including, if not especially, the ones actively worked on by the most active contributors.

The process for vetting sources (both in terms of suitability for a particular article and general "reliable sources" status) is also seriously problematic, especially for any topic that fundamentally relates to the reliability of journalism and the media in general.


A future problem will be that the BBC and the rest of the Internet will soon be chock-full of nonsense, with a large proportion of articles that were never fact checked by a human, much less an AI.


Wikipedia is pretty good for most topics. For anything even remotely political, however, it isn't just bad; it is one of the worst sources out there. And therein lies the problem: its quality varies wildly depending on the topic.


Wikipedia is bad even for topics that aren't particularly political, not necessarily because the editor was trying to be misleading, but because they were lazy: they wrote up their own misconception and either made up a source or pulled one without bothering to actually read it. These kinds of errors can stay in place for years.

I have one example that I check periodically just to see if anybody else has noticed. I've been checking it for several years, and it's still there: the SDI page claims that Brilliant Pebbles was designed to use "watermelon-sized" tungsten projectiles. This is completely made up; whoever wrote it was probably conflating "rods from god" proposals, which commonly feature tungsten, with "pebbles". The sentence is cited, but the sources don't back it up. It's been up like this for years, and the error has now been repeated on many websites, all post-dating the change on Wikipedia.

If you're reading this and are the sort to edit Wikipedia... don't fix it. That would be cheating.


> If you're reading this and are the sort to edit Wikipedia... don't fix it. That would be cheating.

Imagine if this were the ethos regarding open source software projects. Imagine Microsoft saying 20 years ago, "Linux has this and that bug, but you're not allowed to go fix it because that detracts from our criticism of open source." (Actually, I wouldn't be surprised if Microsoft or similar detractors literally said this.)

Of course Wikipedia has wrong information. Most open source software projects, even the best, have buggy, shite code. But these things are better understood not as products but as processes, and in many (but not all) contexts the product at any point in time has generally proven, in a broad sense, to outperform its cathedral alternatives. The process breaks down, though, when pervasive cynicism and nihilism reduce the number of well-intentioned people who positively engage and contribute rather than complain from the sidelines. Then we land right back at square 0. And maybe you're too young to remember what the world was like at square 0, but it sucked in terms of knowledge accessibility, notwithstanding the small number of outstanding resources, which were often inaccessible because of cost or other barriers.



