Casual readers of The Wikipedia Review, in combing through its thousands of topics, are often surprised to find relatively little material about what many reporters, bloggers, and other commentators usually perceive to be Wikipedia’s biggest problem: namely, the many inaccuracies, and occasionally outright falsehoods, that are sometimes found in various Wikipedia articles.
There are several reasons for this. Among our members there are those who support Wikipedia, and those who oppose it. The supporters don’t particularly want to draw attention to inaccuracies, and would rather simply fix them - but the opposers don’t always want to draw attention to them either, because then the inaccuracies would be fixed by others, and it’s always more interesting to see how long a problem lasts than it is to point it out. Moreover, the mainstream media already does a fairly decent job of finding mistakes, or more likely being informed of mistakes, and reporting on them. Last but not least, inaccuracy stories are really not all that interesting after you’ve read several dozen of them, all making the same points about Wikipedians’ lack of expertise, lack of editorial oversight, and lack of any number of other things.
Wikipedians, of course, have a set definition of what constitutes “responsible criticism”: Responsible criticism is anything that focuses on isolated incidents of inaccuracy, or perhaps certain types of bias, without trying to delve too deeply into the question of why any given inaccuracy or bias-related incident occurred.
By their reasoning, most problems with articles should properly be blamed on “vandals,” whose work is usually “reverted” within a short period of time, sometimes in less than a minute. And they’re doing things to help correct the more subtle inaccuracies too, in the form of increasingly rigid sourcing and citation policies, so-called “fact-tagging,” formal content-dispute mediation processes, and so on.
These are all good things, to be sure, and to be fair, most Wikipedians are conscientious, well-meaning people who just want to make their website as nice and shiny as they can. But the key point here is that all of these problems can be solved by growth in users, increased participation, and increased editorial and administrative control. Here at Wikipedia Review, we prefer to focus on irresponsible criticism, which to a diehard Wikipedian often means the mere mention of anything that can’t be solved except by changing the rules, changing the software, and above all, changing the people who are running - and in some cases, ruining - the show.
Among the more important problems Wikipedia hasn’t really been able to solve:
- Administrator burnout and attrition
- Multiple-account use (and abuse)
- Incomprehensible priorities in development of new software features
- Lack of accountability due to systemic anonymity
- Plagiarism and misappropriation of images
- Misuse of administrative tools and privileges
- The endless, agonizing “inclusionist” vs. “deletionist” debate
- “Gaming the system” by skilled, clever manipulators
- User expert-credential fraud
The list goes on, but you get the picture. What happens when the labor supply runs out? Or the money to run the servers? Where does the money come from? Who’s responsible if someone gets seriously libeled? Who’s getting enriched by Wikipedia, and who’s getting impoverished by it?
None of this has much to do with the fact that for several weeks, an article about some little town in Canada described it as the drug capital of the world, full of drunks, and smelling like a “giant pit of rat dung.”
Someday, probably someday very soon, intrepid journalists will start asking some of those questions. Until then, there’s always Wikipedia Review, eh?