Wed 4th April 2012, 8:49pm
QUOTE(Emperor @ Wed 4th April 2012, 3:01pm)
I just tried to read some more. So boring. NewYorkBrad really needs his head examined. He should be billing $200/hour to deal with life-sucking crap like this.
I agree - as far as the discussion itself is concerned, "boring" and "life-sucking" are putting it mildly. But the subtext here is what's interesting. The real ongoing conflict on Wikipedia has never been between "inclusionists" and "deletionists," or even "free culture" vs. "content standards" (which to some of them amounts to the same thing). The real conflict is between bots and humans, and the people in the middle are the ones programming the bots, like Rich Farmbrough. It's a behind-the-scenes conflict that most people don't understand or even know about, but the winner (if there ever is one) is going to determine the long-term future of Wikipedia.
The reason for this is hard to make sense of, but to be as concise as possible, it goes like this. "Vandals" provide fuel for "vandal fighters," and human vandal-fighting is essential to Wikipedia because it allows the hierarchy of established users to give new, less-talented writers and "editors" a means of in-game reputational development. It starts their addictions a-rolling while conveniently shunting them off into an activity that doesn't mess up any actual content or impinge on already-established territories. But most vandal-fighting activity has already been taken over by bots, cutting off a key avenue of recruitment. As the bots get "smarter," vandal-fighting will become increasingly difficult and less game-like, which will only accelerate that process.
And yet Wikipedia can't afford to be seen as putting the brakes on bot development in general, because that would look irresponsible, as if they're simply "letting the vandals win." So, their solution is to paint the bot developers as "irresponsible" themselves, effectively putting on those brakes while also addressing (albeit dishonestly) their real problem.
As humans, we tend to root for humans in human-machine conflicts - but I'm afraid this is a case where the humans are the bad guys. The machines/bots (and their developers), while also bad guys, are at least doing the right thing - even though it's probably for the wrong reason. And, of course, their mistakes are also magnified by virtue of sheer volume.