
Is Wikipedia a Reliable Source? Part II

Posted on Monday, March 30, 2015 at 11:43 AM

Editors find it convenient. But can it be trusted?

By William Dunkerley

--"We use it for background..."

--"It's a great starting point for research..."

--"I personally only use Wikipedia as a jumping off point..."

--"I use Wikipedia primarily for a quick check on information..."

These are a few comments last month's survey elicited from editors. Admittedly, I use Wikipedia a lot myself, too.

I remember some years ago offering statistics I picked up from Wikipedia while making a point to my physician. She responded, "Where did you get that from?" She sneered when I said Wikipedia. At the time I thought to myself that this doctor was behind the times in ignoring such a great new information resource as Wikipedia.

But after preparing this two-part series for Editors Only, I've learned to use Wikipedia with a great deal more caution. I don't trust it as much as I used to.

A fundamental premise of Wikipedia is that anyone can become an editor at will. Any such editor can enter new information or change text that is already there. Supposedly, through an ongoing process of editing and reediting by various people, a better encyclopedia article will eventuate.

That is probably a good premise if all the editors are doing is polishing the language so that it can be better understood by readers. Beyond that, the process can be problematic. This is particularly true when large segments of the editing population see facts differently.

Laying Down the Editorial Law

To try to keep things on the up-and-up, Wikipedia has three basic article policies: (1) no original research, (2) neutral point of view, and (3) verifiability.

"No original research" means that an editor cannot enter information that comes from his or her own expertise or personal experience. So, for instance, if paleontologists discover bones from a prehistoric animal, they can't report that in Wikipedia themselves. But if a local newspaper comes over to do a story on the discovery, it becomes legitimate fodder for inclusion.

There are pros and cons to this approach. On one hand, it guards against a self-appointed "paleontologist" claiming a scientific discovery when in reality all he found were some cow bones. On the other hand, it means that legitimate scientific information is subjected to the non-expert editorial handling of a local newspaper. None of the Wiki material is supposed to come from a primary source.

"Neutral point of view" sounds like a reasonable concept, too. An encyclopedia is no place for advocacy or proselytization. But in portraying a neutral point of view, Wikipedia editors are left to consider only point-of-view inputs that have already been published.

A recent example comes from Hillary Clinton's use of a private email server during her tenure as Secretary of State. Most media piled on Clinton, questioning whether her server arrangement was proper. A Wikipedia article on the topic said, "The scandal was the subject of the 'cold open' on Saturday Night Live." Note the use of the word "scandal." Those who are at political odds with Clinton certainly regard the issue as a scandal. Her supporters consider it an insignificant issue that is being overblown.

Then an editor other than the one who wrote the scandal statement changed the word "scandal" to "controversy." That seems to me a successful application of the neutral point of view rule, even though the preponderance of media coverage appeared to play to one point of view. Neutral is neutral, and preponderance of one particular point of view should not add up to neutrality.

A different example shows an unsuccessful application. In an article discussing the start of the still-ongoing Ukrainian crisis, one editor wrote, "Many protesters joined because of the violent dispersal of protesters [by the police]." Some contend, however, that it was the protesters who were violent and that the police were attempting to restore order. So those are two opposing points of view. A second editor changed the term "violent dispersal" to "forceful dispersal." That would seem to be descriptive without taking sides. But that edit was reversed by another editor, who called the "forceful dispersal" term "POV pushing, weasel-ing." The editor trying to introduce neutrality apparently gave up at that point.

I've noticed that there is a cadre of devoted Wikipedia editors. They do not seem to be subject matter experts, but they have gained expertise in the extremely complex Wikipedia rules and guidelines. They've developed their own culture and jargon. Some seem to have an earnest interest in improving the content, others appear to enjoy simply playing the role of Wiki-rule cops, and yet others use their extensive familiarity with Wiki rules to protect their own favored point of view from corrections attempted by novice editors. The deck is stacked against anyone who has not made Wiki editing a special avocation.

"Verifiability" is a concept that is essential for implementing the prohibition against primary source material (no original research). On this subject, writing in Technology Review (October 20, 2008), Simson Garfinkel explained: "Verifiability is really an appeal to authority -- not the authority of truth, but the authority of other publications. Any other publication, really. These days, information that's added to Wikipedia without an appropriate reference is likely to be slapped with a 'citation needed' badge by one of Wikipedia's self-appointed editors. Remove the badge and somebody else will put it back. Keep it up and you might find yourself face to face with another kind of authority -- one of the English-language Wikipedia's 1,500 administrators, who have the ability to place increasingly restrictive protections on contentious pages when the policies are ignored."

A Different Original Focus

My two articles about Wikipedia actually started off being developed as a series on reader-generated content. But I came to find so much that was disturbing about Wikipedia that I changed the focus.

The tipping point occurred when I wrote to the Wikimedia Foundation. (That's the organization that hosts Wikipedia and raises funds for it.) I invited their representative to share some information on their experience with audience-generated content.

The response I got said, "Unfortunately, we cannot accommodate your request due to current time constraints." It was signed by "Dasha Burns, On behalf of the Wikimedia Foundation."

So I had written to Wikimedia but received a response not from the organization itself, but from someone answering on its behalf. I checked the domain name in Burns' email address. It is "minassianmedia.com."

Out of curiosity I went to the website at that address. All that's there is a business card–type page with the company name, address, and phone number, plus the descriptive line "Content + Communications."

Still curious, I googled the company name. That brought me to a website of The Tribeca Disruptive Innovation Awards. It identified the president of Minassian Media as Craig Minassian.

And when I googled his name, what did I find? He is the chief communications officer at the Clinton Foundation. I wondered if this meant the Clinton family has Wikipedia under its thumb.

All this googling was being done around the time the NBC Brian Williams story was a big item. He had been caught exaggerating the danger he was in while visiting a world trouble spot.

I remembered that Hillary Clinton had been caught in a similar imbroglio over her visit to Bosnia, claiming to have exited her airplane amidst enemy fire. Video from that event proved her story to be false, like Williams' story.

How did the Wikipedia coverage of these similar predicaments compare, I wondered. Here's what I found: The Williams fib story takes almost 1,000 words to tell. It's full of clickable links that take you to other sites to substantiate the story.

Hillary's fib is covered in just 40 words and seems to gloss over the fabrication. This Wikipedia entry has just one footnote. It points to a book published five years ago, and there's no clickable link to any content other than the authors' names and the page numbers, not even the name of the book. It's effectively a dead-end reference.

Then I thought that the great difference in the fib coverage might be a result of the freshness of the Williams story, whereas Hillary's became news back around 2008. So I checked out another famous fib: that of Dan Rather, who touted a false story about George W. Bush's questionable National Guard service. This was from 2004, older than Hillary's story. It got around 1,000 words in Wikipedia. So the short treatment of Hillary's fib seems to have nothing to do with how far back it was.

Certainly this is not an exhaustive examination of Wikipedia manipulation. But it sure looks suspicious.

Some Advice

In closing, here are a couple of tips you can use to gain insight into whether a Wikipedia article is reliable:

--First look on the "View History" page associated with the article. There you can see all the editing that's been done to the article in the past. Does it look like the edits were done to improve the article? Or do you see dueling editors trying to push one version of the facts over another?

--Also look at the article's Talk Page. There you'll see discussions that have gone on between the various editors who have an active concern about the article. Do they seem to know what they are talking about? Or is the discussion merely at the level of enforcing Wiki editorial policies and rules, irrespective of the truth? Some of the discussions I've seen have lost touch with what the article's all about. That's a bad sign.

Most of all, though, my advice for safely using Wikipedia is this:

Caveat lector -- let the reader beware!

William Dunkerley is principal of William Dunkerley Publishing Consultants, www.publishinghelp.com.

