Regardless of who wins Tuesday’s presidential election, the impact of the 2016 race is sure to be felt long after the last votes are counted and the last yard signs are thrown away. Among the important issues raised in this year’s campaign are questions any investigator can appreciate, namely: when is information credible, and how can one tell?
Facts on Trial
The Washington Post recently published an interesting piece on the challenge of editing Wikipedia’s biographical articles on Donald Trump and Hillary Clinton. As an open-source encyclopedia, Wikipedia allows users to make edits to its millions of articles, relying on a team of volunteer editors to weed out misleading or downright false information. In a campaign as contentious and nasty as this year’s, you can imagine the Wikipedia editors have had their work cut out for them. According to the article, Wikipedia users have made nearly 12,000 edits to the Clinton and Trump articles since the candidates announced their runs. That’s a lot of fact-checking!
Uncovering the Misleading Facts
The good news is that it appears the vast majority of Wikipedia’s volunteer editors take their job very seriously. For a free online encyclopedia, Wikipedia is remarkably accurate. Even so, it is susceptible to human error. And unlike a professionally edited encyclopedia, it’s also susceptible to what a team of researchers studying the site calls Wikipedia vandalism. These researchers, from the University of California, Santa Cruz, the Polytechnic University of Valencia and the University of Pennsylvania, define Wikipedia vandalism as article modifications made in bad faith, such as those intended to mislead readers. In fact, these researchers reported in 2011 that 7 percent of Wikipedia edits qualify as acts of vandalism. Some might consider that rate low given the potential for abuse, but it is alarmingly high for those of us concerned about accuracy and high standards in reference publishing. After all, it’s not as if those 7 percent of vandalism edits are marked clearly for all to see. They are lurking somewhere in Wikipedia’s vast store of information, hidden until its editors uncover and fix them.
Fortunately, these same researchers helped develop a tool, called STiki, to help Wikipedia editors identify edits that may be vandalism. Wikipedia reports that this tool has helped editors revert, or undo, more than one million questionable edits. Even so, if the tool does flag a case of vandalism, there’s no guarantee it will be reverted promptly, meaning an unsuspecting user might still read the vandalized information and take it as fact before it is fixed.
Do You Know Where Your Encyclopedia Article’s Been?
One way Wikipedia users can help protect themselves against vandalism, or even innocent mistakes, is by clicking the View history tab at the top of each article. There you will see a list of every revision made to the article. Reviewing the most recent edits, in particular, can help you spot changes that may not yet have been reviewed by Wikipedia’s editors and which therefore may be questionable.
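The same revision history shown on the View history tab is also available programmatically through Wikipedia’s MediaWiki API, which can be handy if you review many articles. Below is a minimal sketch in Python: it builds a revision-history query URL and flattens the API’s nested JSON response into simple (timestamp, user, comment) tuples. The article title and the sample payload are invented for illustration; only the API endpoint and parameter names come from the MediaWiki API itself.

```python
from urllib.parse import urlencode

def build_history_url(title, limit=10):
    """Build a MediaWiki API URL requesting an article's most recent revisions.

    This is the same data shown on the article's "View history" tab.
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",  # fields to return per revision
        "rvlimit": limit,                    # how many recent revisions
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

def summarize_revisions(api_response):
    """Flatten the API's nested response into (timestamp, user, comment) tuples."""
    pages = api_response["query"]["pages"]
    revisions = []
    for page in pages.values():
        for rev in page.get("revisions", []):
            revisions.append((rev["timestamp"], rev["user"], rev["comment"]))
    return revisions

# Illustrative payload in the API's response shape (all values are made up;
# a real response comes from fetching the URL built above):
sample = {
    "query": {
        "pages": {
            "12345": {
                "title": "Example article",
                "revisions": [
                    {"timestamp": "2016-11-07T12:00:00Z", "user": "EditorA",
                     "comment": "reverted unsourced change"},
                ],
            }
        }
    }
}

print(summarize_revisions(sample))
```

Scanning the resulting timestamps and edit comments makes it easy to see how recently an article was changed, and whether a suspicious edit has already been reverted by another editor.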
As investigators, we would never rely on Wikipedia as a primary source of information. But, like a search engine, it can be a great tool for identifying primary sources. I often use Wikipedia simply for the References section at the bottom of each article. While I know I can’t take information in a Wikipedia article at face value, the information’s footnoted source is often a great next step in my research.
In investigations, as in politics, knowing the source of information is nearly as important as knowing the information itself. As fortunate as we are to live in an age where unprecedented amounts of information are a mere mouse click or screen swipe away, it’s also never been more important to understand where that information comes from.