When Karen Bass, a congresswoman from Los Angeles, emerged in late July as a serious contender to be Joe Biden’s running mate, interest in her Wikipedia page exploded. By that time, the entry had grown to 4,000 words, had been worked over by more than 50 different editors, and was drawing a weekly readership of 360,000. During that flurry of editing, a new section twice appeared below a list of offices Bass has held and legislation she has supported: “Controversy.” It described the “substantial controversy and criticism” Bass had received for her words upon the death of Fidel Castro in 2016, and cited a Fox News report.
Each time, less than an hour later, this addition would be gone—deleted by another Wikipedia editor. Anticipating there might be some pushback against the removal, the editor offered a simple explanation: “Fox News is not enough …”
In those few days, Americans first learning about this obscure potential vice-presidential candidate naturally turned to the internet to fill in the details: Googling her name, clicking on a link shared by a Facebook friend, or looking her up on Wikipedia. Yet where someone wound up getting their information about Bass—who leads the Congressional Black Caucus and had been speaker of the California State Assembly—is hardly a minor matter. It could make all the difference, because while the executives of Google, Facebook, and YouTube seem content to distribute any incendiary reporting that arrives over the transom, the administrators of Wikipedia keep trying to live up to their responsibility as a source for accurate information.
In an aggressive move that is anything but sitting back, a panel of Wikipedia administrators declared in July that Fox News would no longer be considered “generally reliable” in its reporting on politics and science, and that in those areas it “should be used with caution to verify contentious claims.” (Fox News articles on other topics were unaffected.) There were simply too many examples of misleading, inaccurate, and slanted reporting about science and politics for Wikipedia to pass along Fox News articles as part of its broader search for the truth.
And while the decision hasn’t exactly banished Fox News from Wikipedia on those topics—there are still thousands of links to Fox News articles there—it deprives Fox News of the ability to frame how the public interprets political events and politicians on Wikipedia. The changes to Bass’s article that highlighted a Fox News-promoted controversy give a glimpse of the stakes involved.
The attitude of the large platforms toward Fox News couldn’t be more different from Wikipedia’s. Search Google News or YouTube or Facebook and you will find plenty of Fox News reporting on politics and science, and why not? Once you disregard the importance of accuracy and proportionality, Fox News is great for business. Its biased reporting slakes the thirst of a sizable chunk of the public. According to a tally of the top-performing links published on Facebook each day, a Fox News article was number one on three days of a recent seven-day span.
For a digital platform, Wikipedia is refreshingly old-school in its values. Operated by a nonprofit foundation, it certainly isn’t afraid to be boring. And while I, and others, may be quick to read political significance into the decision to minimize Fox News’s influence on Wikipedia, the administrators who announced the changed policy tend to play down the drama. One of those administrators, who is British and goes by the handle Lee Vilenski, took on the matter despite, or actually because of, his lack of interest in politics. His editing usually involves snooker and pool; the only Trump he referenced in a long email exchange with me was Judd, the 30-year-old world snooker champion from Bristol.
In Vilenski’s mind, the question didn’t require much heavy thinking: “We don’t have to assume that Fox is acting in good or bad faith—we simply need to assess if we can trust the information being provided. In this case, a lot of users suggested using our policies that it couldn’t be trusted enough to be ‘reliable’ for these two topics.”
The administrators made clear that they weren’t implementing policy on their own, but summarizing what the community believed, as reflected in a monthlong debate that involved roughly 100 editors. In June, an editor made a formal request that Wikipedia revisit the decision to consider Fox News a generally reliable source. That original conclusion had been reached 10 years earlier, and clearly a lot had changed.
In the debate that followed, our current fraught times spilled out, of course. There were discussions of how Fox News enabled President Trump’s minimization of the dangers of the Covid-19 pandemic, while other big topics included persistent allegations of misinformation about climate change and the bogus claim of so-called “no-go zones” for non-Muslims in British cities like Birmingham.
Defenders of Fox News—and there were some—emphasized its willingness to eventually correct errors and portrayed its biases as a product of a two-party adversarial political system, with MSNBC allegedly just as biased in the other direction. They also pointed to misstatements by highly respected sources like The New York Times on important topics, such as the threat posed by Iraq during the buildup to war.
But ultimately Wikipedia opted for an earnest, rather than cynical, approach to reliability. It chose to believe there is such a thing as reporting without overt bias, just as it believes its encyclopedia publishes articles that are doing their best to be true. The question boiled down to: Could this particular community put its faith in this particular news organization so it could get busy producing an encyclopedia?
“With thousands of active editors at any given time, there must be a consensus on such matters or we would never get anything done,” wrote Primefac, another of the administrators. “Otherwise, we would squabble on everything, from which sources to use to how many spaces after a full stop.” Such earnestness, I hope, is what can save us from the digital nihilism around us. Or perhaps you might call it integrity.
With this latest decision, Wikipedia offers a promising model for digital platforms: rather than focus on the accuracy or social harm of an individual post—and then either remove it or offer some needed context—better to assess whether the creator of that post is interacting with the community honestly on certain subjects and allow or disallow their contributions accordingly. In other words, make the kind of judgments you make all the time as you establish a community group, build a book club, or write an article (for Wikipedia or anywhere else).
Incredibly, Facebook currently employs an affiliate of the Daily Caller as a fact-checker. But if you look at Wikipedia’s guide to sources for its editors, you’ll find that it holds the Daily Caller in even lower esteem than Fox News. The source is marked with a stop-sign icon, which indicates that it “publishes false or fabricated information.”