
Wikipedia Is Not Going to Save YouTube From Misinformation

YouTube CEO Susan Wojcicki. Photo: David Paul Morris/Bloomberg via Getty Images

At a South by Southwest event yesterday, Wired editor-in-chief Nick Thompson asked YouTube CEO Susan Wojcicki about the proliferation of misinformation on the video site. When breaking news happens, opportunistic and/or malevolent users rush to post videos about whatever happened; regardless of their veracity, those videos usually obtain prominent placement in the site’s search results. Then, a related-videos algorithm sends users on to other videos with faulty information.

Wojcicki’s brilliant idea to fix this? Rely on Wikipedia. The site will link to the crowdsourced encyclopedia and other “fact-based” sites adjacent to videos promoting conspiracy theories. It’s a plan not too far off from Facebook’s reliance on third-party fact-checkers to audit flagged stories spreading around the platform.

“When there are videos that are focused around something that’s a conspiracy — and we’re using a list of well-known internet conspiracies from Wikipedia — then we will show a companion unit of information from Wikipedia showing that here is information about the event,” Wojcicki said.

There are more than a few problems with this plan, beginning with the fact that Wikipedia, while mostly reliable, is a website that anyone can edit, and one that has, in the past, become a battleground over contentious topics like climate change.

Maybe more to the point, why is YouTube relying on Wikipedia to do this work? The video site is a subsidiary of Alphabet, parent company of Google, one of the country’s richest and most powerful corporations. Wikipedia is a crowdsourced nonprofit that relies on volunteer labor and has to hold repeated donor drives to sustain itself. (Google confirmed that the company does contribute to the Wikimedia Foundation, which oversees Wikipedia.)

Having to supplement conspiracy-theory videos with third-party fact-checking is kind of embarrassing for Google. Having to rely on a volunteer-driven nonprofit is even more embarrassing. And YouTube is just one aspect of Google’s increasing reliance on Wikipedia: Google’s search engine now uses “featured snippets,” frequently drawn from Wikipedia, to provide users with what is essentially the One Correct Answer. This is a multi-billion-dollar company whose consumer-facing product is, well, information.

YouTube and Google (and Facebook, etc.) insist, frequently, that they don’t want to tell you what to believe. But they elide the fact that they’re already determining what you encounter online in the first place. The distinction between “what you see” and “what you believe” in the context of Google search results is, at best, arbitrary. “You decide for yourself!” is the prevailing attitude, despite the well-established fact that people rarely get past the top few results of any query they enter or any list of recommendations they’re shown. What use is a product that, when you ask it for facts and information, responds with anything less than the definitive?