YouTube Will Link Directly To Wikipedia To Fight Conspiracy Theories
Image Credit: FilmMagic / Getty
By Louise Matsakis
AFTER THE MASS shooting in Parkland, Florida, in February, the top trending video on YouTube wasn’t a news clip about the tragedy but a conspiracy theory video suggesting survivor David Hogg was an actor. The video garnered 200,000 views before YouTube removed it from its platform. Until now, the company hasn’t said much about how it plans to handle the spread of that sort of misinformation moving forward. On Tuesday, however, YouTube CEO Susan Wojcicki detailed a potential solution. YouTube will now begin displaying links to fact-based content alongside conspiracy theory videos.
Wojcicki announced the new feature, which she called "information cues," during a talk with WIRED editor-in-chief Nicholas Thompson at the South by Southwest conference in Austin, Texas. Here’s how it will work: If you search for and click on a conspiracy theory video about, say, chemtrails, YouTube will now display a link to a Wikipedia page that debunks the hoax alongside the video. Here's another example: A video calling into question whether humans have ever landed on the moon might be accompanied by the official Wikipedia page about the 1969 Apollo Moon landing. Wojcicki says that, for now, the feature will only include conspiracy theories that are the subject of "significant debate" on the platform.
Image Credit: YouTube
"Our goal is to start with a list of internet conspiracies listed on the internet where there is a lot of active discussion on YouTube," Wojcicki said at SXSW.
The decision to include links to other websites represents a dramatic shift for YouTube, which has historically existed as a mostly contained ecosystem. It’s also notable that YouTube chose to link out to text-based sites, rather than rearrange its own search algorithm to further favor content from truthful creators and video journalists. One reason for the decision might be that YouTube wants to avoid the perception that it’s rigging its platform to favor certain creators, a criticism it has faced in the past. The approach also spares YouTube from having to censor content outright, or from serving as the ultimate arbiter of truth.
"People can still watch the videos, but then they have access to additional information," said Wojcicki.
Merely placing links to factual information alongside videos won’t solve the company’s moderation problems wholesale. For one, as Zeynep Tufekci at The New York Times and others have pointed out, YouTube’s recommendation algorithm is often how users end up seeing conspiracy theories in the first place. Wikipedia in particular can also be edited by anyone, and it has its own reliability problems with misinformation.
The problem with the recommendation algorithm is that it feeds users ever-more extreme content, sometimes straying from what they searched for in the first place. For example, if you search for a video about the Holocaust, YouTube might recommend that you then watch one about how the tragedy was a hoax. The recommendation system isn’t designed to ensure you’re informed; its main objective is to keep you consuming YouTube videos for as long as possible. What that entails has mostly been an afterthought. Even if every conspiracy video is served up with a Wikipedia article contradicting the information that it presents, there's no guarantee that users will choose to read it over the video they've already clicked on.
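To see why that objective rewards sensational content, consider a toy scorer that ranks candidate videos purely by predicted watch time. This is not YouTube's actual system, whose details are not public; the video list, field names, and numbers are invented for illustration.

```python
# A toy illustration of ranking purely by predicted watch time: the
# scoring function never looks at accuracy, only at expected minutes
# watched, so the most gripping video wins regardless of truth.

videos = [
    {"title": "Holocaust documentary",      "predicted_watch_minutes": 9.0,  "factual": True},
    {"title": "'The Holocaust was a hoax'", "predicted_watch_minutes": 14.0, "factual": False},
    {"title": "Historian Q&A",              "predicted_watch_minutes": 6.0,  "factual": True},
]

def recommend(candidates):
    """Pick the video expected to keep the viewer watching longest."""
    return max(candidates, key=lambda v: v["predicted_watch_minutes"])

# The hoax video wins because nothing in the objective penalizes falsehood.
print(recommend(videos)["title"])
```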
Take, for example, what happens when you search for conspiracy theorist Alex Jones’ videos about the Parkland shooting. After watching one, YouTube recommends you then watch another of Jones’ videos, this time about how the Sandy Hook shooting was a hoax. It doesn't suggest that you watch a factual clip about Parkland or Sandy Hook at all. YouTube’s recommendation system serves to radicalize users, and until that’s fixed, the company will likely continue to suffer from scandals related to misinformation.
YouTube also has yet to define and implement clear rules for when uploading conspiracy theory content violates its Community Guidelines. Nothing in the rules explicitly prevents creators from publishing videos featuring conspiracy theories or misleading information, but lately YouTube has been cracking down on accounts that spread hoaxes anyway.
In the wake of the Parkland shooting, for example, YouTube reportedly issued a “strike” against Jones for uploading a video accusing Hogg of being an actor (this video was separate from the one that trended on the platform). But Jones and his organization InfoWars have been uploading videos to YouTube promoting lies, hate speech, and false conspiracy theories for years, leaving YouTube’s users and creators to guess what’s actually permitted. Often the platform seems to react primarily to public outcry, which makes its moderation decisions inconsistent. Until YouTube outlines a clear policy for how it wants to regulate misinformation, its new text-based links won’t be entirely effective.
Merely serving up factual information has also not been a cure-all for other platforms that have suffered from misinformation scandals, like YouTube’s parent company Google and Facebook. Both Google News and Facebook’s trending bar have surfaced conspiracy theories during breaking news events in the past, despite having plenty of links to more reputable news sites on their platforms. It's remarkable, too, that an enormous platform, flush with advertising cash, has chosen to address its misinformation problem primarily using the work of a donation-funded volunteer encyclopedia.
Another obvious question here is whether Wikipedia and YouTube will be able to keep up with breaking news events that quickly fall prey to conspiracy theories. For example, the Parkland shooting survivors were accused of being actors within hours of the tragedy. It's unclear how quickly YouTube will be able to add links to the thousands of misinformation videos that are uploaded every time a major news event occurs.
Still, YouTube should be applauded for doing something to fight conspiracy theories, especially since adding links to other sites will do nothing to immediately aid its bottom line.