Project description

This piece, titled 'Untrue-Tube: Monetizing Misery and Disinformation', is a responsive data investigation combining YouTube's API, network analysis, graph visualizations, and open data. It uncovers the vast network of 'recommended' videos to which YouTube users are exposed after searching for 'crisis actor', a trigger term tied to the Parkland shooting and to previous American mass shooting incidents, including Sandy Hook. The keyword was associated with a large number of false conspiracy videos that targeted students following the Parkland, Florida mass shooting. The piece is a rich visual and topical exploration of YouTube's 'rabbit hole' conspiracy ecosystem, seen through the perspective of search results and 'up next' videos. It is not just data presentation: the discussion of the evidence weaves in cultural critique and concise commentary on algorithmic transparency, opaque content promotion and marketing tactics, and YouTube platform monetization.

What makes this project innovative?

The published network graphs (i.e., 'maps'), built from YouTube's API using seeded 'recommended videos', show how mass shooting debunking videos are immediately followed by 'false flag' videos, provocative videos with soft questioning (aka 'fake debunking'), suggestive conspiracy videos, and a range of other videos on topics such as pedophilia, the Illuminati, world religions and ethnic groups, and institutions and governments. The project also revealed seemingly peripheral actors such as Infowars operating as central nodes in spreading disinformation across the larger YouTube ecosystem. It was both a timely and focused 'data journalistic response' to a wave of 'trending' conspiracy videos spreading false information and claims about the students in the Parkland shooting.
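Centrality in a recommendation graph of this kind can be quantified directly from the edge list. Below is a minimal sketch using networkx; the file name, column headers, and the choice of PageRank as the centrality measure are illustrative assumptions, not the project's documented method.

```python
# Sketch: rank nodes in the recommendation graph by centrality, assuming
# the ~9,000-video network is stored as a directed edge list
# (seed video -> recommended video). File name, headers, and the use of
# PageRank are illustrative assumptions.
import csv
import networkx as nx

G = nx.DiGraph()
with open("crisis_actor_edges.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        G.add_edge(row["Source"], row["Target"])

# PageRank rewards nodes that many recommendation paths converge on,
# surfacing hubs such as videos from major conspiracy channels.
scores = nx.pagerank(G)
top = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:20]
for node, score in top:
    print(f"{score:.5f}  {node}")
```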

What was the impact of your project? How did you measure it?

This investigation showed that conspiracy videos on YouTube are a well-established, if not flourishing, genre. Scores of news reports, including a New York Times piece and all the stories linked above, followed the initial BuzzFeed and Washington Post pieces on this YouTube 'conspiracy ecosystem'. From the perspective of BuzzFeed Senior Reporter Charlie Warzel:
'A review of the raw data that Albright shared with BuzzFeed News shows how YouTube's recommendation algorithm can push a user deeper into the murky world of conspiracy theories. Albright's initial search for "crisis actor" videos initially surfaces videos about the Parkland children, but then quickly branches off into recommended videos for dozens of other popular conspiracies about subjects including 9/11, the JFK assassination, Waco, the Oklahoma City bombing, Pizzagate, the Illuminati, chemtrails, vaccines, Freemasons, and the Sandy Hook, Aurora, and Las Vegas shootings.'

Likewise, sociologist Zeynep Tufekci noted: 'Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax. What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism....'

Notably, two weeks after this data investigation was published and covered by the Washington Post, BuzzFeed, NBC News, Quartz, the Daily Dot, and others, YouTube CEO Susan Wojcicki announced at this year's South by Southwest that YouTube would begin linking conspiracy-themed videos to related topical Wikipedia articles as a fact-checking/debunking measure. While admittedly not an ideal response, this policy shift, prompted by press coverage and data evidence, shows a quick organizational-level reaction from a company that had until then responded only reactively (i.e., by taking down videos and channels) to the social and cultural impacts of its ranking algorithm and powerful recommendation system.

Source and methodology

The work here is very straightforward and easily reproducible: starting from 250 videos returned through YouTube's API for the search term 'crisis actor', I gathered the 'up next' recommendations for each video in this first group of results (i.e., the 'seeds'). In all, this produced a list of approximately 9,000 conspiracy-themed YouTube videos. The full data results were posted (shared as CSVs) on data.world.
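For readers who want to reproduce the collection step, below is a minimal sketch assuming the YouTube Data API v3 via google-api-python-client. The 'relatedToVideoId' parameter on search.list existed at the time of this project (Google has since deprecated it); the API key placeholder, result counts, and variable names are illustrative assumptions.

```python
# Sketch: collect 'crisis actor' seed videos, then the recommendations
# seeded by each one, assuming the YouTube Data API v3.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
youtube = build("youtube", "v3", developerKey=API_KEY)

def search_videos(query, total=250):
    """Page through search results until ~total video IDs are collected."""
    ids, token = [], None
    while len(ids) < total:
        resp = youtube.search().list(
            q=query, part="id", type="video",
            maxResults=50, pageToken=token,
        ).execute()
        ids += [item["id"]["videoId"] for item in resp["items"]]
        token = resp.get("nextPageToken")
        if not token:
            break
    return ids[:total]

def related_videos(video_id, limit=25):
    """Fetch recommendation-style results seeded by one video.
    Note: relatedToVideoId was available when this project ran;
    it has since been deprecated by Google."""
    resp = youtube.search().list(
        relatedToVideoId=video_id, part="id,snippet",
        type="video", maxResults=limit,
    ).execute()
    return [item["id"]["videoId"] for item in resp["items"]]

seeds = search_videos("crisis actor")
edges = [(seed, rec) for seed in seeds for rec in related_videos(seed)]
print(f"{len(seeds)} seeds -> {len(edges)} recommendation edges")
```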

Technologies Used

Current YouTube API, Gephi, DMI YouTube tools, Google Sheets, and data.world to share the data results (CSV files)
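Gephi's CSV importer recognizes an edge list with 'Source' and 'Target' headers, so getting the collected pairs into the visualization step can be as simple as the sketch below; the placeholder edge pair and file name are illustrative assumptions.

```python
# Sketch: write seed -> recommendation pairs as a Gephi-readable edge list.
import csv

# Pairs from the collection step above; placeholder shown for self-containment.
edges = [("seedVideoId", "recommendedVideoId")]

with open("crisis_actor_edges.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target"])  # headers Gephi auto-detects
    writer.writerows(edges)
```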
