This piece, titled ‘Untrue-Tube: Monetizing Misery and Disinformation’, is a responsive data investigation that combines YouTube’s API, network analysis, graph visualizations, and open data. It uncovers the vast network of ‘recommended’ videos to which YouTube users are exposed after searching for the trigger term ‘crisis actor’, a keyword associated with the Parkland shooting and previous American mass shootings, including Sandy Hook. The term was attached to a large number of false conspiracy videos that targeted students following the Parkland, Florida mass shooting. The piece is a rich visual and topical exploration of YouTube’s ‘rabbit hole’ conspiracy ecosystem, seen through the lens of search results and ‘up next’ videos. It is not just data presentation: the discussion of the evidence includes cultural critique and concise commentary on algorithmic transparency, opaque content-promotion and marketing tactics, and YouTube platform monetization.
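The data-collection approach described above (seed a search term, then repeatedly follow ‘up next’ recommendations to build a video network) can be sketched as a breadth-first crawl. This is a hypothetical reconstruction for illustration, not the project’s actual code: the injected `fetch_related` function stands in for whatever YouTube API calls were used to retrieve each video’s recommended videos.

```python
from collections import deque

def crawl_recommendations(seed_ids, fetch_related, max_depth=2):
    """Breadth-first walk over 'up next' recommendations.

    seed_ids: video IDs returned by the initial keyword search.
    fetch_related: callable mapping a video ID to a list of
        recommended video IDs (in practice this would wrap the
        YouTube API; it is injected here so the traversal logic
        is self-contained).
    Returns a list of (source, recommended) edges suitable for
    loading into a graph-analysis tool.
    """
    edges = []
    seen = set(seed_ids)
    queue = deque((vid, 0) for vid in seed_ids)
    while queue:
        vid, depth = queue.popleft()
        if depth >= max_depth:
            continue  # stop expanding beyond the chosen hop limit
        for rec in fetch_related(vid):
            edges.append((vid, rec))
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, depth + 1))
    return edges
```

The resulting edge list is the raw material for the kind of network analysis and graph visualization the piece presents; each hop outward from the seed search mirrors a user following one more ‘up next’ suggestion.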
What makes this project innovative?
What was the impact of your project? How did you measure it?
As BuzzFeed News reported: 'A review of the raw data that Albright shared with BuzzFeed News shows how YouTube's recommendation algorithm can push a user deeper into the murky world of conspiracy theories. Albright's initial search for "crisis actor" videos initially surfaces videos about the Parkland children, but then quickly branches off into recommended videos for dozens of other popular conspiracies about subjects including 9/11, the JFK assassination, Waco, the Oklahoma City bombing, Pizzagate, the Illuminati, chemtrails, vaccines, Freemasons, and the Sandy Hook, Aurora, and Las Vegas shootings.' Likewise, sociologist Zeynep Tufekci noted, 'Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax. What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism....'
Notably, two weeks after this data investigation was published and covered by the Washington Post, BuzzFeed, NBC News, Quartz, the Daily Dot, and others, YouTube CEO Susan Wojcicki announced at this year's South by Southwest that YouTube would begin linking conspiracy-themed videos to related topical Wikipedia articles as a fact-checking/debunking measure. While admittedly not the ideal response, this policy shift, made in response to press coverage and data evidence, shows a quick organizational-level reaction from a company that had thus far responded only reactively (i.e., by taking down videos and channels) to the social and cultural impacts of its ranking algorithm and powerful recommendation system.
Source and methodology