YouTube's Recommendation Algorithm Has a Dark Side

It leads users down rabbit holes

Illustration: Thomas Pitilli

It was 3 A.M., and the smoke alarm wouldn't stop beeping. There was no fire, so I didn't need to panic. I just had to figure out a way to quiet the darn thing and tamp down my ire. I had taken out the battery and pushed and twisted all the buttons to no avail.

Luckily for me, the possible solutions were all laid out in the YouTube tutorial I found. The video helpfully walked me through my options, demonstrating each step. And the fact that it had hundreds of thousands of views reassured me that this might work.

YouTube has become the place to learn how to do anything, from assembling an Ikea cabinet to making a Bluetooth connection with your earbuds. It is a font of tutorials, some very good, some meandering, some made by individuals who have become professionals at it and rake in serious sums through advertising. But many are uploaded by people who have solved something that frustrated them and want to share the answer with the world.


The native language of the digital world is probably video, not text—a trend missed by the literate classes that dominated public dialogue in the predigital era. I've noticed that many young people start their Web searches on YouTube. Google, which owns YouTube, also highlights videos in its search results.

“How do I” assemble that table, improve my stroke, decide if I'm a feminist, choose vaccinations, highlight my cheeks, tie my shoelaces, research whether climate change is real...? Someone on YouTube has an answer. But the site has also been targeted by extremists, conspiracy theorists and reactionaries who understand its role as a gateway to information, especially for younger generations.

And therein lies the dark side: YouTube makes money by keeping users on the site and showing them targeted ads. To keep them watching, it utilizes a recommendation system powered by top-of-the-line artificial intelligence (it's Google, after all). Indeed, after Google Brain, the company's AI division, took over YouTube's recommendations in 2015, there were laudatory articles on how it had significantly increased “engagement”: Silicon Valley–speak for enticing you to stay on the site longer.

These “recommended” videos play one after the other. Maybe you finished a tutorial on how to sharpen knives, but the next one may well be about why feminists are ruining manhood, how vaccinations are poisonous or why climate change is a hoax—or a nifty explainer “proving” the Titanic never hit an iceberg.

YouTube's algorithms will push whatever they deem engaging, and it appears they have figured out that wild claims, as well as hate speech and outrage peddling, can be particularly so.

Receiving recommendations for noxious material has become such a common experience that there has been some loud pushback. Google did ban a few of the indefensibly offensive high-profile “creators” (though not before helping them expose their views to millions of people), and recently the company announced an initiative to reduce recommending “borderline content and content that could misinform users in harmful ways.” According to Google, this content might be things like “a phony miracle cure for a serious illness” or claims that “the earth is flat.” The change, the company says, will affect fewer than 1 percent of all videos.

While it's good to see some response from Google, the problem is deep and structural. The business model incentivizes whatever gets watched most. YouTube's reach is vast. Google's cheap and nifty Chromebooks make up more than half the computers in the K–12 market in the U.S., and they usually come preloaded with YouTube. Many parents and educators probably don't realize how much their children and students use it.

We can't scream at kids to get off our lawn or ignore the fact that children use YouTube for a reason: there's stuff there they want to watch, just like I really needed to figure out how to unplug that beeping catastrophe at 3 A.M. We need to adjust to this reality with regulation, self-regulation and education. People can't see how recommendations work—or how they're designed to keep eyes hooked to the screen. We could ask for no YouTube or “no recommendations” for Chromebooks in schools.

This is just the tip of the iceberg of the dangerous nexus of profit, global scale and AI. It's a new era, with challenges as real as that iceberg the Titanic did hit—no matter what the video claims.

Zeynep Tufekci is an associate professor at the University of North Carolina whose research examines how technology, science and society interact.

This article was published with the title “YouTube Has a Video for That” in Scientific American Magazine Vol. 320 No. 4, p. 77.
doi:10.1038/scientificamerican0419-77
