Can Algorithms Really “Radicalize”?
YouTube is most often described as an engine of “radicalization” in the context of violent extremism, but the term applies in other ways as well.
In a long piece in August, the New York Times explained how “YouTube Radicalized Brazil,” as the headline read. A young man starts off watching guitar videos but quickly gets put onto far-right extremist political rants by the platform’s recommendation engine, the way a librarian might hand you the next book to read. The article notes that the writer and scholar Zeynep Tufekci has called YouTube “one of the most powerful radicalizing instruments of the 21st century.”
In March of this year, the Times’s Kevin Roose talked to YouTube’s chief product officer about “online radicalization.” Seventy percent of the traffic on YouTube is driven by recommendations, which makes clear just how influential the algorithm’s suggestions are, even though they might not look like much: a strip of thumbnails on the right side of your screen, or the next video cued up to auto-play.
The Daily Beast also labeled YouTube a “radicalization machine.” The Internet is “turning us all into radicals,” says Maclean’s. A would-be school shooter underwent “self-radicalization” online, according to The Middletown Press. One study identified Internet “hate highways”: series of increasingly under-the-radar online spaces that “radicalize individuals.”
I’ve been struck recently by the use of that term, radicalize, to describe and critique the process of consuming what an algorithm serves you or what the Internet allows you to access. The articles above seem to define radicalization as a passive process, more the fault of the recommendation engine than the consumers themselves — who, lately, often seem to be isolated white men who are indoctrinated into hate groups. They just can’t help it as YouTube progresses from Joe Rogan middlebrow to Alex Jones conspiracist.
For much of the past two decades, “radicalization” was most often deployed in the context of terrorism in the Middle East, first Al Qaeda and later ISIS. It described the ways that an otherwise average person could be driven to join extremist movements. Radicalization could happen through influence from a local religious figure, a group of peers, a radio broadcast, or, as the 2010s wore on, a video online. Plenty of people were “radicalized” by the U.S. military itself as it charged bloodily through Afghanistan and Iraq. Fox News “radicalizes” Americans.
Now that radicalization is talked about more in an online capacity than off, I have two concerns. The first is what the word actually means in this narrower context; the second is the veneer of passivity it implies, as if we have surrendered our politics to The Algorithm.
YouTube is most closely linked with radicalization, per the headlines. Maybe Twitter radicalizes, too, as one of the company’s executives has admitted, but that’s more because of the unfiltered, unmoderated content on the site than the actual recommendation algorithm.
But we don’t say that anyone gets “radicalized” by, say, Spotify, right? In part this is because Spotify has more upfront gatekeeping (or curation) than YouTube, at least for now. The platform is largely made up of licensed music published by record labels, which presumably are less willing to take risks on extremist content. It’s a smaller, more policed pool of 30 million songs instead of the 5-billion-plus videos that anyone can upload.
Still, maybe we do get radicalized by Spotify, in a way. If “radicalization” means “getting more and more extreme,” then it could also describe the process of delving into niches of culture that get increasingly obscure the farther you proceed down the pathways of recommendations.
Here’s a confession: I only recently started using Spotify with any regularity. Its recommendations don’t have much data on me to work from, so the tips aren’t very accurate. But even as I started playing some mixture of Bill Evans and Ryuichi Sakamoto, it began sending me some pretty avant-garde ‘70s jazz — stuff that I personally like, but I’m sure other listeners could find repellent.
YouTube does the same thing for music that it does for politics: if you engage with extreme stuff, it’ll find ways to up the ante to keep you watching or listening. As on Spotify, I’ve had the experience on YouTube of moving from Brian Eno’s famous Music for Airports to the kind of ambient synth music that doesn’t sound like anything at all. We become desensitized by the overflow of content and thus seek out, and get fed, something more hardcore.
This can be a good, interesting thing. Radicalization is in some ways key to developing an identity as a cultural consumer, like starting out with the equivalent of Harry Potter YA lit and progressing to Ursula K. Le Guin, or moving from easy listening to Miles Davis. Granted, radicalization of taste is not the same thing as radicalization of politics. I hope no one listens to Japanese musique concrète and then starts spouting conspiracy theories. But it is a kind of radicalization nonetheless.
Radicalization certainly happens via Instagram. Think of all the influencers propping up elaborate vacations with sponsorship deals and the aggregation accounts bombarding us with luxury interiors. We followers become more extreme in our imaginations and desires — we get radicalized — by this imagery, a development that is reinforced by the community that social networks create around such content. We share posts around, infecting each other with the desire for expensive linen, obscure streetwear, or trips to Bali.
The algorithms of digital content platforms are meant to radicalize. They are designed to seize on our most acute impulses and serve them up so we don’t get bored and take our attention elsewhere. But this brings up the second point, which is that we are not just passive consumers like so many fattened foie-gras ducks. The algorithms might suggest more extreme content, but it is the platforms’ owners who allow it to exist and the users who decide to click on it. The problem is ultimately human.
I actually agree somewhat with the YouTube executive whom the Times interviewed, who noted that the recommendations serve both less and more extreme content at any given time. “It’s equally — depending on a user’s behavior — likely that you could have started on a more extreme video and actually moved in the other direction.”
In these narratives, ascribing to The Algorithm alone the power to “radicalize” strips away the agency we have as readers and listeners, agency we should want to preserve. It also ignores the fact that it’s not recommendations that necessarily make someone an extremist, but society’s tendency toward racism, misogyny, hatred, and violence in the first place.
In this Substack newsletter, I’m writing weekly dispatches about technology and culture, focusing on how algorithms influence the ways we create and consume cultural artifacts online. If you enjoy this, please hit the like button below and subscribe: