If you haven’t noticed, Instagram has gotten terrible lately. My main feed is overwhelmed with recommended posts and videos from accounts that I don’t even follow. Instagram Stories, once a haven of real-time moments from friends, is now a similar mélange of videos, ads, and brand promotions. (It knows I’ll swipe through on any real-estate ad.) The problem, I think, is that Instagram has gotten too algorithmic. The ratio of stuff I asked for in my feed to stuff I didn’t ask for and don’t want has gotten too low. That might be spiking engagement for Instagram as its parent company tries to make all of its products more like TikTok, but it alienates users.
Algorithmic recommendations are supposed to bring us what we like, but lately it feels more like they deliver only what is most expedient for the platforms they run on. For Instagram, that means pushing video content instead of the photos it was originally designed for. This mismatch between desire and algorithmic outcome is becoming stark. I encounter it a handful of times every day: Google Maps reroutes my drive seemingly at random because it thinks it has found a better route; TikTok delivers me an avalanche of nothing but cooking videos because I liked a few of them; Twitter loads a list of only vaguely related tweets under each tweet I open. These glitches make me feel like the algorithms don’t really understand me at all.
In the midst of research for my upcoming book Filterworld, I came upon a 2018 study that used the term “algorithmic anxiety,” which immediately felt like the perfect label for my feelings of alienation. I could never tell why or how a particular algorithmic recommendation was working, and I was left second-guessing its effects, wondering if it was really giving me the best answer. The academic researcher Shagun Jhaver used “algorithmic anxiety” to describe the way Airbnb hosts tried to manipulate the platform’s search algorithm to give their listings higher rankings; they were stressed out because they didn’t really know how they were being evaluated.
But we consumers of content have algorithmic anxiety, too. We don’t know why we’re recommended a specific thing, or why our feeds are suddenly filled with one homogeneous theme or another: all food videos, all crypto tweets. My latest essay for The New Yorker is “The Age of Algorithmic Anxiety,” an exploration of this idea and of our fraught relationships with automated digital feeds.
The other week, I sent out a request to this newsletter list to participate in an algorithms survey, asking for your stories of weird recommendations and feed glitches. You can still take the survey here. I got over 125 great responses, and I really appreciate everyone who took the time to participate; I also included some of the responses in my New Yorker essay. Here are a few quotes that stuck out to me as being redolent of algorithmic anxiety:
Answering a question about “odd run-ins” with automated recommendations, one user reported that, after he became single, Instagram began recommending the accounts of models, and another had been mystified to see the Soundgarden song “Black Hole Sun” pop up on every platform at once. Many complained that algorithmic recommendations seemed to crudely simplify their tastes, offering “worse versions of things I like that have certain superficial similarities,” as one person put it. All but five answered “yes” to the question, “Has ‘the algorithm,’ or algorithmic feeds, taken up more of your online experience over the years?” One wrote that the problem had become so pervasive that they’d “stopped caring,” but only because they “didn’t want to live with anxiety.”
Anyone who uses social media long enough has anecdotes like these, moments when they were misperceived or misled by a machine. We’re constantly contending with our algorithmic shadows: the masses of data we produce about our identities and preferences, which in turn shape what our feeds show us, and those feeds are the principal way we consume so much media and culture these days. Sometimes the shadow feels correct; sometimes it doesn’t. To torture another metaphor, algorithmic recommendations are like a warped mirror: we can see ourselves, but only as filtered through the fundamental structure of the platform itself, which isn’t a one-to-one reflection.
Shagun Jhaver wasn’t the first person to come up with “algorithmic anxiety,” though he did define it in the clearest way. The AI scholar Kate Crawford used the phrase as early as 2013, and another academic, Patricia de Vries, used it to title her blog in 2016. We’ve been feeling this anxiety for quite a while now. For de Vries in particular, “the algorithm” was less a specific piece of technology than an idea that we human users have built up in our own heads. “When you focus so much on what algorithms can do or cannot do, that tends to obscure all the forces around it that make it happen,” she told me. Sometimes we fall into an “overdetermination of the algorithmic”: we assign it too much power, too much authority. In other words, our anxiety is partly our own fault.
As I was writing this essay, the draft title in my head was “The Algorithm as Metaphor,” which maybe I’ll write out as a separate piece. It’s a riff, of course, on Susan Sontag’s great 1978 book-length essay Illness as Metaphor. Sontag’s subjects were tuberculosis and cancer, the latter of which she was suffering from herself at the time. The diseases had become “encumbered by the trappings of metaphor,” she wrote. They were turned into narratives and mythologies: tuberculosis gave people sexual charisma; cancer shouldn’t be spoken of, or else it would kill faster; having cancer became “fighting” a “war.” Yet, Sontag pointed out, these were in reality simple matters of biology and chemistry, for scientific medicine to resolve. The metaphors got in the way of the people suffering from the diseases.
Similarly, “the algorithm” is both a real technological device, feeding you content that’s like other content you like, and an overwhelming narrative: the sense that digital platforms are subsuming our selves and warping our desires, and that we have no way of reclaiming our agency if Facebook doesn’t let us. I’m not so pessimistic. My writing is something of an attempt to shake loose the hold that “the algorithm” has on our psyches, explaining it in order to reduce it to something more mundane.
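In that demystifying spirit, here is a minimal sketch, entirely my own illustration rather than any platform’s actual code, of the mundane core of “content that’s like other content you like”: score each candidate item by how similar it is to the items a user already liked, then surface the closest matches. The item names and feature numbers below are made up for the example.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(liked, candidates, top_n=3):
    """Rank candidates by average similarity to the items the user liked."""
    scored = []
    for name, features in candidates.items():
        score = sum(cosine(features, lf) for lf in liked.values()) / len(liked)
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)[:top_n]]

# Toy feature vectors along three made-up axes: [cooking, real estate, music]
liked = {"pasta video": [1.0, 0.0, 0.1], "loft tour": [0.0, 1.0, 0.0]}
candidates = {
    "risotto video": [0.9, 0.0, 0.2],
    "open-house reel": [0.1, 0.9, 0.0],
    "grunge playlist": [0.0, 0.1, 1.0],
}
# The cooking and real-estate items rise to the top; the playlist does not.
print(recommend(liked, candidates, top_n=2))
```

Real recommendation systems layer engagement prediction, ranking models, and business rules on top of this, but the basic move, more of what resembles what you already responded to, is about this plain.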
Once again, here’s the New Yorker essay, “The Age of Algorithmic Anxiety.”
It feels revolutionary to me that Twitter allows me to consume what I follow in chronological order. Sometimes it is more boring than the algorithmic Home feed, but that’s on me!
But now we’re asked (repeatedly!) if we mind being catered to.