Protesting the Algorithm (& recent writing)
Some observations on the algorithm as metaphor and my new writing.
Kyle Chayka · Aug 17, 2020
Hi it’s me! I’ve been writing more about design for The New Yorker’s website. Here are two recent fun pieces that revolve around visual criticism:
The North American Maximalism of Drake and Gigi Hadid: On celebrity home-decorating and the 21st-century Gilded Age.
Visualizing Coronavirus with Andrew Cuomo’s Pandemic Poster: On the governor’s elaborate visual metaphor for New York’s pandemic journey.
Observation: Protesting the Algorithm
(What do you call it when something’s not quite an essay, not an article, and not a blog post? I guess it’s just random noticing.) This morning I woke up to this tweet, of protestors in the U.K. shouting “Fuck the algorithm,” in a much faster and peppier tone than my American accent would predict:
The problem being protested is a new algorithm that assigned grades for the “A-level” exams that 18-year-old students were unable to sit this year because of the pandemic; U.K. universities use those grades to decide which students to extend offers to. The system measured teachers’ predicted grades against each school’s historical record and tried to keep results in line with past years, applying a curve of sorts.
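To make the mechanism concrete, here is a deliberately toy sketch of what “curving individual results toward a school’s history” can look like. This is not Ofqual’s actual model — the grade scale, the weighting scheme, and the `weight` parameter are all my own illustrative assumptions:

```python
def moderate_grade(teacher_prediction: int, school_history_mean: float,
                   weight: float = 0.6) -> int:
    """Pull a teacher-predicted grade toward the school's historical average.

    Grades are on a made-up 1-6 scale (A* = 6). `weight` controls how much
    the school's past record dominates the individual prediction -- an
    invented parameter, purely for illustration.
    """
    adjusted = weight * school_history_mean + (1 - weight) * teacher_prediction
    return round(adjusted)

# A student predicted an A* (6) at a school that historically averages
# a C (3) gets pulled down; the same prediction at a historically
# high-scoring school survives intact.
print(moderate_grade(6, 3.0))  # 4 -- downgraded two grades
print(moderate_grade(6, 5.5))  # 6 -- unchanged
```

Even in this cartoon version, the bias the students are protesting falls straight out of the design: the individual's own performance matters less than where they went to school.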
Yet students are finding that the new system is handing out different scores than they expected. Some are even having university offers revoked because the algorithm output a score significantly lower than their teachers had predicted. As The Verge reports:
35.6 percent of grades were adjusted down by a single grade, while 3.3 percent went down by two grades, and 0.2 percent went down by three. That means a total of almost 40 percent of results were downgraded.
One teacher found that 85% of her students received grades lower than they had been led to expect. Obviously it came as a shock: students build their futures on which school they get into. The algorithm was also fundamentally biased: average students at historically better-performing schools (often private schools, or schools in more expensive areas) saw smaller downgrades overall. The government quickly scrapped the entire thing.
What’s particularly interesting to me here is the protest chant: “Fuck the algorithm.” (It’s getting taken up on Twitter already.) We assign responsibility for a lot of things to a nebulous entity called “the algorithm.” Sometimes people thank the algorithm for good song recommendations in the comments of YouTube videos; other times they blame the algorithm (Netflix’s, for example) for not showing them the TV shows they’re interested in.
It’s as if we’re praising or decrying some distant, omnipotent god for the randomness of human life: the algorithm can either help us or hurt us if it so chooses. But these cases aren’t about a single, unified algorithm; they’re a bunch of different systems created by disparate businesses and organizations to achieve different goals. “The algorithm” is an equation that makes decisions; the equation’s designers are the problem. When British students blame “the algorithm,” they’re really blaming the sloppiness of Boris Johnson’s government and a faulty, untested tool.
As the artist Matthew Plummer-Fernández points out, protests against algorithmically regulated labor have already broken out among gig-economy workers for apps like Deliveroo and Uber. In addition to “the algorithm,” those are protests against the bad labor laws that allowed the apps to get this far in the first place, and against the companies willing to push those laws to their limits. The U.K. episode is making headlines because it shows that it’s not just blue-collar workers experiencing this technological manipulation; it’s everyone up and down the economic spectrum.
When we talk about “the algorithm,” I think what we’re describing is our experience of digital automation at a scale that subsumes our lives. Experiences or processes that are “algorithmic” are not-human. They happen at such a speed and volume that each decision or recommendation could not possibly be evaluated by humans. Thus they’re often dysfunctional. “Fuck the algorithm” — it’s as good a commandment for the next decade of technology as any, as long as we remember that people are still ultimately responsible for that disastrous automation.
If you like this piece, please hit the heart button below! It helps me reach more readers. Email me any thoughts or things you’d like me to look into and subscribe here. Or:
— Follow me on Twitter
— Buy my book on minimalism, The Longing for Less
— Read more of my writing: kylechayka.com