by Jeremy Grossman, PhD
First, in order to not mislead: this article is not about dating, not about finding love on TikTok, not about witnessing love or performing love or anything else related to the content and creators on the app itself. This article is, rather, about the love that TikTok demands of its users, which is not a romantic love, but is definitely the kind of emotional investment that crafts preferences and habits. In short, this article is about love of and for the algorithm: a feature that TikTok has, against all odds, absolutely succeeded in implementing. It is both a feat and a feature.
Algorithms have become perhaps the principal reference point in contemporary debates over technology and ethics, not to mention the Turing-test-inflected crisis in education regarding students' potential use of tools like ChatGPT to write their term papers. The kinds of questions that surface in these debates tend to center on trust and decision making: do I trust a human or a machine to make good decisions? Which do I trust to come to a proper ethical stance about an issue? They center on the politics baked into the technics of machine learning, which result in machines coming to the wrong sorts of conclusions about, say, criminality and race, because of the way the models privilege repetition and history in their pursuit of prediction. Or the way they differentiate, or fail to differentiate, between the faces of similarly raced people. They center on both the instrumental status (how good are they?) and the ethical status (do we think they're good?) of tracking and targeting on the internet, specifically in the context of advertising and marketing.
But they also center on the status of a feeling: do people like algorithms? The question is not a particularly useful one, because algorithmic technologies are at work in virtually every corridor of our everyday lives. If people even notice them, they are usually not particularly bothered by them: looking up recipes, tracking and analyzing a workout, letting the GPS find the best way across town during rush hour. In these contexts algorithms are not only uncontroversial, they're desirable; they are the feature itself.
But on social media, this is not always the case. One of the promises of social media, from its earliest manifestations, was that self-curation sat at the absolute center of the experience. Whereas traditional content models (which were of course not called CONTENT at the time) were predefined for audiences, social media allowed users to populate their own experience with the kinds of people and things they actually wanted to see. And as algorithmic capabilities emerged alongside the proliferation of different kinds of content (and, of course, the proliferation of self-serve advertising), there also emerged a new motive structure for platforms: serving people the kind of content that keeps them on the app is tantamount to profiting from the app. And the presumption, which nobody can quite refute, if we're honest, is that using algorithmic means to determine which content does that was an important step toward creating a generation of social media addicts (and, not incidentally, budding YouTube-curated neo-Nazis).
Except, as we've seen most recently with Threads, people sort of like it when they can control their feeds, because people have entirely different motive structures than the tech companies that own social media platforms and are under immense pressure to extract money from their users on behalf of advertisers. When Threads launched on July 5, 2023, to much fanfare and an explosion of activity, users clamored for a chronological feed consisting solely of accounts they follow. Part of this was because the first few days of Threads consisted mostly of brands overusing the word "bestie" and joking about social media managers under fire from their bosses for being too spicy on Threads, and the experience of the app was primarily one of forced exposure to this embarrassing spectacle. But part of the demand for a "Following" feed followed precedent: Twitter (sigh...X) includes both algorithmic and following feeds, and in January of 2023 it introduced the ability to make the following feed the default upon app launch. It's a feature these platforms know they must offer, even at the expense of wider adoption of the slot-machine content curation algorithms they work so hard to tell people they should prefer.
The puzzle to crack, then, was always how to make most people prefer the algorithm. Enter TikTok.
One of the most fascinating dimensions of TikTok's ascendance was the dual observation that a) users spend most of their time on the "for you page" (FYP) instead of their following feed, and b) they love that. There's an argument here related strictly to form, and it's almost certainly true at some level that exclusively short-form video content, curated into a bare-bones, never-ending descent through clip after ever-loving clip, produces a psychological architecture of repetition that is, at a very basic level, pleasurable for many people. Freud knew it!
But the algorithmic curation that The Guardian calls TikTok's "secret sauce" inserts another dimension: that realm of so-called personalization that, by turns slowly and rapidly, transforms a new user's FYP into what they perceive to be a singular habitus, a place specifically and especially crafted just for them. They come to feel that the algorithm knows them -- really knows them! -- and their interests, that it understands them and works dutifully to make sure each special video will delight them. People are not dumb, so they know this fantasy isn't true, and couldn't possibly be true for hundreds of millions of people simultaneously, but that little puncture isn't enough to destroy the sense of it. And the delight that comes from being shown videos, without having chosen them, that match right up with their desires, preferences, interests, and fixations is beyond enticing. It's addictive.
The other phenomenon created by this particular personalization arrangement is the blurring of lines between pre-existing desires and interests and ones that are themselves constituted by repetitive exposure to content. The wet dream of algorithmic social feeds is that self-reinforcing outcomes are both proof that content is popular and the direct mechanism for encouraging popularity. Not only does this create situations in which arbitrary content is gamed directly by TikTok itself, it also lets the platform have its cake and eat it, too: the most widely circulated content gets high engagement, and that high engagement justifies the wide exposure. All of a sudden, I find myself inexplicably invested in metal detectors and permaculture gardens, which, while sort of adjacent to some of my actual interests, were definitely not preexisting concerns of mine, and may or may not be by next month. But the repetition feels satisfying nevertheless, and the presumptive reassurance by a computer that it knows what I like is somehow less concerning when the affect is the point.
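For the technically curious, this cake-and-eat-it logic can be made concrete with a toy simulation. The Python sketch below is emphatically not TikTok's actual ranking system; it's a minimal illustration, under the single assumption that exposure is weighted by accumulated engagement, of how a feed can manufacture the very popularity it then claims merely to measure.

```python
import random

# Toy sketch of the exposure/engagement feedback loop described above.
# Every video is given the SAME fixed chance of being engaged with, so
# none is intrinsically "better" -- yet weighting exposure by past
# engagement still produces runaway winners (a rich-get-richer dynamic).

random.seed(42)

# Start every video with one unit of engagement.
videos = {f"video_{i}": 1 for i in range(10)}

for _ in range(10_000):  # each iteration = one impression served
    # Exposure is weighted by accumulated engagement...
    shown = random.choices(list(videos), weights=videos.values())[0]
    # ...while the chance of engaging is identical for every video.
    if random.random() < 0.5:
        videos[shown] += 1

# A few early lucky videos dominate, and their dominance now
# "justifies" their continued exposure.
for name, score in sorted(videos.items(), key=lambda kv: -kv[1]):
    print(name, score)
```

Run it and the engagement counts spread wildly apart despite every video being statistically identical: the feed's circular proof of popularity, in ten lines.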
And to me, that's the key here. Last year, I taught an upper-level summer course on social media in which we spent a few days mulling over some of the questions at the heart of this article. Many of my students expressed that they took some amount of pleasure in the extent to which the algorithms seemed to know them and their interests, but, at a more conceptual level -- the technical dimensions of it, the ethics of it, the utility of it -- they just really didn't seem to care that much. The core of this ambivalence wasn't the question of privacy, although some of them expressed concerns about that. What they were most ambivalent about was the conceptual question of the relationship between personalization and social media consumption, at the very same time that they kind of enjoyed it.
So, from my perspective, what TikTok has managed to do that other platforms have not done as successfully is cleave apart the conceptual conversation about algorithms, tracking, and privacy from the affective enjoyment of the effects of those technologies. For some reason, whether related to form or marketing or history, knowledge of the unsavory dimensions of the TikTok algorithm (including the nationwide obsession over Chinese data farming) does not provoke the kind of hemming and hawing you usually see when people try to explain why they keep doing something they otherwise feel they should not be doing.
In other words, to paraphrase one of the great exploiters of unconscious enjoyments and their tropology, the biggest and darkest triumph of TikTok's rise to ubiquity was getting people to learn to stop worrying and love the algorithm.