The Death of Serendipity: What Algorithmic Personalization Is Doing to Your Mind

Leslie Poston:

Welcome back to PsyberSpace, the weekly podcast where we help you understand your world. I'm your host, Leslie Poston, and this week we're talking about something that used to be completely ordinary: stumbling into things you didn't go looking for. It used to be normal to pull a book off the wrong shelf at a used bookstore and end up carrying it around for years, or to catch a strange film at 1 AM that you couldn't have searched for because you didn't yet have words for what it was. Almost all of us have heard a song on the radio and found it stuck with us in a way that something we deliberately queue up never quite does. That kind of accidental encounter has just been part of how people move through culture in ordinary life.

Leslie Poston:

Now most discovery is mediated by algorithmic systems built to predict what you'll probably like next. While that sounds efficient and can feel useful, something real is disappearing in the process: serendipity. Let's take a look at what that is and what it costs us. Serendipity gets treated as a kind of pleasant extra. Charming when it happens, but not essential.

Leslie Poston:

I would love to push back on that. Serendipity is one of the mechanisms through which we encounter what we couldn't have planned for, and that has real psychological value that we don't always credit. When you search for something, you're working from an existing map of your interests and following it forward. Serendipity works differently. It introduces material that lies outside your map entirely, things you wouldn't have known to search for because you didn't yet have the conceptual vocabulary to articulate them.

Leslie Poston:

Some of the most significant shifts in how people think and what they care about happen precisely because something arrived before they had a frame for it. If you want to look at it through the lens of social media rather than your everyday life, look at TikTok as an example. Serendipity was its algorithm's secret sauce in the early days, and under new ownership, that serendipity is completely lost. Older media environments had more room for this, partly because they were messier. Browsing in a library or a record store without a search interface meant moving through adjacency, whatever happened to be shelved near what you came for.

Leslie Poston:

Magazines and newspapers built that adjacency into their structure on purpose so that something you cared about sat right next to something you'd never have gone looking for. None of that was ideal, and plenty of it was shaped by exclusion and gatekeeping, but it did create more opportunities for accidental encounters. Personalization increasingly organizes discovery around prediction instead, and prediction tends to be conservative by design. It works from what you've already done and treats the past as the best available guide to the future, even if that's wrong. The argument for serendipity isn't sentimental.

Leslie Poston:

There's a substantial body of research on what novelty does to the mind that goes back decades. Novel stimuli are intrinsically motivating. People seek them out not because they lead to an external reward, but because novelty itself functions that way. The brain doesn't just tolerate the unfamiliar. Under the right conditions, it actively craves it.

Leslie Poston:

More recent neuroscience has added to this picture by showing that novelty activates the locus coeruleus, a brainstem region involved in arousal and attention, and that this activation is associated with enhanced learning and memory consolidation. Uncertainty in moderate doses doesn't just make things more interesting, it makes them stick better. Berlyne identified the inverted-U relationship between complexity and curiosity. Things that are too familiar generate no curiosity at all, and things that are too alien generate anxiety rather than interest. And the most generative zone is somewhere in the middle, novel enough to engage your mind, but familiar enough to be tractable.

Leslie Poston:

That framework has held up well, and what it suggests is that environments engineered to minimize surprise may actually undercut the conditions under which people learn most effectively because they keep pulling experiences toward the too familiar end of the curve. There's a related mechanism called habituation. When the brain is repeatedly exposed to the same kind of stimulus, it stops responding with the same intensity. This is adaptive. You don't want to be equally startled by everything forever.

Leslie Poston:

But it also means that a media environment that consistently serves you minor variations of what you already know may gradually reduce your capacity to be genuinely engaged by it. The stimulation is still there, but the response flattens. You're consuming without being activated in the ways that drive learning, memory, or the kind of interest that leads somewhere worth going. There's another well established psychological principle that seems to cut in the opposite direction: the mere exposure effect. The mere exposure effect describes a pattern that has been replicated extensively across cultures and stimuli.

Leslie Poston:

The more we're exposed to something, the more we tend to like it, even in the absence of any other information about it. Zajonc showed this with nonsense words, geometric shapes, Chinese characters shown to people who couldn't read them, and photographs of strangers. The effect doesn't require conscious recognition. It's operating below awareness through a process called processing fluency, something we've talked about on the podcast before. When the brain encounters something familiar, it processes it more easily, and that ease of processing gets misattributed as positive feeling.

Leslie Poston:

We like what we know, partly because knowing it makes it effortless. This matters for how we understand what recommendation systems are doing. Platforms aren't just removing serendipity and predicting your preferences. They're actively shaping them through the mechanism that Zajonc identified. Every time a system serves you content similar to what you've engaged with before, it's not just responding to your taste.

Leslie Poston:

It's reinforcing and deepening a narrow taste through repeated exposure. So your apparent preference for a particular genre, aesthetic, or ideological frame isn't only a stable trait being accurately read by an algorithm. It's also in part a product of how often you've been shown that thing. The algorithm reflects the preference back. The reflection strengthens the preference, and this generates more signal for the algorithm, and so on.
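That loop, reflection strengthens preference, which generates more signal, which strengthens the reflection, can be sketched as a toy simulation. To be clear, everything here (the function name, the update rule, the numbers) is a hypothetical illustration of the dynamic, not a model of any real platform's recommender:

```python
import random

def simulate_feedback_loop(n_genres=5, n_rounds=200, seed=42):
    """Toy model of the exposure-preference feedback loop.

    Each round, a simplified 'recommender' serves the genre the user
    currently prefers most, and each exposure nudges that preference
    upward, a crude stand-in for the mere exposure effect. All values
    are illustrative assumptions, not empirical parameters.
    """
    rng = random.Random(seed)
    # Start with nearly uniform preferences, plus a tiny random tilt.
    prefs = [1.0 + rng.uniform(0.0, 0.05) for _ in range(n_genres)]
    for _ in range(n_rounds):
        shown = max(range(n_genres), key=lambda g: prefs[g])  # serve the current favorite
        prefs[shown] += 0.05  # exposure slightly increases liking
    return prefs

prefs = simulate_feedback_loop()
# A tilt of a few hundredths compounds into a runaway favorite,
# while the other genres never get shown and never grow.
```

The point of the sketch is that no step in the loop requires the system to "know" anything deep about you; a negligible initial difference, amplified by repeated serving, is enough to produce what looks like a strong, stable taste.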

Leslie Poston:

Mere exposure reaches its maximum effect within roughly 10 to 20 exposures. And research has found that at some point, further repetition can actually reduce liking. Familiar content becomes predictable, and predictable content eventually becomes boring. Platforms are threading a needle here, serving you enough familiarity to remove serendipity and activate processing fluency and the associated positive affect, but with enough variation to stay just clear of your boredom threshold. The result tends to be adjacent novelty: things that feel new but operate within the same basic parameters as what you already know.

Leslie Poston:

True serendipitous surprise, the kind that activates the novelty seeking system Berlyne described, is a different thing, and it's harder to engineer into a retention optimized system. This narrowing is easy to miss because digital environments look abundant. Streaming platforms, social feeds, and recommendation engines seem full of options, and the sheer volume creates the impression of openness. Researchers in the recommender systems field have been wrestling with this tension for years. It's a well established problem that a system can be highly accurate at predicting what a user will engage with while simultaneously failing to expose them to anything genuinely outside their existing patterns.

Leslie Poston:

The field has developed what it calls "beyond-accuracy" metrics specifically to measure serendipity, diversity, and novelty as distinct from mere relevance, because relevance alone turns out not to be the same thing as a good recommendation in any deeper sense. What tends to get served isn't true difference but adjacent similarity. You engage with this, so here are things that operate in the same genre, mood, aesthetic, or ideological neighborhood. It can create a feeling of exploration while producing something closer to refinement within an already established conservative range. Taste often develops through detour.

Leslie Poston:

People end up in generative places because one unexpected encounter led to another in a chain that no predictive system would have recognized as coherent. That serendipitous process is nonlinear and somewhat inefficient. And the inefficiency is part of what makes it generative. Personalization tends to reduce it not by eliminating discovery completely, but by consistently favoring the near neighbor over the true surprise. The things we repeatedly consume don't just entertain us.

Leslie Poston:

They participate in how we understand ourselves. Self-concept isn't a fixed thing you discover about yourself. It's constructed continuously through experience, social feedback, and the stories that you tell about patterns you're noticing in your own behavior. Cooley's concept of the looking glass self, or the idea that we come to understand ourselves partly through our perception of how we're reflected back by our social environment, is over a century old, but it maps onto the algorithmic situation with a kind of uncomfortable precision. Recommendation systems are a kind of mirror.

Leslie Poston:

They reflect a version of you back at you, derived from your behavioral history on a platform, and they do so relentlessly and repetitively. Research on self-concept updating shows that we tend to preferentially integrate feedback that's congruent with our existing self view. We notice and absorb information that confirms what we already believe about ourselves and discount or don't register what doesn't fit. When an algorithm consistently reflects only one slice of your complex behavioral history (the clicks, the watch time, the engagement signals it can measure), that reflected version can start to feel more central and stable than it actually is, with a passing interest hardening into an identity marker simply because the system keeps reinforcing it. Self perception theory adds another layer.

Leslie Poston:

Bem argued that people often infer their own attitudes by observing their own behavior, much the same way they might infer someone else's attitudes. If the algorithm keeps putting a particular kind of content in front of you and you keep engaging with it, you may come to think of yourself as someone who cares deeply about that thing, even if your original preference was mild or circumstantial. We can see this in the pipelines on the Internet, for example the fitness-to-incel/manosphere pipeline or the wellness-to-tradwife pipeline, where these systems and the rewards that they bring shift people's identity in real time. They see what you clicked on, and they build from there.

Leslie Poston:

And over time, this modeled self and the actual self can drift without people noticing because the modeled version is the one that's getting constantly reinforced. Serendipity also has a dimension that extends beyond the individual. Discovery used to happen in more shared spaces, not because everyone had identical taste, but because common channels like broadcast media or physical spaces or just shared cultural moments created accidental overlap. You encountered things because they happened to be in circulation. They were played in public or appeared next to something else in an environment that wasn't curated to your individual behavioral profile.

Leslie Poston:

Researchers studying what is called exposure diversity have argued that this matters not only for individual experience, but for civic and cultural life in ways that are hard to replace. When personalization becomes the dominant model of discovery, people encounter fewer things by accident and have fewer things in common. Two people can use the same platform, live in the same city, and occupy almost entirely different informational and cultural realities, not just disagreeing on things they've both encountered, but lacking any shared frame of reference on which to find common ground at all. And that's a different kind of fragmentation than simple political polarization. It's an epistemological one, and the tools for navigating it are a little less obvious.

Leslie Poston:

Serendipity once helped produce the incidental overlaps that let us encounter difference in low stakes, ordinary ways. When those contact points diminish, social fragmentation loses one of its quieter counterweights. I want to be clear. I'm not arguing that people are simply passive victims of recommendation systems. I don't think the evidence supports that framing.

Leslie Poston:

People deliberately search for things. They ignore recommendations. They move unpredictably, and they use platforms in ways their designers didn't anticipate all the time. Individual agency doesn't disappear because an environment has a particular architecture. However, it does operate within that architecture, and that architecture is not neutral.

Leslie Poston:

The evidence on direct short term political persuasion effects from feed algorithms is actually more mixed than public discourse often implies. Studies have found more limited immediate effects than the scarier versions of the story suggest. The more convincing account is a little slower. The effects of habituation, gradual narrowing, and the normalization of familiarity as the default condition of discovery accumulate over time through repetition and convenience rather than through a dramatic intervention. This also makes them easier to dismiss.

Leslie Poston:

But research on habituation and research on mere exposure both point in the same direction. What you're repeatedly shown shapes what you come to prefer, and what you come to prefer shapes what you seek out. So over time, that can change the range of what feels available to you without any single moment you can point to and say, "That's when it happened." The reason personalization is so effective has less to do with manipulation than with the conditions we're all already living under. Most of us are overextended, time constrained, and making hundreds of small decisions every day.

Leslie Poston:

Decision fatigue is real, and in that context, a system that pre filters options and surfaces something plausibly relevant can feel like a genuine reduction in cognitive load, so it's not irrational to value that. The same mechanism that makes personalization feel helpful, processing fluency, the ease of engaging with the familiar, is also what makes it hard to notice when it's gradually narrowing your range. Because that narrowing doesn't feel like a loss, it starts to feel kind of like relevance. You're not aware of what you're not being shown because that's not how your attention works. You can only notice what's present, and then what's present keeps feeling more and more appropriate.

Leslie Poston:

A deeper problem is that exploration requires a kind of cognitive investment that optimized environments actively undercut. Following a line of interest that doesn't immediately promise payoff, tolerating the discomfort of not yet knowing something, or sitting with something unfamiliar long enough to develop a genuine response to it takes energy. And those things are a little harder to sustain when your surrounding environment consistently offers a more frictionless alternative. Serendipity isn't just a feature of old media. It's a practice.

Leslie Poston:

And like most practices, it atrophies when the conditions that support it disappear. Now, I'm not going to argue for technology purity politics here. Most people are going to keep using platforms, and there's nothing particularly useful about pretending that we aren't. The practical question is how to preserve some space for the unexpected within systems that are increasingly organized against it. Some of this is structural.

Leslie Poston:

Turning off autoplay removes one of the most effective mechanisms for keeping you inside a loop. Browsing a physical library or a bookstore without a target in mind reintroduces real adjacency, that thing you didn't know you wanted sitting next to the thing you came for. Reading outside of your field or your interest or following a writer whose work takes you somewhere you didn't expect to go, or spending time with something you stumbled into rather than something you searched for are all ways of keeping the door open to the kinds of encounter that algorithms are structurally unlikely to deliver. It's also about asking different questions of your own preferences. The mere exposure effect, meaning that what you think you like might be partly a function of what you've been shown repeatedly, is not a reason to distrust all of your preferences.

Leslie Poston:

But it is a reason to hold them a little more lightly and to wonder occasionally what else might be out there. The self perception research available suggests that you're at least partially inferring your own tastes from your own behavior, and that means the behavior the platform generates in you is kind of feeding back into how you understand yourself. Asking whether a preference is genuinely yours or partly an artifact of what you've been consistently offered isn't a question platforms have any incentive to prompt you with, so you have to make it intentional. The goal isn't a friction maximized life or a complete rejection of digital tools. It's preserving the conditions under which your mind stays open to ideas it hasn't encountered yet and versions of itself it hasn't yet become.

Leslie Poston:

So what worries me a little about the loss of serendipity isn't simply that feeds become repetitive. It's that a world organized primarily around prediction might gradually teach us all to expect less. Less from discovery, less surprise, less contact with what we didn't know we wanted, less contact with each other. The psychological mechanisms of all of these techniques make this feel like comfort instead of the constriction that it is. A system built to give you more of what you already seem to like is certainly useful but limited.

Leslie Poston:

It can't imagine who you might become, so you have to imagine who you might become. Leave room for the version of yourself that hasn't shown up in the data yet. Thanks for listening to PsyberSpace. I'm your host, Leslie Poston, signing off. Until next time, stay curious, and don't forget to subscribe so you never miss an episode.
