You don't choose what you see. Systems choose for you, optimizing for engagement, not understanding. Culture is increasingly machine-curated.

Open any social media app. Scroll for thirty seconds. The posts, the videos, the articles: you didn't choose any of them. An algorithm did.
This sounds obvious. Everyone knows feeds are algorithmic now. But the implications run deeper than most people realize. We're not just talking about which posts appear first. We're talking about the systematic reshaping of culture itself by optimization systems that have no concept of cultural value.
The algorithm doesn't know what's good. It knows what gets engagement. These are very different things.
Think about how cultural artifacts used to spread. A song got popular because DJs played it, because friends recommended it, because it appeared in a movie. A book found readers through reviews, word of mouth, bookstore placement. These were messy, human-mediated processes with all kinds of biases and gatekeeping problems.
But they were also processes that involved human judgment at every step. Someone decided this song was worth playing. Someone thought this book deserved a prominent review. The gatekeepers had limitations and prejudices, but they were applying something like aesthetic or intellectual criteria.
Algorithmic distribution works differently. The selection pressure isn't "is this good?" or even "will people like this?" It's "does this maximize time-on-platform?" Content that generates engagement (clicks, comments, shares, watch time) gets amplified. Content that doesn't, disappears.
And engagement isn't the same as value. Outrage is engaging. Anxiety is engaging. Shallow controversy is engaging. The algorithm has no way to distinguish between attention captured by genuine quality and attention captured by manipulation of psychological vulnerabilities.
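To see how little the selection pressure knows about quality, consider a toy ranking function. This is a minimal sketch, not any platform's real code; the Post fields, the weights, and the function names are all invented for illustration. The point is structural: nothing in the scoring function can see quality, only measurable reactions.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int
    comments: int
    shares: int
    watch_seconds: float

# Invented weights: what matters is which signals exist, not their values.
WEIGHTS = {"clicks": 1.0, "comments": 3.0, "shares": 5.0, "watch_seconds": 0.1}

def engagement_score(post: Post) -> float:
    """Score a post purely on measurable reactions.

    Note what is absent: no input for accuracy, craft, or whether the
    viewer felt better or worse afterwards.
    """
    return sum(WEIGHTS[signal] * getattr(post, signal) for signal in WEIGHTS)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed by engagement alone; low scorers effectively vanish."""
    return sorted(posts, key=engagement_score, reverse=True)
```

An outrage thread and a careful essay that happen to produce the same numbers get the same score; the function has no input through which quality could even enter.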
Here's the core problem: the algorithms optimizing our information environment weren't designed to optimize for cultural flourishing. They were designed to optimize for advertising revenue.
The intermediate target is engagement: time spent on platform, interactions generated. The assumption was that engagement would correlate with user satisfaction. Give people what they want, they'll stay longer, everyone wins.
But engagement and satisfaction can diverge wildly. People engage with things that make them angry. They watch videos that make them feel inadequate. They scroll through content that leaves them feeling worse than before. High engagement, low wellbeing. The algorithm sees success; the human experiences something else entirely.
This misalignment compounds over time. Content that generates engagement gets more distribution. Creators learn what works and produce more of it. The entire cultural ecosystem shifts toward engagement-optimized content, whether or not that content is actually good for anyone.
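The compounding dynamic is simple enough to caricature in a few lines. The simulation below is a deliberately crude sketch: the two content styles, the engagement rates, and the drift factor are all invented, and only the shape of the loop matters, not the numbers. Under these assumptions, the creator population converges on whichever style engages more.

```python
# A crude model of the engagement feedback loop. Two hypothetical content
# styles; the rates and the 0.3 drift factor are invented, and only their
# ordering matters for the dynamic.
ENGAGEMENT_RATE = {"reflective": 0.02, "outrage": 0.08}

def simulate(rounds: int = 20) -> None:
    share_outrage = 0.5  # creators start evenly split between styles
    for r in range(rounds):
        # The platform observes engagement and promotes in proportion to it.
        engaged_outrage = share_outrage * ENGAGEMENT_RATE["outrage"]
        engaged_reflective = (1 - share_outrage) * ENGAGEMENT_RATE["reflective"]
        promoted_outrage = engaged_outrage / (engaged_outrage + engaged_reflective)
        # Creators drift toward whatever got distribution last round.
        share_outrage += 0.3 * (promoted_outrage - share_outrage)
        print(f"round {r:2d}: {share_outrage:.1%} of creators make outrage content")

simulate()
```

Run it and the share climbs monotonically toward 100%: not because anyone chose outrage, but because the loop rewards it at every iteration.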
One underappreciated effect: algorithmic curation tends toward homogenization.
When human gatekeepers controlled cultural distribution, their individual quirks created diversity. A radio DJ with unusual taste could break a weird song. An editor with a specific vision could publish something that didn't fit templates. The inefficiency of human curation created space for the unexpected.
Algorithms are more efficient. They find patterns. They learn that certain thumbnail styles get clicks, certain video lengths retain viewers, certain emotional registers generate shares. And then they reward content that matches those patterns.
The result is convergence. Thumbnails start looking the same across platforms. Video essays adopt identical pacing and structure. Articles follow the same formulas. The algorithm finds what works and amplifies it until everything looks like everything else.
This isn't a conspiracy. It's just optimization. But the cultural effect is impoverishment. Diversity requires inefficiency, room for things that don't fit the patterns to survive anyway. Algorithmic efficiency squeezes that room out.
Culture has always been shaped by its distribution mechanisms. The novel emerged alongside print technology. Rock and roll developed with radio. Television created its own cultural forms.
But previous technologies were relatively static. The printing press didn't learn from reader behavior and adjust what got published. Radio couldn't track which moments made listeners change the station and optimize accordingly.
Algorithmic distribution creates feedback loops that previous media couldn't. The system observes what gets engagement, promotes more of it, which shapes what creators make, which changes what gets engagement, which shifts the algorithm's understanding of what works. The loop runs continuously, adjusting in real-time.
Cultural production increasingly happens inside this loop. Creators don't just make things and hope audiences find them. They study analytics. They A/B test thumbnails. They optimize titles for discoverability. They're not responding to audience taste; they're responding to algorithmic interpretation of audience behavior.
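The analytics side of that work is mundane. Stripped of any dashboard, A/B testing a thumbnail amounts to the comparison below; the variant names and click counts are made up for the example.

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of impressions that turned into clicks."""
    return clicks / impressions if impressions else 0.0

# Made-up counts for two hypothetical thumbnail variants.
variants = {
    "A_neutral_still": {"impressions": 10_000, "clicks": 310},
    "B_shocked_face":  {"impressions": 10_000, "clicks": 520},
}

# Keep whichever variant the measured audience preferred, regardless of
# which one the creator thinks better represents the work.
winner = max(variants, key=lambda name: click_through_rate(**variants[name]))
print(f"ship {winner}")
```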
The algorithm becomes a collaborator in creative work. An invisible one that most creators can't fully understand, but whose preferences they must accommodate to reach audiences at all.
Follow the incentives. Algorithmic curation benefits platforms, obviously, since it maximizes the engagement metrics that drive advertising revenue. But it also benefits certain kinds of creators.
If you're good at understanding what the algorithm wants, if you can reverse-engineer the optimization targets and produce content that hits them, you can build enormous audiences. The system rewards those who learn to play it.
This creates a new kind of cultural elite. Not tastemakers in the traditional sense, but algorithm-whisperers. People whose skill isn't making great work but making work that performs well in recommendation systems. Sometimes these overlap. Often they don't.
Meanwhile, creators who won't or can't optimize for the algorithm struggle to find audiences. The poet who just writes poems. The musician who makes music that doesn't fit content-length preferences. The thinker whose ideas don't generate engagement. The algorithm doesn't actively suppress them; it just doesn't amplify them, which in attention-economy terms amounts to the same thing.
We've built a system where culture is mediated by attention markets, and the market-makers are algorithms optimizing for engagement.
This matters because culture shapes how we think, what we value, how we understand ourselves and others. It's not just entertainment. It's the symbolic environment we live in. When that environment is systematically shaped by engagement optimization, the effects ripple outward into everything.
What kinds of ideas spread easily in this environment? Simple ones. Emotional ones. Ones that trigger immediate reaction rather than slow reflection. The algorithm has no patience for complexity. It measures response in seconds, not days or years.
And we absorb this. We internalize the rhythm of algorithmic culture: quick takes, hot reactions, endless novelty. The capacity for sustained attention atrophies when the environment never rewards it.
None of this is inevitable. Algorithms could optimize for different things. Platforms could use different metrics. Distribution systems could be designed around different values.
But this would require treating culture as something other than a market, or at least as a market where the goods traded aren't just attention. It would require accepting less engagement, less growth, less profit. It would require someone deciding that cultural flourishing matters more than quarterly returns.
Who would make that decision? Not shareholders. Not advertising-dependent platforms. The incentives point the other way.
So the most likely path forward is more of the same. More sophisticated optimization. More engagement-maximized content. More cultural production shaped by algorithmic feedback loops. The machines getting better at capturing attention, without ever asking whether what captures attention is worth attending to.
We're in the early decades of a fundamental shift in how culture gets made and distributed. The printing press reshaped culture over centuries. We're maybe twenty years into the algorithmic era.
The full effects aren't visible yet. They're still emerging, still compounding, still working their way through cultural systems that evolved for different conditions.
What's clear is that something has changed. The curation of culture (what gets made, what gets seen, what becomes part of shared experience) is increasingly automated. And the automation serves purposes that have nothing to do with cultural value.
We're all living in a machine-curated environment now. Most of us don't notice because there's nothing to compare it to. This is just how things are. Open the app, see what the algorithm shows you, respond as the system expects.
The machine isn't evil. It's just optimizing. And we're what it optimizes on.