The Great Averaging
How the internet and algorithms are compressing culture—and how we fix it.
A generation of us grew up on the internet.
It globalized us. It exposed us to cultures we would never have encountered otherwise. It created pathways for upward mobility, democratized information, and lowered the barrier to entry for creativity.
But something subtle happened in the process.
While we gained access to everything, we slowly began losing something internal. Not access to culture, but a feel for it. The kind of instinct that tells you what’s yours and what you’ve merely consumed.
The internet did not just connect cultures. It compressed them.
This is not a moral accusation. It is an observation about systems. Monoculture has many origins depending on your ideological lens. Some blame corporate consolidation. Some blame globalization. Some blame market convergence. All of those forces matter, and writers like Katie Jagielnick have examined the broader economic and ecological dimensions of homogenization with real clarity. But this essay focuses on something more specific: the hive-mind effect produced by algorithmic systems, and how AI is quietly making it worse.
The Overlooked Long Tail
The early internet promised the “long tail.” Chris Anderson popularized the idea that once distribution costs approached zero, niche content would thrive. Obscure art, rare perspectives, unconventional creators would all finally find their audiences.
In theory, we should be living in the most culturally diverse moment in history.
In practice, attention became the bottleneck.
I think about this a lot. Production is cheap now. A song costs nothing to distribute. A blog post reaches the world instantly. But attention is finite, and the systems that allocate it do not optimize for cultural richness. They optimize for retention.
Recommendation systems are engineered around engagement metrics: watch time, click-through rate, session duration. They cluster users into behavioral cohorts and reinforce prior preferences through feedback loops. What gets rewarded is what performs. What performs is what looks like what already performed.
They reward hyperfixation. Sensationalism. Performative certainty. Emotional provocation.
Creators who align with these incentives scale. Others adapt or disappear. Over time, replication becomes safer than originality. And I’ve felt this pull myself. When I wrote about media consumption last year, I noticed how much of my own intellectual diet had been shaped by what the feed decided to show me, not by what I had actively sought out.
The result is not diversity. It is convergence.
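The feedback loop described above can be sketched in a few lines. The following toy simulation is hypothetical, not any real platform's ranker: every round it shows whichever item has the best observed click-through rate. All items are given identical true appeal, so any concentration of exposure comes purely from early luck being reinforced by the metric.

```python
import random

def simulate_feed(n_items=50, n_rounds=2000, seed=0):
    """Toy engagement-optimizing ranker (illustrative only):
    each round, show the item with the best observed CTR."""
    rng = random.Random(seed)
    clicks = [1] * n_items   # smoothed counts so CTR starts defined
    shows = [2] * n_items
    for _ in range(n_rounds):
        # rank purely by what performed: estimated click-through rate
        best = max(range(n_items), key=lambda i: clicks[i] / shows[i])
        shows[best] += 1
        if rng.random() < 0.5:  # every item has identical true appeal
            clicks[best] += 1
    return shows

shows = simulate_feed()
print(f"top item's share of exposure: {max(shows) / sum(shows):.0%}")
```

Because the ranker exploits its own estimates and never explores, items that got lucky early accumulate disproportionate exposure even though nothing distinguishes them. That is the compression mechanism in miniature: the metric cannot tell quality from momentum.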
The Algorithmic Narrowing of Taste
Here is what I’ve observed, and what I suspect you’ve noticed too.
The formats start to rhyme. The pacing starts to match. The aesthetics blur together. The discourse patterns repeat. Scroll through any platform long enough and you’ll feel it: a low hum of sameness underneath the apparent variety.
The viewer consumes within narrowing bands. The creator produces within those same bands. Both adapt to what performs, and performance is measured by a system that doesn’t care about originality. It cares about whether you stayed.
Brainrot and internet trends create shared cultural touchpoints. They increase participation. But they also systematically divert attention from slower, riskier, stranger work that doesn’t fit the algorithm’s tempo. The artist brave enough to ignore popular taste, who can’t keep pace with algorithmic fashion, gets buried. Not because the work is bad, but because deviation is penalized by distribution.
This shows up everywhere:
Recent books sound structurally identical, as if run through the same narrative template. Films are engineered for proven emotional pacing rather than genuine risk. Most social media posts are visually interchangeable. Modern software interfaces have converged toward the same flat, minimal design language. Everyone aspires to a suspiciously similar set of life goals. Viewership concentrates in a handful of creators while a long tail of genuinely interesting work goes entirely unnoticed.
Eventually, individuality thins. Taste converges. Culture flattens.
Not because people lack creativity. Because the infrastructure penalizes it.
An important distinction: compression affects distribution, not creation. There is more music, more writing, more visual art being made right now than at any point in human history. The algorithm doesn’t prevent interesting work from being made. It prevents it from being found. That’s a different problem, but it’s not a smaller one.
State of Tech
I work in this space, so let me be specific.
In 2025, AI captured roughly half of all global venture funding. According to Crunchbase, $211 billion flowed into AI-related companies, up 85% year over year. OpenAI and Anthropic alone accounted for 14% of all global venture investment. A third of all startup funding went to just 68 companies that raised rounds of $500 million or more.
This isn’t a broad-based innovation boom. This is capital clustering with extraordinary density.
The Stanford AI Index Report noted that by the end of 2024, the performance gap between the top-ranked and tenth-ranked model on the Chatbot Arena Leaderboard had narrowed to just 5.4%. MMLU scores, once a meaningful differentiator, jumped from 70% in 2022 to over 90% by 2025, to the point where the benchmark had to be partially retired and replaced with harder versions because frontier models had saturated it. Everyone is converging on the same scores, the same architectures, the same benchmarks.
And the downstream effects are visible. The interfaces look similar. The responses sound similar. The developer tools converge. The jargon spreads virally before its substance matures. “Agentic.” “Reasoning.” “Multimodal.” These words circulate like currency, traded more for social signal than for technical precision.
When most funding and attention flows into a single band of technological priorities, everything else starves. Venture investment in media, for instance, has grown just 2.6x since 2010, compared to AI’s 147x over the same period. Innovation doesn’t disappear, but it concentrates. And concentration, by definition, means the periphery goes dark.
The Age of the Dead Internet
There is an old conspiracy theory called the Dead Internet Theory. The original claim, that most online activity is generated by bots and the authentic web died years ago, was easy to dismiss when it first circulated. It is getting harder to dismiss now.
An Ahrefs study of 900,000 newly created web pages in April 2025 found that 74.2% contained AI-generated content. Separate analysis suggests that roughly 30 to 40% of text on active web pages now originates from AI-generated sources. About half of all internet traffic is non-human. At the level of raw content production, the web is becoming predominantly synthetic.
What humans actually engage with is still mostly human. A Graphite study found that 86% of articles ranking in Google Search were still written by people. The dead internet is not yet the consumed internet. But the gap is closing, and the direction is clear.
This matters because of what it does to the data environment. A peer-reviewed study published in Nature in 2024 by Shumailov et al. demonstrated that models trained on their own outputs undergo what the authors termed “model collapse”: the tails of the original distribution disappear first, minority patterns erode, and outputs drift toward bland central tendencies.
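The tail-loss dynamic can be demonstrated with a one-dimensional toy, which is my own simplification and not the Nature study's actual LLM setup: each "generation" fits a Gaussian to the previous generation's output and resamples, after dropping rare samples in the tails (a stand-in for a model underweighting the rare patterns in its training data).

```python
import random
import statistics

def collapse_demo(n=500, generations=20, clip=2.0, seed=1):
    """Toy model collapse: fit, truncate the tails, resample, repeat.
    Returns the standard deviation observed at each generation."""
    rng = random.Random(seed)
    data = [rng.gauss(0.0, 1.0) for _ in range(n)]
    spreads = [statistics.pstdev(data)]
    for _ in range(generations):
        mu = statistics.mean(data)
        sigma = statistics.pstdev(data)
        # rare samples vanish first: drop anything beyond `clip` sigmas
        kept = [x for x in data if abs(x - mu) <= clip * sigma]
        mu, sigma = statistics.mean(kept), statistics.pstdev(kept)
        data = [rng.gauss(mu, sigma) for _ in range(n)]
        spreads.append(statistics.pstdev(data))
    return spreads

spreads = collapse_demo()
print(f"stdev: gen 0 = {spreads[0]:.2f}, gen 20 = {spreads[-1]:.2f}")
```

Each round of truncate-and-refit shrinks the distribution, so after a couple of dozen generations most of the original variance is gone. The outliers never come back: once the tails are dropped, nothing in the resampling process can regenerate them.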
Now, the statistical averaging critique has limits. Agents, tool use, retrieval systems, and human-in-the-loop workflows are all improving. The next generation of AI systems will not be as bluntly constrained by pretraining distributions as today’s models are. The problem is not that AI will forever produce averages. The problem is what happens to the data environment while we wait for those improvements to mature. Every day, the web fills with more synthetic content. Every day, the ratio of authentic human signal to machine-generated noise shifts. Even if future models get smarter about how they generate, they will be drawing from a well that is already being contaminated.
The Dead Internet Theory stopped being a conspiracy theory somewhere around 2024. It became a description.
How AI Centralizes the Lens
The internet corpus that AI models are trained on was already skewed. Most of the pre-LLM web was written by upper-middle-class urban Westerners, largely in English. The models, by default, echo those values, those rhythms, those assumptions. Not because anyone designed it that way, but because that’s what the data looked like.
After pretraining, models undergo alignment processes. Human annotators rank outputs based on helpfulness, safety, and tone. This shapes responses toward normative expectations defined by a narrow set of evaluators.
This is not an argument against alignment itself. Without it, these models would reproduce the internet’s worst impulses unchecked: toxicity, hallucination, manipulation. Alignment is necessary. The problem is not that it exists but that it is concentrated. When a small number of teams, trained at similar institutions, sharing similar priors about what constitutes safe and helpful output, make these decisions for billions of users, the result is a narrowing of acceptable expression that no single team intended but all of them collectively produce. The trade-off between safety and cultural variance is a genuine engineering problem. It deserves to be treated as one, not collapsed into a simple story about corporate control.
xAI positions itself as building a politically centrist Western AI. Anthropic, more safety-focused, tends to lean left. Other labs have bent their outputs toward whatever political winds prevail. OpenAI has shifted its positioning multiple times depending on the cultural and regulatory moment. The result is not a single bias but a small menu of biases, each curated by a company with its own incentives, deployed to billions of users simultaneously.
Two layers of compression emerge: data-level convergence baked in from the internet, and alignment-level convergence imposed by institutional preference.
A small number of labs define the training pipelines, safety constraints, and alignment objectives. This centralizes cognitive infrastructure in a way that has no real historical precedent. We have never before had a situation where a handful of companies in one country shape the default reasoning patterns, tone, and boundaries of thought for hundreds of millions of people across the world.
That alone changes cultural power.
How We Fix It
We do not fix this by rejecting technology. I owe too much of my own intellectual growth to the internet to pretend otherwise. But we do need to reintroduce variance deliberately, because the systems we’ve built are not going to do it for us.
Telling individuals to try harder to find niche content while the algorithm actively buries it is like telling people to solve climate change by recycling more. It helps at the margins, but it doesn’t fix the engine. The fixes have to be structural.
Diversify AI infrastructure. Encourage pluralism in training corpora, alignment philosophies, and model governance. The more centralized the pipeline, the greater the compression risk. We need models trained on different data, aligned by different people, reflecting different assumptions about what “good” output looks like.
Design for exploration. Platforms can optimize for diversity of exposure rather than pure retention. Discovery can be weighted toward variance instead of repetition. This is a design choice, not a technical limitation, and it would change the texture of the internet overnight. The algorithm doesn’t have to be a compression engine. It is one because engagement metrics reward sameness, and no one with the power to change that has been sufficiently incentivized to do so.
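To make the design choice concrete, here is an MMR-style re-ranking sketch, my own illustration rather than any platform's actual system, with hypothetical item data: predicted engagement is traded off against similarity to what has already been selected for the slate.

```python
def rerank(items, lam=0.5, k=3):
    """Greedy diversity-aware re-ranking (illustrative sketch).
    `items` is a list of (name, engagement_score, category) tuples;
    `lam` penalizes repeating a category already in the slate."""
    chosen = []
    pool = list(items)
    while pool and len(chosen) < k:
        def score(item):
            _, engagement, category = item
            repeats = sum(1 for _, _, c in chosen if c == category)
            return engagement - lam * repeats
        best = max(pool, key=score)
        chosen.append(best)
        pool.remove(best)
    return [name for name, _, _ in chosen]

feed = [
    ("clip_a", 0.9, "reaction"),
    ("clip_b", 0.8, "reaction"),
    ("clip_c", 0.7, "reaction"),
    ("essay_d", 0.6, "longform"),
    ("track_e", 0.5, "ambient"),
]
print(rerank(feed, lam=0.0))  # pure engagement ranking
print(rerank(feed, lam=0.3))  # diversity-weighted ranking
```

With `lam=0.0` the slate is three near-identical clips; with even a modest penalty, the long-form essay displaces the second clip. The point is that variance is a tunable parameter, not a technical impossibility.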
Reward creative risk. Deviation has to be socially and economically survivable. Not every output should be optimized for scale. Not every creator should have to game an algorithm to eat. The structures that fund and distribute creative work need to actively protect space for things that are weird, slow, and commercially uncertain. This means grants, alternative platforms, funding models that don’t require virality as a precondition for sustainability. It means venture capital that isn’t exclusively chasing the same subfield of technology while everything else starves.
Protect the long tail. Support creators who are niche, slow, and unfashionable. Attention is a form of capital. Allocate it intentionally. The algorithm will always surface what is already popular. The infrastructure should make it possible to find what isn’t.
And then there is what you do to protect yourself from succumbing to algorithmic compression:
Consume outside the feed. Do not let algorithms fully determine your intellectual diet. Seek out ideas that do not optimize for virality. Read books that were not recommended by an algorithm. Talk to people whose worldview you can’t predict from their bio. As one recent essay argued, contrasting modern academics with figures like Xenophon: breadth of lived experience matters, and historical distance protects against contemporary convergence. Half the most interesting things I’ve ever read, I found by accident. That’s not a bug. That’s the point.
These individual choices do not fix the system. But they keep your own thinking alive while the system is being rebuilt. And that matters, because the system is shaped by the people who use it. If enough people refuse to let their taste be flattened, the demand for variance becomes something the infrastructure eventually has to respond to.
The Authenticity Arbitrage
I want to end on something that might seem like it contradicts everything I’ve just written. It doesn’t, but it needs to be said carefully.
The compression is real. The data proves it. The algorithm flattens. The feed homogenizes. The average gets reinforced. Nothing I’m about to say changes any of that.
But compression has an unintended side effect.
When everything starts to look the same, sound the same, and feel the same, difference becomes conspicuous. Not because the system suddenly rewards originality. It doesn’t. The algorithm will keep burying it. But human beings are pattern-interrupt machines. We notice what breaks the pattern precisely because the pattern has become so uniform.
As more people online converge toward an algorithmically optimized persona, the same cadence, the same takes, the same aesthetic, the same flattened voice, the bar to stand out drops. Not because standing out got easier, but because the backdrop got more monotone. Any real color becomes impossible to miss against a wall of beige.
Every monoculture in history has eventually bred its counter-movement. Punk came out of corporate rock. Independent film came out of blockbuster fatigue. The interesting question is never whether resistance will emerge. It always does. The question is whether the infrastructure allows it to be found.
And that is where the real problem sits. The system will not self-correct. The algorithm will not start surfacing authentic voices out of aesthetic conscience. The venture capital will not redistribute itself toward weird, uncommercial ideas out of cultural responsibility. If authenticity is going to matter, it will be because people chose to seek it out, fund it, build platforms that don’t bury it, and refuse to let their own taste be determined by a recommendation engine.
We are entering a period where being genuinely yourself, having taste that isn’t algorithmically derived, holding opinions you arrived at through friction rather than feed, will become the rarest and most valuable form of cultural capital. The compression is creating scarcity, and scarcity creates value.
The long tail still goes dark. Most authentic creators still get buried. The infrastructure is still broken. But for the ones who break through, the signal has never been louder. Because the noise has never been more uniform.
The question is not whether authenticity will become valuable. It will. The question is whether enough people will choose it, and whether we’ll build systems that let it survive.
Cultural biodiversity is not aesthetic virtue signaling. It is structural resilience. Ecosystems without diversity collapse. Markets without competition stagnate. Cultures without variance lose their capacity for breakthrough.
Innovation does not emerge from statistical averages. It emerges from friction, from the edges, from the people and ideas that don’t fit neatly into a feed.
The internet compressed culture. AI is accelerating the compression. But the same pressure that flattens also reveals what refuses to be flattened.
The edges are still there. The question is whether we go looking for them.