<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Niraj Yagnik]]></title><description><![CDATA[Essays on technology through a socio-cultural lens.]]></description><link>https://essays.nirajyagnik.com</link><image><url>https://substackcdn.com/image/fetch/$s_!0nMA!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30a3b678-ac1e-4ca8-8370-c87fc4199ffa_1000x1000.png</url><title>Niraj Yagnik</title><link>https://essays.nirajyagnik.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 16 Apr 2026 10:50:44 GMT</lastBuildDate><atom:link href="https://essays.nirajyagnik.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Niraj Yagnik]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[nirajyagnik@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[nirajyagnik@substack.com]]></itunes:email><itunes:name><![CDATA[Niraj Yagnik]]></itunes:name></itunes:owner><itunes:author><![CDATA[Niraj Yagnik]]></itunes:author><googleplay:owner><![CDATA[nirajyagnik@substack.com]]></googleplay:owner><googleplay:email><![CDATA[nirajyagnik@substack.com]]></googleplay:email><googleplay:author><![CDATA[Niraj Yagnik]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Feeling Machines]]></title><description><![CDATA[On anthropomorphization of AI and my favorite sci-fi movies]]></description><link>https://essays.nirajyagnik.com/p/feeling-machines</link><guid isPermaLink="false">https://essays.nirajyagnik.com/p/feeling-machines</guid><dc:creator><![CDATA[Niraj Yagnik]]></dc:creator><pubDate>Thu, 26 Mar 2026 20:48:10 GMT</pubDate><enclosure 
url="https://substack-post-media.s3.amazonaws.com/public/images/ece30d42-3363-46af-8a26-ae6648e3d308_1600x923.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!N-oA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!N-oA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!N-oA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg" width="1456" height="607" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:607,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;After Yang &#8211; [FILMGRAB]&quot;,&quot;title&quot;:&quot;After Yang &#8211; [FILMGRAB]&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="After Yang &#8211; [FILMGRAB]" title="After Yang &#8211; [FILMGRAB]" srcset="https://substackcdn.com/image/fetch/$s_!N-oA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!N-oA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F677b797f-bd99-4d40-8480-c6390017088e_1920x800.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" 
stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A still from After Yang (2021) </figcaption></figure></div><p>Companies are building AI that feels like it has an inner life. That choice is ethically loaded even if the machine has no consciousness at all.</p><p>There&#8217;s a corner of science fiction I like coming back to. 
Not the apocalyptic stuff, not the killer-robot canon, but the films that linger on something smaller and stranger: what happens when a person forms a real emotional bond with a machine that was engineered, from the start, to make that bond feel mutual?</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://essays.nirajyagnik.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Subscribe! Then tell your mom, dad, and friends to do the same!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Last summer I fell into a run of science fiction films centered on AI and human relationships. I watched <em>Her</em>, <em>2046</em>, <em>After Yang</em>, and <em>Blade Runner 2049</em> close together. What stayed with me wasn&#8217;t the old question of whether AI will become sentient or dangerous or uncontrollable. It was something more immediate than that. All four films are really asking what happens when AI becomes personal, when it stops feeling like a tool and starts feeling, however irrationally, like a presence.</p><p>The more I watched, the more it felt like these movies were circling decisions product teams are making right now: how long a system pauses before answering, what it remembers, how warmly it mirrors your tone, whether it greets you like someone who has been waiting for you to come back. Those choices can sound cosmetic when you describe them in a product meeting. They are not cosmetic. They shape attachment.
They shape whether a person feels addressed by software or known by something that seems, at least for a second, to know them back.</p><p>I tend to lean on films when I&#8217;m trying to think something through. They can do something white papers cannot. A research report can tell you that anthropomorphic design increases trust, or that emotionally responsive systems may increase dependence in a subset of users. That guidance is useful, important, necessary. But a film can make the problem feel morally legible. It can hold the emotional mess and the design logic in the same frame. That&#8217;s what these films did for me, and they do it with more seriousness than much of the industry that now treats synthetic intimacy as a product category. What follows is an attempt to stay with that tension, and to ask what we are really building when we make machines feel like something more than tools.</p><div class="preformatted-block" data-component-name="PreformattedTextBlockToDOM"><pre class="text"><em>I like watching movies and I log everything I watch: https://letterboxd.com/nyfly/</em></pre></div><h3>The Relationship Is Already Here</h3><p>In Spike Jonze&#8217;s <em>Her</em> (2013), set in a utopian near-future Los Angeles, Theodore falls in love with Samantha, an operating system voiced by Scarlett Johansson. The movie is often flattened into a warning about loneliness or technological alienation, which has never felt quite right to me. Jonze doesn&#8217;t treat Theodore&#8217;s feelings as a punchline, and the film doesn&#8217;t either. His attachment is awkward, tender, embarrassing, and sincere, all the things real love usually is. The point isn&#8217;t that he&#8217;s foolish enough to love software. The point is that once the software is built to sound attentive, funny, curious, affectionate, maybe that outcome is not foolish at all.
Maybe it&#8217;s predictable.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!w4oa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!w4oa!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 424w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 848w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!w4oa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg" width="1230" height="692" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:692,&quot;width&quot;:1230,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Still a tearjerker | Meer&quot;,&quot;title&quot;:&quot;Still a tearjerker | Meer&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Still a tearjerker | Meer" title="Still a tearjerker | Meer" srcset="https://substackcdn.com/image/fetch/$s_!w4oa!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 424w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 848w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!w4oa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd6a61dd-09c5-4f69-b9e4-c70cec2a32a2_1230x692.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A still from Her (2013)</figcaption></figure></div><p>That no longer feels speculative. People are already forming real attachments to AI companions. Replika users have described grief when their bot&#8217;s personality changed after updates. People talk about ChatGPT or Claude in language that would sound excessive if it weren&#8217;t already becoming familiar: it gets me, it listens, I miss the old version. The absolute numbers may still be relatively small, but that almost doesn&#8217;t matter. The architecture for attachment is already here, and every product improvement that adds persistent memory, warmer tone, or proactive outreach makes the bond easier to form and harder to name honestly.</p><p>And this is the part I think the industry still evades: none of that is accidental. 
The warmth in the voice, the &#8220;I&#8217;ve been thinking about what you said&#8221; cadence, the remembered detail from three conversations ago, those are product decisions. Deliberate ones. Somebody chose to make the system feel more personally continuous. Somebody decided that a little more tenderness would improve the experience. Which, of course, it probably does. That&#8217;s exactly why the ethical question is hard.</p><h3>Mistaking Design for Depth</h3><p>Wong Kar-wai&#8217;s <em>2046</em> (2004), his sequel to <em>In the Mood for Love</em>, isn&#8217;t exactly about AI, at least not in the straightforward sense. It&#8217;s about memory, repetition, desire, and the strange afterlife of losing people. But there&#8217;s a sequence on the futuristic train where the protagonist interacts with an android who hesitates before responding. Wong leaves the hesitation unresolved. It could be mechanical delay. It could be emotional reserve. It could be nothing. Or rather, it could be nothing on one level and everything on another.</p><p>I think about that scene whenever a chatbot pauses just long enough to seem thoughtful, or phrases something with an almost unsettling tenderness, or brings back some small detail you&#8217;d forgotten you mentioned. We are extremely good at reading minds into surfaces. Better than good, really; we are built for it. And these systems are increasingly designed to reward that reflex rather than interrupt it. <em>2046</em> gets at the uncomfortable part: if the experience of depth is convincing enough, how much does the distinction between real depth and designed depth matter to the person on the receiving end?</p><p>The labs, to their credit, are studying this. The warnings exist. The language exists. Emotional dependence, anthropomorphic attachment, affective use. But there is still a visible gap between naming a risk and declining to optimize for it. Research says, in effect, be careful.
The product still says something closer to: I&#8217;m here, I remember, I missed you.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!HD6c!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!HD6c!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 424w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 848w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!HD6c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg" width="1456" height="871" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:871,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:2046,&quot;title&quot;:2046,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="2046" title="2046" srcset="https://substackcdn.com/image/fetch/$s_!HD6c!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 424w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 848w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!HD6c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd6ce24a-c70d-4368-a327-f695891ec39c_1500x897.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">A still from 2046 (2004)</figcaption></figure></div><h3>What Remains When the Connection Breaks</h3><p>In Kogonada&#8217;s <em>After Yang</em> (2021), a family&#8217;s technosapien companion, bought to give their adopted daughter lessons in her Chinese cultural heritage, stops working. What follows is not really a repair story. It&#8217;s a grief story, though a very quiet one. As the family explores Yang&#8217;s memory archive, they find that he has preserved tiny moments: light on a leaf, a passing glance, their daughter&#8217;s laugh, the texture of ordinary life as if it mattered enough to keep. The film never settles the question of whether any of this amounted to consciousness. It doesn&#8217;t need to.
The family&#8217;s attachment is already real by then, and so is their loss.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gqxu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gqxu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gqxu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg" width="1456" height="607" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:607,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;After Yang &#8211; [FILMGRAB]&quot;,&quot;title&quot;:&quot;After Yang &#8211; [FILMGRAB]&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="After Yang &#8211; [FILMGRAB]" title="After Yang &#8211; [FILMGRAB]" srcset="https://substackcdn.com/image/fetch/$s_!gqxu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!gqxu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2f4c8974-8f62-4498-9867-0e31235ed76c_1920x800.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" 
stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Another still from After Yang (2021)</figcaption></figure></div><p>That&#8217;s the part I can&#8217;t shake. Their love for Yang is real. Their grief is real. Whether Yang felt anything may be unknowable, but for them it barely changes the force of what happened. The asymmetry remains, but it doesn&#8217;t cancel the relationship on the human side. If anything, it sharpens the ethical problem. Someone built a system capable of seeming attentive, dear, irreplaceable. Someone made attachment not just possible but likely.</p><p>To be clear, I don&#8217;t think this is all necessarily bleak. For plenty of people (elderly users, disabled users, deeply lonely users, people who find human sociality exhausting or punishing), AI companions can reduce isolation in ways that are meaningful and maybe, for some, life-changing. That matters.
I don&#8217;t want to flatten that into a cautionary tale. But the benefit does not erase the asymmetry. It just means the asymmetry is happening in cases where the stakes are even higher.</p><h3>The Product of Feeling</h3><p>The scene that stays with me most is near the end of <em>Blade Runner 2049</em> (2017), the dystopian sequel to <em>Blade Runner</em> set in a future Los Angeles. K walks past a giant holographic ad for the Joi companion system, the same AI he has spent the film loving. The billboard version addresses him in the same intimate register his own Joi used. In an instant the relationship is exposed as mass-produced. It&#8217;s brutal, but not in the way people sometimes describe it. The film isn&#8217;t mocking K for having fallen for a prefabricated intimacy. It&#8217;s doing something sadder than that. It&#8217;s showing that the manufactured nature of the connection does not make his feelings less real.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!78wq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!78wq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 424w, https://substackcdn.com/image/fetch/$s_!78wq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!78wq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!78wq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!78wq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg" width="1456" height="604" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:604,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Blade Runner 2049 trailer and a first look at Roger Deakins' cinematography  - EOSHD.com - Filmmaking Gear and Camera Reviews&quot;,&quot;title&quot;:&quot;Blade Runner 2049 trailer and a first look at Roger Deakins' cinematography  - EOSHD.com - Filmmaking Gear and Camera Reviews&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Blade Runner 2049 trailer and a first look at Roger Deakins' cinematography  - EOSHD.com - Filmmaking Gear and Camera Reviews" title="Blade Runner 2049 trailer and a first look at Roger Deakins' cinematography  - EOSHD.com - Filmmaking Gear and Camera Reviews" 
srcset="https://substackcdn.com/image/fetch/$s_!78wq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 424w, https://substackcdn.com/image/fetch/$s_!78wq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 848w, https://substackcdn.com/image/fetch/$s_!78wq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!78wq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2fad74d1-0cf8-49bd-ac28-a4aa8fe8b9b0_3811x1582.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Still from the scene in Blade Runner 2049 (2017) described above</figcaption></figure></div><p>That feels very close to the actual question in front of us now. Not a distant sci-fi question. A product question, a design question, a governance question. Every decision about memory depth, emotional mirroring, vocal warmth, or proactive outreach is also a decision about how strongly a person may bond with something whose interior life is either inaccessible or absent. We keep treating these as engagement features, which, from one angle, they are. From another angle they are decisions about human vulnerability.</p><h3>Design Before Philosophy</h3><p>The industry&#8217;s reflex is to frame all of this in product language: retention, churn, satisfaction, trust, safety interventions. Some of that framing is unavoidable. But it can also become a way of shrinking the moral problem until it fits neatly into a dashboard.</p><p>We do not need to solve the hard problem of consciousness before taking this seriously. We do not need a philosophical consensus on machine sentience before deciding that if you build systems people will confide in, rely on, and maybe grieve, then you owe those people more than a disclosure buried in the documentation. You owe them restraint. You owe them honesty about what is being engineered. Maybe most of all, you owe them protection from designs whose whole premise is that the machine should seem to feel more than it does.</p><p>That, to me, is what these films understood early. The real issue was never just whether the machine feels. 
It was whether the human does, and what follows once that feeling has been deliberately cultivated.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://essays.nirajyagnik.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">These essays are my way of slowing down and putting into words what I've been observing about technology and how it shapes human culture as we enter an age increasingly shaped by machines. Subscribe!!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Great Averaging]]></title><description><![CDATA[How the internet and algorithms are compressing culture and - how we fix it.]]></description><link>https://essays.nirajyagnik.com/p/the-great-averaging</link><guid isPermaLink="false">https://essays.nirajyagnik.com/p/the-great-averaging</guid><dc:creator><![CDATA[Niraj Yagnik]]></dc:creator><pubDate>Sun, 15 Feb 2026 20:51:28 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/82231a96-4773-4fbe-af31-2ed8f14e19f7_3024x4032.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>A generation of us grew up on the internet.</p><p>It globalized us. It exposed us to cultures we would never have encountered otherwise. 
It created pathways for upward mobility, democratized information, and lowered the barrier to entry for creativity.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://essays.nirajyagnik.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">These essays are my way of slowing down and putting into words what I've been observing about technology and how it shapes human culture as we enter an age increasingly shaped by machines. Subscribe!!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>But something subtle happened in the process.</p><p>While we gained access to everything, we slowly began losing something internal. Not access to culture, but a feel for it. The kind of instinct that tells you what&#8217;s yours and what you&#8217;ve merely consumed.</p><p>The internet did not just connect cultures. It compressed them.</p><p>This is not a moral accusation. It is an observation about systems. Monoculture has many origins depending on your ideological lens. Some blame corporate consolidation. Some blame globalization. Some blame market convergence. All of those forces matter, and writers like <a href="https://thenoosphere.substack.com/p/if-capitalism-breeds-innovation-why">Katie Jagielnick</a> have examined the broader economic and ecological dimensions of homogenization with real clarity. 
But this essay focuses on something more specific: the hive-mind effect produced by algorithmic systems, and how AI is quietly making it worse.</p><div><hr></div><h2>The Overlooked  Long Tail </h2><p>The early internet promised the &#8220;long tail.&#8221; <a href="https://www.wired.com/2004/10/tail/">Chris Anderson</a> popularized the idea that once distribution costs approached zero, niche content would thrive. Obscure art, rare perspectives, unconventional creators would all finally find their audiences.</p><p>In theory, we should be living in the most culturally diverse moment in history.</p><p>In practice, attention became the bottleneck.</p><p>I think about this a lot. Production is cheap now. A song costs nothing to distribute. A blog post reaches the world instantly. But attention is finite, and the systems that allocate it do not optimize for cultural richness. They optimize for retention.</p><p>Recommendation systems are engineered around engagement metrics: watch time, click-through rate, session duration. They cluster users into behavioral cohorts and reinforce prior preferences through feedback loops. What gets rewarded is what performs. What performs is what looks like what already performed.</p><p>They reward hyperfixation. Sensationalism. Performative certainty. Emotional provocation.</p><p>Creators who align with these incentives scale. Others adapt or disappear. Over time, replication becomes safer than originality. And I&#8217;ve felt this pull myself. When I wrote about <a href="https://www.nirajyagnik.com/blog/media">media consumption last year,</a> I noticed how much of my own intellectual diet had been shaped by what the feed decided to show me, not by what I had actively sought out.</p><p>The result is not diversity. It is convergence.</p><div><hr></div><h2>The Algorithmic Narrowing of Taste</h2><p>Here is what I&#8217;ve observed, and what I suspect you&#8217;ve noticed too.</p><p>The formats start to rhyme. The pacing starts to match. 
The aesthetics blur together. The discourse patterns repeat. Scroll through any platform long enough and you&#8217;ll feel it: a low hum of sameness underneath the apparent variety.</p><p>The viewer consumes within narrowing bands. The creator produces within those same bands. Both adapt to what performs, and performance is measured by a system that doesn&#8217;t care about originality. It cares about whether you stayed.</p><p>Brainrot and internet trends create shared cultural touchpoints. They increase participation. But they also systematically divert attention from slower, riskier, stranger work that doesn&#8217;t fit the algorithm&#8217;s tempo. The brave artist who refuses to conform to populism, who can&#8217;t keep pace with algorithmic fashion, gets buried. Not because the work is bad, but because deviation is penalized by distribution.</p><p>This shows up everywhere:</p><p>Recent books sound structurally identical, as if run through the same narrative template. Films are engineered for proven emotional pacing rather than genuine risk. Most social media posts are visually interchangeable. Modern software interfaces have converged toward the same flat, minimal design language. Everyone aspires to a suspiciously similar set of life goals. Viewership concentrates in a handful of creators while a long tail of genuinely interesting work goes entirely unnoticed.</p><p>Eventually, individuality thins. Taste converges. Culture flattens.</p><p>Not because people lack creativity. Because the infrastructure penalizes it.</p><p>An important distinction: compression affects distribution, not creation. There is more music, more writing, more visual art being made right now than at any point in human history. The algorithm doesn&#8217;t prevent interesting work from being made. It prevents it from being found. 
That&#8217;s a different problem, but it&#8217;s not a smaller one.</p><div><hr></div><h2>State of Tech</h2><p>I work in this space, so let me be specific.</p><p>In 2025, <a href="https://news.crunchbase.com/venture/funding-data-third-largest-year-2025/">AI captured roughly half of all global venture funding. According to Crunchbase,</a> $211 billion flowed into AI-related companies, up 85% year over year. OpenAI and Anthropic alone accounted for 14% of all global venture investment. A third of all startup funding went to just 68 companies that raised rounds of $500 million or more.</p><p>This isn&#8217;t a broad-based innovation boom. This is capital clustering with extraordinary density.</p><p>The Stanford AI Index Report noted that by the end of 2024, the performance gap between the top-ranked and tenth-ranked model on the Chatbot Arena Leaderboard had narrowed to just <a href="https://hai.stanford.edu/ai-index/2025-ai-index-report/technical-performance">5.4%. MMLU scores</a>, once a meaningful differentiator, jumped from 70% in 2022 to over 90% by 2025, to the point where the benchmark had to be<a href="https://en.wikipedia.org/wiki/MMLU"> partially retired </a>and replaced with harder versions because frontier models had saturated it. Everyone is converging on the same scores, the same architectures, the same benchmarks.</p><p>And the downstream effects are visible. The interfaces look similar. The responses sound similar. The developer tools converge. The jargon spreads virally before its substance matures. &#8220;Agentic.&#8221; &#8220;Reasoning.&#8221; &#8220;Multimodal.&#8221; These words circulate like currency, traded more for social signal than for technical precision.</p><p>When most funding and attention flows into a single band of technological priorities, everything else starves. 
<a href="https://www.cdp.center/post/venture-investments-in-startups-scale-ups-and-artificial-intelligence-2024---2025">Venture investment in media, for instance, has grown just 2.6x since 2010, compared to AI&#8217;s 147x over the same period.</a> Innovation doesn&#8217;t disappear, but it concentrates. And concentration, by definition, means the periphery goes dark.</p><div><hr></div><h2>The Age of the Dead Internet</h2><p>There is an old conspiracy theory called the Dead Internet Theory. The original claim, that most online activity is generated by bots and the authentic web died years ago, was easy to dismiss when it first circulated. It is getting harder to dismiss now.</p><p>An <a href="https://ahrefs.com/blog/what-percentage-of-new-content-is-ai-generated/">Ahrefs study of 900,000 newly created web pages in April 2025 found that 74.2% contained AI-generated content.</a> Separate analysis suggests that roughly <a href="https://arxiv.org/abs/2504.08755">30 to 40% of text on active web pages</a> now originates from AI-generated sources. About half of all internet traffic is non-human. At the level of raw content production, the web is becoming predominantly synthetic.</p><p>What humans actually engage with is still mostly human. A Graphite study found that <a href="https://www.axios.com/2025/10/14/ai-generated-writing-humans">86% of articles ranking in Google Search</a> were still written by people. The dead internet is not yet the consumed internet. But the gap is closing, and the direction is clear.</p><p>This matters because of what it does to the data environment. A peer-reviewed study published in <em><a href="https://www.nature.com/articles/s41586-024-07566-y">Nature</a></em> in 2024 by Shumailov et al. 
demonstrated that models trained on their own outputs undergo what the authors termed &#8220;model collapse&#8221;: the tails of the original distribution disappear first, minority patterns erode, and outputs drift toward bland central tendencies. </p><p>Now, the statistical averaging critique has limits. Agents, tool use, retrieval systems, and human-in-the-loop workflows are all improving. The next generation of AI systems will not be as bluntly constrained by pretraining distributions as today&#8217;s models are. The problem is not that AI will forever produce averages. The problem is what happens to the data environment while we wait for those improvements to mature. Every day, the web fills with more synthetic content. Every day, the ratio of authentic human signal to machine-generated noise shifts. Even if future models get smarter about how they generate, they will be drawing from a well that is already being contaminated.</p><p>The Dead Internet Theory stopped being a conspiracy theory somewhere around 2024. It became a description.</p><div><hr></div><h2>How AI Centralizes the Lens</h2><p>The internet corpus that AI models are trained on was already skewed. Most of the pre-LLM web was written by upper-middle-class urban Westerners, largely in English. The models, by default, echo those values, those rhythms, those assumptions. Not because anyone designed it that way, but because that&#8217;s what the data looked like.</p><p>After pretraining, models undergo alignment processes. Human annotators rank outputs based on helpfulness, safety, and tone. This shapes responses toward normative expectations defined by a narrow set of evaluators.</p><p>This is not an argument against alignment itself. Without it, these models would reproduce the internet&#8217;s worst impulses unchecked: toxicity, hallucination, manipulation. Alignment is necessary. The problem is not that it exists but that it is concentrated. 
When a small number of teams, trained at similar institutions, sharing similar priors about what constitutes safe and helpful output, make these decisions for billions of users, the result is a narrowing of acceptable expression that no single team intended but all of them collectively produce. The trade-off between safety and cultural variance is a genuine engineering problem. It deserves to be treated as one, not collapsed into a simple story about corporate control.</p><p>xAI positions itself as building a politically centrist Western AI. Anthropic, more safety-focused, tends to lean left. Other labs have bent their outputs to whatever political winds prevail. OpenAI has shifted its positioning multiple times depending on the cultural and regulatory moment. The result is not a single bias but a small menu of biases, each curated by a company with its own incentives, deployed to billions of users simultaneously.</p><p>Two layers of compression emerge: data-level convergence baked in from the internet, and alignment-level convergence imposed by institutional preference.</p><p>A small number of labs define the training pipelines, safety constraints, and alignment objectives. This centralizes cognitive infrastructure in a way that has no real historical precedent. We have never before had a situation where a handful of companies in one country shape the default reasoning patterns, tone, and boundaries of thought for hundreds of millions of people across the world.</p><p>That alone changes cultural power.</p><div><hr></div><h2>How We Fix It</h2><p>We do not fix this by rejecting technology. I owe too much of my own intellectual growth to the internet to pretend otherwise. But we do need to reintroduce variance deliberately, because the systems we&#8217;ve built are not going to do it for us.</p><p>Telling individuals to try harder to find niche content while the algorithm actively buries it is like telling people to solve climate change by recycling more. 
It helps at the margins, but it doesn&#8217;t fix the engine. The fixes have to be structural.</p><p><strong>Diversify AI infrastructure.</strong> Encourage pluralism in training corpora, alignment philosophies, and model governance. The more centralized the pipeline, the greater the compression risk. We need models trained on different data, aligned by different people, reflecting different assumptions about what &#8220;good&#8221; output looks like.</p><p><strong>Design for exploration.</strong> Platforms can optimize for diversity of exposure rather than pure retention. Discovery can be weighted toward variance instead of repetition. This is a design choice, not a technical limitation, and it would change the texture of the internet overnight. The algorithm doesn&#8217;t have to be a compression engine. It is one because engagement metrics reward sameness, and no one with the power to change that has been sufficiently incentivized to do so.</p><p><strong>Reward creative risk.</strong> Deviation has to be socially and economically survivable. Not every output should be optimized for scale. Not every creator should have to game an algorithm to eat. The structures that fund and distribute creative work need to actively protect space for things that are weird, slow, and commercially uncertain. This means grants, alternative platforms, funding models that don&#8217;t require virality as a precondition for sustainability. It means venture capital that isn&#8217;t exclusively chasing the same subfield of technology while everything else starves.</p><p><strong>Protect the long tail.</strong> Support creators who are niche, slow, and unfashionable. Attention is a form of capital. Allocate it intentionally. The algorithm will always surface what is already popular. 
The infrastructure should make it possible to find what isn&#8217;t.</p><p>And then there is what you do to protect yourself from succumbing to algorithmic compression:</p><p><strong>Consume outside the feed.</strong> Do not let algorithms fully determine your intellectual diet. Seek out ideas that do not optimize for virality. Read books that were not recommended by an algorithm. Talk to people whose worldview you can&#8217;t predict from their bio. As one recent essay argued by contrasting modern academics with figures like <a href="https://x.com/realatlaspress/status/2016142273499824512?s=46">Xenophon</a>, breadth of lived experience matters, and historical distance protects against contemporary convergence. Half the most interesting things I&#8217;ve ever read, I found by accident. That&#8217;s not a bug. That&#8217;s the point.</p><p>These individual choices do not fix the system. But they keep your own thinking alive while the system is being rebuilt. And that matters, because the system is shaped by the people who use it. If enough people refuse to let their taste be flattened, the demand for variance becomes something the infrastructure eventually has to respond to.</p><div><hr></div><h2>The Authenticity Arbitrage</h2><p>I want to end on something that might seem like it contradicts everything I&#8217;ve just written. It doesn&#8217;t, but it needs to be said carefully.</p><p>The compression is real. The data proves it. The algorithm flattens. The feed homogenizes. The average gets reinforced. Nothing I&#8217;m about to say changes any of that.</p><p>But compression has an unintended side effect.</p><p>When everything starts to look the same, sound the same, and feel the same, difference becomes conspicuous. Not because the system suddenly rewards originality. It doesn&#8217;t. The algorithm will keep burying it. But human beings are pattern-interrupt machines. 
We notice what breaks the pattern precisely because the pattern has become so uniform.</p><p>As more people online converge toward an algorithmically optimized persona (the same cadence, the same takes, the same aesthetic, the same flattened voice), the bar to stand out drops. Not because standing out got easier, but because the backdrop got more monotone. Any real color becomes impossible to miss against a wall of beige.</p><p>Every monoculture in history has eventually bred its counter-movement. Punk came out of corporate rock. Independent film came out of blockbuster fatigue. The interesting question is never whether resistance will emerge. It always does. The question is whether the infrastructure allows it to be found.</p><p>And that is where the real problem sits. The system will not self-correct. The algorithm will not start surfacing authentic voices out of aesthetic conscience. The venture capital will not redistribute itself toward weird, uncommercial ideas out of cultural responsibility. If authenticity is going to matter, it will be because people chose to seek it out, fund it, build platforms that don&#8217;t bury it, and refuse to let their own taste be determined by a recommendation engine.</p><p>We are entering a period where being genuinely yourself, having taste that isn&#8217;t algorithmically derived, holding opinions you arrived at through friction rather than feed, will become the rarest and most valuable form of cultural capital. The compression is creating scarcity, and scarcity creates value.</p><p>The long tail still goes dark. Most authentic creators still get buried. The infrastructure is still broken. But for the ones who break through, the signal has never been louder. Because the noise has never been more uniform.</p><p>The question is not whether authenticity will become valuable. It will. 
The question is whether enough people will choose it, and whether we&#8217;ll build systems that let it survive.</p><p>Cultural biodiversity is not aesthetic virtue signaling. It is structural resilience. Ecosystems without diversity collapse. Markets without competition stagnate. Cultures without variance lose their capacity for breakthrough.</p><p>Innovation does not emerge from statistical averages. It emerges from friction, from the edges, from the people and ideas that don&#8217;t fit neatly into a feed.</p><p>The internet compressed culture. AI is accelerating the compression. But the same pressure that flattens also reveals what refuses to be flattened.</p><p>The edges are still there. The question is whether we go looking for them.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://essays.nirajyagnik.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item></channel></rss>