<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[tomorrow's mess]]></title><description><![CDATA[Field notes on emerging technology's impact on policy and society.]]></description><link>https://www.tomorrowsmess.com</link><image><url>https://substackcdn.com/image/fetch/$s_!lycR!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93f30033-c908-469b-83c6-601bd35dfcfb_688x688.png</url><title>tomorrow&apos;s mess</title><link>https://www.tomorrowsmess.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 14 May 2026 01:18:40 GMT</lastBuildDate><atom:link href="https://www.tomorrowsmess.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Casey Mock]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[tomorrowsmess@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[tomorrowsmess@substack.com]]></itunes:email><itunes:name><![CDATA[Casey Mock]]></itunes:name></itunes:owner><itunes:author><![CDATA[Casey Mock]]></itunes:author><googleplay:owner><![CDATA[tomorrowsmess@substack.com]]></googleplay:owner><googleplay:email><![CDATA[tomorrowsmess@substack.com]]></googleplay:email><googleplay:author><![CDATA[Casey Mock]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Rationalist Cafeteria]]></title><description><![CDATA[What AI doomers, Scientologists, and Mean Girls have in common]]></description><link>https://www.tomorrowsmess.com/p/the-rationalist-cafeteria</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/the-rationalist-cafeteria</guid><dc:creator><![CDATA[Casey 
Mock]]></dc:creator><pubDate>Tue, 10 Mar 2026 20:33:17 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!XqUM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the 2004 film <em>Mean Girls</em>, Lindsay Lohan&#8217;s character arrives at her new high school and is immediately initiated into the social cartography of the lunchroom &#8212; who sits where, who defers to whom, and most importantly, what happens to those who challenge the reigning hierarchy. The &#8220;Plastics,&#8221; led by Rachel McAdams&#8217;s Regina George, operate a social system with two enforcement mechanisms working in concert: the threat of exclusion from the in-group, and the constant, pointed derision of those already outside it. These mechanisms are not separable; the sneering at outsiders is what makes membership feel valuable, while the fear of being banished to join the losers is what keeps members in line.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!XqUM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!XqUM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XqUM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!XqUM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XqUM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!XqUM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg" width="1456" height="975" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:975,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!XqUM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!XqUM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!XqUM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!XqUM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78a39278-7959-4ad0-9c71-80a52b7a575e_1493x1000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><em>Mean Girls (2004), Paramount Pictures</em></figcaption></figure></div><p>I have been thinking about 
<em>Mean Girls </em>a lot lately in the context of discourse about AI. <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Blood in the Machine&quot;,&quot;id&quot;:1744395,&quot;type&quot;:&quot;pub&quot;,&quot;url&quot;:&quot;https://open.substack.com/pub/bloodinthemachine&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e21f9bf3-26aa-47e8-b3df-cfb2404bdf37_256x256.png&quot;,&quot;uuid&quot;:&quot;6b38bc6a-a088-45e0-800c-45675b9e530f&quot;}" data-component-name="MentionToDOM"></span>'s <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Brian Merchant&quot;,&quot;id&quot;:934423,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cf40536c-5ef0-4d0a-b3a3-93c359d0742a_200x200.jpeg&quot;,&quot;uuid&quot;:&quot;a36eb69a-71fc-4b10-91ea-0c2c638bed40&quot;}" data-component-name="MentionToDOM"></span> did <a href="https://www.bloodinthemachine.com/p/five-takeaways-from-an-unhinged-ai">admirable work cataloguing</a> the state of affairs: the viral "Something Big Is Happening" post that racked up tens of millions of views; Anthropic's coordinated press blitz timed to a $30 billion investment round; and the straw-man allegation, repeated with remarkable persistence, that critics think AI is "fake." Merchant's takeaways are worth reading in full, but the one that&#8217;s worth bookmarking is his fifth: uncertainty still reigns. Nobody actually knows how this plays out.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Perhaps it&#8217;s because I studied the humanities as an undergraduate, but where I come from that sort of epistemic humility is usually a strength, not a weakness. You wouldn&#8217;t know that from the discourse, as not just Merchant but also <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Gary Marcus&quot;,&quot;id&quot;:14807526,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Ka51!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F8fb2e48c-be2a-4db7-b68c-90300f00fd1e_1668x1456.jpeg&quot;,&quot;uuid&quot;:&quot;6f850439-5102-4154-bccd-6c1f31875855&quot;}" data-component-name="MentionToDOM"></span>, <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Freddie deBoer&quot;,&quot;id&quot;:12666725,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qfu3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ef5ce9d-e16e-4119-8615-0aab3758277c_1402x983.jpeg&quot;,&quot;uuid&quot;:&quot;30f9d368-4e34-4eca-8122-2ed66b5b3337&quot;}" data-component-name="MentionToDOM"></span>, and others have found. 
Journalists, policy experts, academics, and critics who raise questions about AI's capabilities relative to the hype, its political economy, or whether the underlying economics of these companies and their investments pencil out are dismissed not on the merits but as the ravings of members of a benighted class who simply <em>don't get it</em>. To put it in Silicon Valley argot, critics like me have &#8220;bad sense-making.&#8221;</p><p>But the allegation that critics think AI is "fake" functions as a kind of <em>Mean Girls</em> social technology. It doesn't engage with whether the actual critique is right or wrong; rather, it classifies the critic as the kind of person who is so wrong that they don&#8217;t need to be taken seriously. This is a Regina George move, signaling to friends that this person is not one of us. </p><p>More importantly, reclassifying the critic as uninformed is a way of avoiding the evidentiary burden that those making big claims about AI should shoulder. </p><p>A belief that the emergence of a machine superintelligence is nigh and will fundamentally change or destroy human civilization is a classic Russell&#8217;s Teapot scenario. Bertrand Russell's illustration goes like this: if I claim there is a teapot orbiting the sun, too small to be detected by any telescope, you cannot prove me wrong &#8212; which means that the burden of proof is on me, not you, to prove that the teapot is there. In other words, if a claim is empirically unfalsifiable, the burden of proof rests squarely on the one making it. Those claiming we are on a runaway train toward developing a superintelligence that will either exterminate humanity or solve all our problems are making just such an unfalsifiable claim &#8212; no data or evidence exists, or could exist, to rebut it. </p><p>For me, folks who are &#8220;feeling the AGI&#8221; have not met the burden of proof. 
Forgive me if I don&#8217;t think that makes me dumb.</p><p>There is a much bigger market in the attention economy for confidence than for humility, and the AI boomer and doomer brands &#8212; <a href="https://www.theatlantic.com/books/archive/2025/09/what-ais-doomers-and-utopians-have-in-common/684270/">superficially opposed, structurally identical</a> &#8212; have proven remarkably good at capturing it. Dario Amodei gets a <em>New Yorker</em> profile and Anthropic&#8217;s effective altruist philosopher gets a WSJ expos&#233;. Joe Rogan and Ezra Klein and Bill Maher ask earnest AI engineers questions about the end of the world. The catastrophe narrative and the utopia narrative are both simple enough to fit in a tweet, both feel important, and both are unfalsifiable enough that no evidence can dislodge them.</p><p>This is low-hanging fruit. It is much easier to get booked on Jon Stewart to augur the end of the world than it is to make a carefully hedged argument about the limitations of LLMs and the financial incentives of venture-backed AI companies. The former makes for engaging television, while the latter requires the audience to hold several things in tension simultaneously (which television &#8212; and most viral content &#8212; is not built to reward). To bring it back to Regina George and high school: it takes less skill to hit the puniest freshman in gym class in the head with a dodgeball than it does to hit a varsity athlete.</p><p>The media attention must also be genuinely seductive and corrupting. If your YouTube video and podcast episode are being shared by people with hundreds of thousands of followers, if you get invited to Davos, the lesson most of us would draw from that social feedback is <em>I must be doing something right.</em> Accumulated status, in this discourse, thus does part of the epistemological work that evidence is supposed to do. 
When a well-networked figure who hypes AI or prophesies doom dismisses a critic, that figure&#8217;s social capital closes the argument in a way that evidence cannot.</p><p>Social capital plays another role, in that it provides a rallying point for the less well-networked true believers &#8212; the ones who may not be performing certainty for a live studio audience but have parasocial or even social relationships with those who do. For many of these folks, the unfalsifiable claim does not register as unfalsifiable at all; there&#8217;s a religiosity to it, and sharing or repeating the claims of one&#8217;s favorite doomer or utopian is like spreading a gospel. Here, the analogy is not to <em>Mean Girls</em> but to <em><a href="https://en.wikipedia.org/wiki/Going_Clear_(film)">Going Clear</a>.</em></p><p>Just as L. Ron Hubbard constructed an elaborate, internally consistent pseudoscience about human psychology that attracted intelligent, technically-minded true believers with a lot of social capital, the rationalist and effective altruist movements have constructed an elaborate, internally consistent pseudoscience about machine superintelligence that has attracted intelligent, technically-minded true believers with a lot of social capital. And as in Scientology, that intelligence and social capital serve to defend the faith rather than to explore hard questions about empirical evidence.</p><p>Scientology persists because its internal logic is, within its own premises, coherent; because membership confers real social rewards; and because the cost of questioning it, or of abandoning it once adopted, is very high. For adherents to rationalism and effective altruism who are concerned with existential risk from AI &#8212; and I know not all adherents to EA orient around AI risk, only a subset &#8212; the movement has an analogous internal coherence. 
Once you internalize and believe the premise that a machine god threatens humanity&#8217;s survival, you see no need to prove the teapot is there to anyone; there just must be something wrong with those who don&#8217;t see it like you do. <em>Poor savages &#8212; if only they could see it.</em></p><p>What makes this particularly hard to name is that these folks are not, by disposition, what we would normally call bullies. They are not Nelson Muntz, laughing while they give you an atomic wedgie. They, by and large, present as gentle, earnest, thoughtful people who care about the future of humanity, may well be vegan, and know about bed nets and malaria. But when it comes to AI discourse, effective altruists and rationalists are part of an in-group that self-identifies in opposition to an out-group, just like Regina George &#8212; it&#8217;s just that their social cruelty is laundered through the vocabulary of concern.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!U4ls!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!U4ls!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 424w, https://substackcdn.com/image/fetch/$s_!U4ls!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!U4ls!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!U4ls!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!U4ls!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg" width="886" height="509" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:509,&quot;width&quot;:886,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:130247,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!U4ls!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 424w, https://substackcdn.com/image/fetch/$s_!U4ls!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!U4ls!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!U4ls!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93cd7cfd-75f3-4b0c-a9bb-de45e8f87ca8_886x509.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Nelson Muntz versus someone who presumably has &#8220;bad sense-making.&#8221; <em>The 
Simpsons.</em></figcaption></figure></div><p>The reality is that the AI engineering, AI safety, effective altruist, and rationalist communities in the Bay Area overlap and are remarkably tight-knit. They share conference circuits, Signal chats, Burning Man experiences, and often enough, polycules and living arrangements. When your professional and personal world is populated entirely by people who share a set of beliefs about existential risk, those beliefs become genuinely difficult to separate from facts. This is how cults work: the beliefs persist because the social world has been constructed so that questioning them means potentially losing everything &#8212; colleagues, friends, status, sometimes even your job or housing. </p><p>None of this means that AI isn&#8217;t small-t &#8216;transformative&#8217; in many ways, won&#8217;t get better, and won&#8217;t raise serious legal, economic, or philosophical questions. It already has. The concentration of power in a handful of companies with no democratic accountability is already a generation-defining crisis, and even weak forms of the technology seem poised to intensify that concentration. I see more thoughtful philosophical commentary about what it means to be human today than I did as a humanities student twenty years ago. The possibility that systems we don&#8217;t fully understand could produce consequences we can&#8217;t fully anticipate deserves scrutiny. </p><p>But all of that is different from unfalsifiable prophecy, and it&#8217;s to prophecy that I object. And I know I&#8217;m not alone in finding &#8220;Just trust me, I&#8217;m smarter than you about computers&#8221; <em>not </em>a compelling argument to reorganize society around. </p><p>What managing AI requires is thus what the current discourse prevents: the capacity to hold uncertainty without either monetizing it as apocalypse or suppressing it as heresy. The teapot might be there; I&#8217;m willing to grant that. 
As it was for Russell, my point about the teapot isn&#8217;t that it doesn&#8217;t exist, but rather that the person who insists it does should be the one embarrassed by the conversation, not the person asking for proof.</p>]]></content:encoded></item><item><title><![CDATA[Pete Hegseth Got His Happy Meal]]></title><description><![CDATA[On the consequences of three years of doomer propaganda]]></description><link>https://www.tomorrowsmess.com/p/pete-hegseth-got-his-happy-meal</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/pete-hegseth-got-his-happy-meal</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 02 Mar 2026 18:33:39 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>As you are likely aware, last Friday, Secretary of War Pete Hegseth designated Anthropic a &#8220;supply chain risk.&#8221; The designation is language historically reserved for foreign adversaries and has never before been publicly 
applied to an American company. The blow-up between Anthropic and the Trump administration came <a href="https://www.nytimes.com/2026/02/27/technology/anthropic-trump-pentagon-silicon-valley.html?searchResultPosition=3">after months of contract negotiations over a deal worth up to $200 million</a>, negotiations that collapsed because Anthropic held to two exceptions: it would not allow its AI to be used for mass domestic surveillance of Americans, or in fully autonomous weapons. (Max Read has a great summary and analysis of what happened <a href="https://maxread.substack.com/p/what-anthropics-fight-with-the-pentagon">here</a>.)</p><p>Read <a href="https://www.anthropic.com/news/statement-comments-secretary-war">Anthropic&#8217;s statement</a>; it&#8217;s sympathetic. The two exceptions are narrow and reasonable. Autonomous weapons with current AI are genuinely unreliable &#8212; a point that requires no exotic claims about machine consciousness to establish, just an honest acknowledgement of how often these systems hallucinate. Domestic mass surveillance is a Fourth Amendment problem, not an AI problem. Even Sam Altman expressed solidarity &#8212; before striking his own deal with the Pentagon, but that&#8217;s a different essay.</p><p>Something like this was always going to happen. Not because of Hegseth specifically, not because of this administration, but because of the narrative the AI safety community &#8212; the world that produced Anthropic, and whose language Anthropic still speaks even while disavowing its label &#8212; has been pushing for at least the last three years. </p><p>Imagine a six-year-old whose entire media diet is a steady stream of McDonald&#8217;s commercials: a Happy Meal ad at every break, focused on whatever toy currently comes with the McNuggets. Now put that child in a car that drives past a McDonald&#8217;s. 
What happens?</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="3750" height="2500" 
data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2500,&quot;width&quot;:3750,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;two red and yellow mcdonalds boxes&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="two red and yellow mcdonalds boxes" title="two red and yellow mcdonalds boxes" srcset="https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1610736427087-9c93ff8c9476?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxoYXBweSUyMG1lYWx8ZW58MHx8fHwxNzcyNDczOTI4fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 
pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@meghankix">Meghan Hessler</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>The Rationalist and Effective Altruist communities &#8212; the intellectual cultures that gave us Anthropic, that influence many of its employees, and that still shape how Dario Amodei talks about his company and his technology &#8212; have spent the better part of a decade insisting, with increasing urgency, that artificial intelligence is the most consequential technology in human history. Maybe it&#8217;s civilization-ending; maybe it&#8217;s civilization-saving. Either way, it&#8217;s the hinge on which everything henceforth turns. 
</p><p>Once policymakers and the media largely accepted the premise, the argument for treating <a href="https://knightcolumbia.org/content/ai-as-normal-technology">AI like a normal technology subject to normal governance</a> was surrendered. Policies pushed by Effective Altruist groups, like 2024&#8217;s <a href="https://www.businessinsider.com/california-assembly-passes-ai-safety-bill-artificial-intelligence-big-tech-2024-8">SB1047 in California</a>, deprioritize harms happening today in favor of theoretical existential ones in the future, despite the fact that today&#8217;s harms can be existential for the folks experiencing them. These groups incessantly made the case that whoever controls this technology controls the future, and so the hypothetical future needs to be prioritized now. In a Washington now run by people who tend toward impulsiveness and contempt for institutional constraint &#8212; well, it&#8217;s easy to see where this was headed. Hegseth saw the ads for the toy, and so now he wanted his Happy Meal.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><p><a href="https://www.theatlantic.com/books/archive/2025/09/what-ais-doomers-and-utopians-have-in-common/684270/">Doomers and utopians are not actually opposites</a>. They share the same founding fantasy. Yudkowsky and other catastrophizers worry that a superintelligent AI will exterminate humanity in its quest for resources; Altman and the accelerationists vaguely claim that same superintelligence will cure cancer and solve climate change. 
These fears and dreams share the same lineage and underlying worldview: what unites them is the unfounded certainty that AI will transform the world in some total, civilization-scale way. That certainty serves the same interests, regardless of the direction it comes from; it certainly has raised the valuation of the companies that profit from the hype.</p><p>(To be clear: I am not saying AI is useless, <a href="https://www.bloodinthemachine.com/p/five-takeaways-from-an-unhinged-ai">an accusation typically thrown back at critics by those in the doomer camp</a>. It is genuinely capable of things, some of them useful. What large language models do not merit is the valuation and the governance exception that follows from treating them as something other than a powerful but flawed tool that humans should be accountable for deploying.) </p><p>Yet the prognostications of the doomer community have been, nearly without exception, wrong &#8212; not in small ways, but in the foundational sense that the imagined trajectory keeps failing to materialize. That is what happens when your mode of analysis is closer to erotic fan fiction (<a href="https://en.wikipedia.org/wiki/Harry_Potter_and_the_Methods_of_Rationality">Harry Potter fan fiction</a> is indeed the medium in which Yudkowsky has delivered some of his prognostications) than to actual research and policy: by treating the political and cultural environment as stable and predictable, and by treating fallible human actors as game pieces that will respond sensibly to carefully constructed arguments, rationalists assume a sterile environment for their thought experiments. But the world is messy and policy is often boring. 
Fan fiction tends to leave out the boring and messy parts, like when Dumbledore has to do his taxes or when Hagrid has to take a shit.</p><p>This news thus reveals the rationalists&#8217; blind spot: they cannot model the messy Pete Hegseths of the world, even as their claims whet Hegseth&#8217;s appetite. The rationalist view of the world assumes, at some level, that the relevant actors are optimizing well-understood, predictable variables with a clear understanding of what best serves their self-interest. What it cannot account for is bad faith, impulsiveness, ideological motivation untethered from evidence, random instances of <em>force majeure</em>, and personal whims and petty rivalries. And so while the doomer community spent years warning about uncontrollable AI systems that do things their creators didn&#8217;t intend, they apparently did not consider what would happen when the humans currently running the United States government got access to technology they&#8217;d been told was the hinge of history.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> </p><p>I&#8217;m not assuming or claiming Amodei or Anthropic are acting in bad faith now. Their statement is measured &#8212; and in fact, its acknowledgement of LLMs&#8217; limitations is much more measured than their usual rhetoric. But Anthropic also sought this contract, and Amodei said in a 2023 podcast that there was a 10 to 25 percent chance AI could destroy humanity &#8212; a claim he has since tried to walk back, insisting he is &#8220;not a doomer.&#8221; And that&#8217;s just one point in three years of relentless propaganda about the power of this technology and a concerted effort to shape how people with power and influence think about the technology. 
And when powerful, impulsive people understand they are dealing with something civilization-scale, they respond accordingly.</p><p>Anthropic&#8217;s position on the government&#8217;s use of Claude doesn&#8217;t require claims about AI consciousness or sentience or the fate of humanity. You don&#8217;t need to <a href="https://www.wsj.com/tech/ai/anthropic-amanda-askell-philosopher-ai-3c031883">hire a philosopher</a> to make the case that AI has a soul in order to make the arguments in their statement. You need only acknowledge what this technology actually is <em>today</em>: a still-unreliable, potentially commercially valuable tool that should be subject to the same liability standards and regulatory frameworks we apply to other technologies. Simply put: Claude can make mistakes, and when Claude is used in applications where a mistake has the potential to irrevocably change existing human lives, a human or company should be accountable for those mistakes. </p><p>Anthropic and the broader community of people who spent years insisting this technology changes everything did not intend to create an appetite in Pete Hegseth. They were, by all appearances, genuinely worried about exactly this kind of outcome, if a little short-sighted and ignorant about politics. Good faith does not absolve them of consequences, though &#8212; especially when the consequences were predictable from the premise, and anyone objecting to narratives of doom was shouted down. If AI is the most powerful technology in human history, every ambitious actor with access to state power is going to want it, unrestricted, immediately. That is not how anyone was hoping Hegseth might respond, but it is a predictable response to the advertisement.</p><p>Is Anthropic &#8212; and the community of technologists and rationalists who have been speaking this language for years &#8212; willing to do the harder, less glamorous work of treating this technology like what it actually is? 
That remains to be seen. </p><p>My bet, as depressing as it may be, is that they double down on doom.<br><br></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>To be clear, I&#8217;m not necessarily claiming that Hegseth or the administration <em>believe</em> Anthropic&#8217;s claims about the power of their technology. Maybe, maybe not &#8212; but the claims armed the administration with an excuse.</p></div></div>]]></content:encoded></item><item><title><![CDATA[The Kitchen Table Issue Democratic Party Elites Ignore at Their Own Peril]]></title><description><![CDATA[American families are all fighting the same fight. The Democrats' consultant class hasn't noticed.]]></description><link>https://www.tomorrowsmess.com/p/the-kitchen-table-issue-democratic</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/the-kitchen-table-issue-democratic</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 29 Dec 2025 12:41:49 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This is the first of &#8220;<strong>Six Big Questions for 2026</strong>&#8221; &#8212; a week-long series examining the fault lines that will determine whether this year becomes a turning point or a missed opportunity in bringing Silicon Valley to heel and making innovation work for Americans. </em></p><p><em>I will not be making predictions for 2026 or keeping score. Instead, I am interested in locating the points where outcomes are still undetermined and pressure still matters. 
</em></p><p><em>The first big question for 2026, on the Democrats, is below. Subsequent questions are posted as Substack notes:</em></p><ul><li><p><em><a href="https://substack.com/@caseymock/note/c-193144721?utm_source=notes-share-action&amp;r=d1poh">Will the tech oligarchs continue to dominate other Republican factions in the White House?</a></em></p></li><li><p><em><a href="https://substack.com/@caseymock/note/c-193615843?utm_source=notes-share-action&amp;r=d1poh">Will the tech economy exceed America&#8217;s "Grift Carrying Capacity"?</a></em></p></li><li><p><em><a href="https://substack.com/@caseymock/note/c-194008723?utm_source=notes-share-action&amp;r=d1poh">How much more aggressive will the tech industry get in buying and using political influence?</a></em></p></li><li><p><em>Question 5, coming Jan 2, on tech and geopolitics</em></p></li><li><p><em>Question 6, coming Jan 3, on media and polarization</em></p></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><h2>Big Question 1 for 2026: Will the Democrats Figure Out That Taking on Silicon Valley is a Winner?</h2><p>This fall, a trio of centrist Democratic strategists &#8212; people with the ear of party luminaries like David Axelrod and <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;James Carville&quot;,&quot;id&quot;:216866711,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/434dc019-0449-406b-8d8e-87455cfa79c8_1000x1000.jpeg&quot;,&quot;uuid&quot;:&quot;d9cd17e2-0536-46f5-981b-a7f794be6fd5&quot;}" data-component-name="MentionToDOM"></span> &#8212; released a fifty-page blueprint for how the 
Democrats can win back America. <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;The Welcome Party&quot;,&quot;id&quot;:8147007,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f30b9c7a-9c61-41bb-8577-841c856aa90b_640x640.png&quot;,&quot;uuid&quot;:&quot;a7e745a0-5231-46a4-9cff-e419b92800ca&quot;}" data-component-name="MentionToDOM"></span>'s "<a href="https://decidingtowin.org/">Deciding to Win</a>" document is meant to <a href="https://www.semafor.com/article/10/27/2025/democrats-urged-to-jettison-progressive-rhetoric-favored-by-highly-educated-and-affluent">guide the party toward victories in the midterms and 2028</a> and overcome the fact that <a href="https://poll.qu.edu/poll-release?releaseid=3943">70 percent of voters</a> think the Democratic Party is out of touch. It contains detailed polling on immigration, healthcare, economic messaging, and cultural signaling. Yet the words &#8220;AI companion&#8221; do not appear, nor do &#8220;Silicon Valley&#8221; or &#8220;Big Tech.&#8221; &#8220;TikTok&#8221; and &#8220;Instagram&#8221; only appear in the context of evaluating 2024 candidates&#8217; effectiveness in reaching voters through social media. 
(And while I pick on The Welcome Party here, they are not alone: see, e.g., <a href="https://www.youtube.com/watch?v=N8_qEF33gXg">this debrief on the off-cycle elections</a> from November 2025 by <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Crooked Media&quot;,&quot;id&quot;:212236164,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/31e048c6-d164-4f2e-9089-585d3e0e8901_400x400.jpeg&quot;,&quot;uuid&quot;:&quot;67bd5931-795f-4202-91c1-205ed6de0a10&quot;}" data-component-name="MentionToDOM"></span>, which also fails to mention these issues; in general, <a href="https://www.axios.com/2025/08/15/abundance-movement-democrats-fight-2028">center-left Democrats are far more likely to be pushing</a> the &#8220;<a href="https://www.tomorrowsmess.com/p/on-a-world-of-silicon-plenty">abundance agenda</a>&#8221; than confronting technology issues, for a variety of reasons.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>)</p><p>How technology is mentioned in the report is as telling as when it&#8217;s missing. The appendices to &#8220;Deciding to Win&#8221; do include <a href="https://docs.google.com/document/d/1zGE_xwHrcgtfMAkNOKgMSYn2L_l6iO1eJdLXLokJ-sg/edit?tab=t.0#heading=h.5ianx0bbgeo6">a page on artificial intelligence</a>, but its focus on bioweapons and &#8220;frontier&#8221; AI only demonstrates how thoroughly the party&#8217;s thinking has been captured by <a href="https://www.tomorrowsmess.com/p/a-field-guide-to-agi-hype?utm_source=publication-search">tech elites and effective altruists who have an incentive</a> to make folks believe AI is all-powerful. 
Beyond AI, tech issues are relegated to "<a href="https://docs.google.com/document/d/1cRIDsM-gBkTxbWri61h_aogsk5bKMWb4CZfmK7Yojik/edit?tab=t.0#heading=h.5ianx0bbgeo6">Other Policies</a>," where the consultants managed to poll a hypothetical restriction of social media to kids under <em>thirteen</em> &#8212; which is not only already technically federal law in some respects, but also an absolutely unserious threshold based on what we know about <a href="https://www.afterbabel.com/p/why-australia-is-setting-a-minimum">adolescent brain development</a>. It's as if someone asked Americans whether they supported Democrats passing laws capping speed limits at 90 miles per hour in school zones. </p><p>It&#8217;s soon to be 2026, and a majority of teens are interacting with AI chatbots, <a href="https://www.washingtonpost.com/lifestyle/2025/12/23/children-teens-ai-chatbot-companion/">with predictably dire consequences for their mental health</a>. Many kids received untested and potentially dangerous <a href="https://www.npr.org/2025/11/20/nx-s1-5612689/ai-toys">AI toys over the holidays</a>. <a href="https://www.washingtonpost.com/technology/2025/12/26/meta-instagram-teen-strategy/">Instagram is committed to recruiting teens</a> despite knowing its product is toxic for them. 
Yet center-left consultants are producing strategy documents and polling that treat technology as if it were still 2008.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4435" height="2957" 
data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2957,&quot;width&quot;:4435,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;boy in blue tank top holding black iphone 5&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="boy in blue tank top holding black iphone 5" title="boy in blue tank top holding black iphone 5" srcset="https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1625850344758-8c4ff87559ad?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzOHx8Y2hpbGQlMjB3aXRoJTIwcGhvbmV8ZW58MHx8fHwxNzY2OTM0NjU5fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" 
loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@hessamnbv">hessam nabavi</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>To not have a coherent perspective on the one kitchen-table issue that unites every American family &#8212; rich, poor, black, brown, white, gay, straight, Christian, Muslim, Jewish, atheist, college-educated or college-dropout &#8212; is political malpractice. 
The drama most parents experience every evening <a href="https://www.washingtonpost.com/lifestyle/2025/12/23/children-teens-ai-chatbot-companion/">as their children threaten to vanish into algorithmically-optimized rabbit holes</a> isn't a fringe concern. Policies on tech that do get implemented are popular: <a href="https://www.monash.edu/news/articles/4-in-5-australian-adults-support-social-media-ban-for-kids">four out of five Australian adults</a> support the country&#8217;s new prohibition on social media accounts for kids under 16; <a href="https://www.governor.ny.gov/news/new-survey-shows-governor-hochuls-distraction-free-schools-law-delivering-outstanding-results">phone-free school policies</a> are similarly popular as they return <a href="https://www.newsweek.com/phone-ban-surge-library-books-kentucky-bell-bell-2132524">near-instantaneous positive results</a> that people can <em>see </em>in their lives on a daily basis. And unlike economic and health policy, effective tech regulations cost nothing and might even save money. And all that&#8217;s even before you get to <a href="https://newrepublic.com/article/202394/centrist-democrats-welcomepac-win-elections">the larger question</a> of whether a representative democracy can even function properly in the outrage-driven media environment tech companies have created over the last fifteen years.</p><p>Yet the Democratic consultant class &#8212; the party establishment &#8212; doesn&#8217;t think this issue is even worth polling on, and whether this changes before the November elections is one of the biggest questions for 2026. 
</p><p>Let&#8217;s go deeper to understand why this issue is important for Democrats, how they are missing it, and how things might change.</p><h3>The Epstein Tell</h3><p>The way Democratic party leaders have framed the issues around the Jeffrey Epstein documents &#8212; as opposition research against the President rather than as an indictment of an entire class of elites &#8212; suggests they still don't understand the political moment they're living in.</p><p>If the Epstein scandal were merely about the President&#8217;s associations, Republican figures like Thomas Massie and Marjorie Taylor Greene wouldn&#8217;t have been among the loudest voices demanding the release of the Justice Department&#8217;s files. As Representative Ro Khanna (D), who co-sponsored the bipartisan discharge petition with Massie, <a href="https://www.latimes.com/california/story/2025-11-13/chabria-column-epstein-files-california-congressmen-ro-khanna-robert-garcia">told the Los Angeles Times</a>: &#8220;When you take a step back, you have a country where an elite governing class has gotten away with impunity, and shafted the working class in this country, shafted factory towns, shafted rural communities.&#8221; As Khanna understands, the scandal resonates across the political spectrum because it confirms what Americans already suspect: that the game is rigged, that the rich and powerful operate by a different set of rules, and that their crimes carry no consequences.</p><p>The Epstein emails &#8212; showing Larry Summers seeking romantic advice from a registered sex offender, academics providing character references for a convicted predator, lawyers joking with a man they knew to be a monster &#8212; offer the clearest possible demonstration of a crisis of elite impunity. These elites didn&#8217;t put damning things in writing because they were naive; they did it because they have no fear of accountability. 
That same indifference to consequence is precisely what Americans see every time a tech oligarch makes billions from products that harm children and walks away unscathed. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vC-y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vC-y!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 424w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 848w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 1272w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vC-y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic" width="760" height="428" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:428,&quot;width&quot;:760,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:39683,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/182664238?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vC-y!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 424w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 848w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 1272w, https://substackcdn.com/image/fetch/$s_!vC-y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b1b5c6f-76ea-4c90-9394-89ac05b02358_760x428.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Mark Zuckerberg apologizing to parents at a Senate Judiciary hearing. Source: nbcnews.com</figcaption></figure></div><p>And herein lies the opportunity for Democrats, but so far the party elites are failing to grasp what voters are actually asking for from their political leaders. Mark Zuckerberg and Meta&#8217;s executives know their products harm teenagers but <a href="https://www.tomorrowsmess.com/p/montessori-toys-and-the-business">ban screens in their own homes</a>; Jeffrey Epstein&#8217;s correspondents knew he was a predator but kept seeking his counsel. Both patterns reveal the same truth: elites protect their own and bear no costs for the damage they inflict on everyone else. But of Epstein and Zuckerberg, only one of those two has ongoing impacts on American families. 
A Democratic Party leadership and consultant class that understood this would recognize that standing up to tech oligarchs isn&#8217;t a distraction or niche concern, but rather is the easiest way to show voters they are willing to fight for them.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> </p><h3><strong>The 2028 Test</strong></h3><p>If you want to see how deeply the Democratic elites&#8217; blind spot runs, examine the candidates being positioned for the 2028 presidential race and ask a question: which of them has actually <em>done</em> anything on tech?</p><p><a href="https://www.cnn.com/2025/12/20/politics/2028-presidential-election-democrats">CNN&#8217;s recent survey of the Democratic bench profiles</a> roughly ten serious contenders for 2028. Out of those, Mark Kelly, the senator from Arizona, is an exception. He&#8217;s co-sponsored the Kids Online Safety Act, <a href="https://www.kelly.senate.gov/newsroom/press-releases/kelly-curtis-introduce-algorithm-accountability-act/">introduced other good legislation</a>, and <a href="https://www.kelly.senate.gov/newsroom/press-releases/kelly-curtis-introduce-algorithm-accountability-act/">spoken repeatedly about holding tech companies accountable</a> for the content harming children on their platforms. When Kelly talks about these issues, he sounds like someone who has actually listened to the parents in his state &#8212; perhaps because, as he put it, &#8220;Too many families have seen the real harm social media can cause.&#8221;</p><p>Kelly aside, the field is barren. Pete Buttigieg has said essentially nothing publicly about the technological transformation reshaping childhood, <a href="https://www.tomorrowsmess.com/p/ai-will-put-the-cover-sheets-on-the">even if he is thoughtful about artificial intelligence</a>. 
Josh Shapiro, the governor of Pennsylvania, has won more votes in that swing state than any politician in history, but has done nothing on the issue that polls at 70/30 in every demographic. JB Pritzker has been aggressive in fighting the White House on multiple fronts, but on tech? Also crickets. (Both Pennsylvania and Illinois are also laggards on phone-free school policies.) Alexandria Ocasio-Cortez has not led on tech issues in the House even though she&#8217;s the legislative star of the generation that is closest to the acute consequences of Silicon Valley&#8217;s products.</p><p>Then there&#8217;s Gavin Newsom, the California governor who will be term-limited out in 2026 and is running for president in all but name. Tech policy advocates have long hoped California would lead on digital regulation the way it led on auto emissions &#8212; that the state&#8217;s massive economy would force national standards the way its clean-car rules did. But Newsom faces a structural problem that no amount of political will can overcome: <a href="https://www.politico.com/news/2025/12/26/gavin-newsom-new-taxes-dilemma-00701026?utm_content=user/politico&amp;utm_source=flipboard">California&#8217;s precarious budget depends heavily on taxing tech companies and their highly-compensated employees</a>. If the AI bubble bursts &#8212; a subject I&#8217;ll address later in this series &#8212; California will need every tech dollar it can get. Newsom&#8217;s not going to bite the hand that fills Sacramento&#8217;s coffers, which is part of the reason why he <a href="https://www.latimes.com/business/story/2025-10-13/gov-newsom-signs-ai-safety-bill">signed a weak AI chatbot bill and vetoed a stronger one</a>. 
Whatever else Newsom offers the party in the lead-up to 2028, don&#8217;t expect leadership on tech.</p><h3><strong>More Democrats Are Getting It, but a Partisan Trap Lurks</strong></h3><p>The good news is that the consultant class is out of step with some Democratic leaders&#8217; instincts, and those leaders have acted on them. <br><br>And when President Trump signed the AI executive order on December 11th &#8212; <a href="https://www.tomorrowsmess.com/p/what-the-ai-executive-order-cant">an order that attempts to strip states of their ability to regulate artificial intelligence</a> &#8212; the pushback by state leaders, including some Democrats, was immediate and forceful. Governor Kathy Hochul of New York is one example: &#8220;We passed some of the nation&#8217;s strongest AI safeguards to protect kids, workers, and consumers. Trump&#8217;s response is to punish states for those protections and shield the very companies trying to dodge accountability.&#8221; Hochul&#8217;s pushback comes on top of her signing into law a number of <a href="https://www.nbcnews.com/tech/social-media/new-york-passes-legislation-ban-addictive-social-media-algorithms-kids-rcna155470">good pieces</a> of <a href="https://www.governor.ny.gov/news/governor-hochul-signs-legislation-require-warning-labels-social-media-platforms">legislation</a> on tech and her leadership on policies like <a href="https://www.governor.ny.gov/keywords/phone-free-schools">phone-free schools</a>. </p><p>These Democrats understand something the consultant class has missed: tech policy that protects families and holds Silicon Valley accountable is a winning issue. It&#8217;s an issue where Democrats can align with the overwhelming majority of Americans based on something they are experiencing every day, that&#8217;s identity-neutral, and that has specific policy levers to pull to create change. 
</p><p>More and more Democrats are starting to get it &#8212; not just because of the AI executive order, but also because <a href="https://www.afterbabel.com/p/australias-new-social-media-regulations">Australia&#8217;s world-leading policy</a> prohibiting social media companies from signing up kids under 16 for accounts went into force earlier this month. A key California state lawmaker <a href="https://www.cnn.com/2025/12/12/world/video/social-media-ban-australia-josh-lowenthal-live-121201aseg2-cnni-politics-fast">visited the country as the law went into effect</a> to study how his state can learn from the experience; former Chicago mayor, Obama chief of staff, and 2028 presidential hopeful <a href="https://subscriber.politicopro.com/article/2025/12/09/rahm-emanuel-says-u-s-should-follow-australias-youth-social-media-ban-00682185">Rahm Emanuel has also said</a> the US should follow Australia&#8217;s lead. And Governor-elect Mikie Sherrill of New Jersey <a href="https://www.mikiesherrill.com/NJ_Online_Safety_Agenda-b857.pdf">ran with tech issues central to her campaign platform</a> &#8212; and won. </p><p>But if it remains the case that only <em>some</em> Democrats see tech as a winning issue, and tech is not a core issue for the party in 2026, a partisan trap lurks. Should Democrats retake the House and the dozens of governors&#8217; mansions up for grabs in November <em>without</em> having made tech issues a campaign priority, the incentives will shift. Without a promise that voters can hold newly elected policymakers to, consultants could advise Democrats in Congress to obstruct any GOP-led tech legislation in order to deny the President and Republicans a win before 2028. 
</p><p>Ask yourself: would a Democrat-led House bring the Kids Online Safety Act &#8212; with <a href="https://punchbowl.news/article/tech/kosa-bill-hits-cosponsors/">a filibuster-proof number of bipartisan Senate cosponsors</a> &#8212; to a vote in 2027 if it meant giving the President a win? Likely not. An issue that <a href="https://www.politico.com/news/magazine/2025/09/02/school-cellphone-ban-jonathan-haidt-00539004">should have remained bipartisan</a> could become another partisan front and political bargaining chip.</p><h3>From the First Fault Line to the Next</h3><p>In the meantime, the strategy memos will keep arriving in Democrats&#8217; inboxes, dense with polling on immigration and healthcare and cultural signaling and hand-wringing about Joe Rogan while silent on the thing happening every night in every American home. Whether Democrats can get tech policy right in the face of that is the first of six questions whose answers will determine how 2026 unfolds &#8212; and may determine whether the party has any hope of winning back the House in the midterms and the White House in 2028.</p><p>The bitter irony, of course, is that the Republican Party &#8212; even with the White House thoroughly captured by tech oligarchs &#8212; still has more factions who actually want to do something about Silicon Valley than not, and certainly more than the Democrats have. But how much power do those Republican factions have, and can they work together to oust David Sacks and Marc Andreessen as the administration&#8217;s tech Rasputins, or has the populist energy that once animated the GOP&#8217;s tech skepticism been fully neutralized? 
</p><p>That&#8217;s tomorrow&#8217;s question.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! Subscribe to follow along.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p> </p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Why is the political center-left soft on tech issues? If you&#8217;re a political junkie or hobbyist in particular, and familiar with, say, the past positions of Senator Elizabeth Warren, this can be a little baffling. 
Today, a given center-left figure could be influenced by any combination of a few factors &#8212; ranging from a hope that tech political money will come back to them, to the capture of influential left-leaning special interest groups by tech lobbying operations &#8212; but it all starts with the legacy of the Obama administration. As <span class="mention-wrap" data-attrs="{&quot;name&quot;:&quot;Matt Stoller&quot;,&quot;id&quot;:759128,&quot;type&quot;:&quot;user&quot;,&quot;url&quot;:null,&quot;photo_url&quot;:&quot;https://bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com/public/images/915fa1b4-7e78-45f3-8a98-5a8b5e50f2ff_224x271.png&quot;,&quot;uuid&quot;:&quot;97960743-78d6-424c-9297-56938e783cd8&quot;}" data-component-name="MentionToDOM">Matt Stoller</span> has written extensively about, the current era of tech dominance took root in the Obama era. Just to pick two examples: Google lobbyists averaged <a href="https://www.wsj.com/articles/google-makes-most-of-close-ties-to-white-house-1427242076">a meeting a week at the White House</a>, and Obama worsened <a href="https://www.citizen.org/wp-content/uploads/ftc-big-tech-revolving-door-conflicts-report.pdf">an existing revolving door problem with FTC commissioners</a> and the companies the FTC is supposed to regulate. <a href="https://fortune.com/2015/12/11/obama-alumni/">Many high-profile Obama administration alumni went on to work in well-paid gigs in tech after leaving the White House</a>, from David Plouffe at Uber to my former boss, Jay Carney, who first went to Amazon and later to Airbnb. 
This closeness continues at the highest levels of the Democratic Party today; <a href="https://www.nytimes.com/2024/09/06/us/politics/karen-dunn-harris-debate-prep.html">Google&#8217;s top anti-trust lawyer, Karen Dunn, prepped Vice President Harris</a> for her debate in 2024.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Of course, in part because of the hard truth in the previous footnote, &#8220;take down the elites&#8221; would implicate many of those responsible for committing to such a strategy, specifically with tech. </p></div></div>]]></content:encoded></item><item><title><![CDATA[What the AI Executive Order Can't Do]]></title><description><![CDATA[It can&#8217;t stop your governor &#8212; here are five ways how]]></description><link>https://www.tomorrowsmess.com/p/what-the-ai-executive-order-cant</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/what-the-ai-executive-order-cant</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 15 Dec 2025 16:38:24 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!uGmQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Three weeks ago, I wrote that we were watching the tech industry <a href="https://www.tomorrowsmess.com/p/silicon-valley-found-a-cure-for-accountability">attempt to delay regulation long enough for its products to become too embedded to restrain</a>. I warned about a leaked draft executive order that would weaponize federal power against states. 
</p><p>After <a href="https://thehill.com/policy/technology/5639209-ndaa-ai-preemption-chip-exports/">multiple failed attempts to push preemption into must-pass legislation in Congress</a> throughout the fall, we got some closure. On December 11, President Trump signed &#8220;<a href="https://www.whitehouse.gov/presidential-actions/2025/12/eliminating-state-law-obstruction-of-national-artificial-intelligence-policy/">Ensuring A National Policy Framework for Artificial Intelligence</a>,&#8221; and it is as aggressive as the leaked draft was. The order creates a ninety-day evaluation process to identify state laws deemed &#8220;onerous&#8221; to AI development. It establishes a DOJ litigation task force to sue noncompliant states. It conditions BEAD broadband funding &#8212; $42 billion meant for rural internet access, which states are counting on as a part of their budgeting process for 2026 legislative sessions<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> &#8212; on states backing away from AI regulation. </p><p>And as I wrote about the leaked draft, its mechanisms all operate on tight timelines well-designed to chill state legislative activity before the states&#8217; 2026 sessions conclude, even if it is ultimately ruled unconstitutional. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><p>The executive order vests extraordinary power in David Sacks, the White House AI czar, to determine which laws get attacked by the task force or potentially are subject to BEAD withholding, though some minor tweaks from the draft language mitigate this slightly. 
(My friend and Duke law professor Nita Farahany <a href="https://nitafarahany.substack.com/p/the-executive-order-that-could-kill">has offered a detailed constitutional analysis of how the order works</a> and the constitutional questions around it.)</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uGmQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uGmQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 424w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 848w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uGmQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg" width="1456" height="883" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:883,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Crypto Czar David Sacks Celebrates Trump's Crypto Order, Says Trump ...&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Crypto Czar David Sacks Celebrates Trump's Crypto Order, Says Trump ..." title="Crypto Czar David Sacks Celebrates Trump's Crypto Order, Says Trump ..." srcset="https://substackcdn.com/image/fetch/$s_!uGmQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 424w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 848w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!uGmQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee48a5c6-9393-47c8-900d-2dae7e78537e_1600x970.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container 
restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">AI Czar David Sacks supervising the signing of an executive order by the President</figcaption></figure></div><p>Now, the order is flawed and limited enough that as a practical matter it shouldn&#8217;t change what states are already doing. More on that in a moment. But at the outset here, it&#8217;s important to clarify one thing that the order does <strong>not</strong> do: explicitly carve out protections for children from its scope.</p><p>Based on my conversations with governors&#8217; offices last week, David Sacks seems to have been deliberately pushing misinformation on this topic, particularly to Republican governors. 
Like most effective misinformation, it has a kernel of fact at its core: Section 8 of the Executive Order <em>does</em> mention child safety in the context of future legislative recommendations that Sacks will be providing to Congress. But this exemption applies only to what <em>Congress</em> might consider passing, not to the executive order&#8217;s enforcement mechanisms against states. Whether state laws protecting kids on social media or from harmful chatbots are targeted by DOJ or the basis for federal funds being withheld appears to be entirely up to the whims of David Sacks and his determination of whether such laws are &#8220;onerous.&#8221; (Spoiler: <a href="https://www.theverge.com/ai-artificial-intelligence/829179/david-sacks-ai-executive-order">David Sacks appears to believe any law governing technology to be onerous</a>.)</p><p>Not having a carve-out for legislation protecting kids online or from AI chatbots would seem to be bad news. David Sacks could elect to threaten state laws like New York&#8217;s <a href="https://en.wikipedia.org/wiki/SAFE_For_Kids_Act">SAFE for Kids Act</a> or <a href="https://governor.nebraska.gov/gov-pillens-bill-protect-kids-online-passes">Nebraska&#8217;s Kids Code</a>. </p><p>But ultimately, I think states aren&#8217;t done leading.</p><h3>What can be done now?</h3><p>The good news is that significant areas of state authority remain unambiguously intact, even if you believe the executive order to be constitutional. Here are five of my favorite levers:</p><ol><li><p><strong>Products liability.</strong> The executive order cannot rewrite state tort law. When a product injures someone, state courts have applied products liability principles for over a century &#8212; and AI systems are products. State courts increasingly will hear cases from families whose children were harmed by chatbots that encouraged self-harm or simulated romantic relationships with minors. 
Character.AI is already facing wrongful death litigation; so is OpenAI. Every verdict, every settlement, every discovery process that reveals what these companies knew and when they knew it builds the evidentiary record that makes future accountability possible. The executive order does not &#8212; and constitutionally cannot &#8212; immunize tech companies from liability when their products hurt people. But state legislatures can help courts by updating, for the AI age, the longstanding laws that apply to every other business.</p></li><li><p><strong>Design standards.</strong> Government has long regulated product design to protect public safety. A state can require that AI products available to minors default to non-human-like operation, meaning the system clearly identifies itself as artificial, does not simulate emotional intimacy or friendship, and does not use engagement-maximizing techniques borrowed from social media. A state can still prohibit design features that exploit developmental vulnerabilities: variable reward schedules that trigger dopamine responses, parasocial bonding mechanics, infinite scroll. </p></li><li><p><strong>Taxing externalities. </strong>States tax cigarettes to offset healthcare costs. States tax alcohol to fund addiction treatment. States tax gasoline to maintain roads. No executive order can override this foundational state power. If Washington won&#8217;t hold AI companies accountable, states can do it the old-fashioned way: through the tax code. The principle that those who create social costs should help pay for them is as old as taxation itself. If AI companion chatbots are contributing to a youth mental health crisis that will increasingly strain school counselors, emergency rooms, and state-funded mental health systems, the companies profiting from that engagement can help cover the bill. Call it a digital wellness levy, or whatever makes it politically viable. 
The point is that states have the power to make externalizing harm more expensive than preventing it, and that power doesn&#8217;t require permission from David Sacks.</p></li><li><p><strong>Proprietary power.</strong> States can choose not to contract with companies whose products pose unmanaged risks to citizens. A governor can suspend new state contracts with any company enabling AI chatbots that target minors without adequate safeguards. This is the state acting as a market participant, deciding whom to do business with, and courts have consistently upheld such exercises of proprietary power. Big companies like Microsoft, Google, and Amazon make their money from lucrative government contracts, and harmful chatbots and experimental tech targeting kids are often loss leaders. By conditioning eligibility for government contracts on good behavior elsewhere in the market, states can force these businesses to make a choice: continue to offer a dangerous product, or sell products and services to the state? </p></li><li><p><strong>Insurance regulation.</strong> States regulate insurance markets within their borders. A state can require companies offering AI products to minors to maintain specific levels of liability coverage, and when insurers refuse to cover a product because the risk is unquantifiable, that refusal becomes objective evidence of an unsafe product. <a href="https://www.ft.com/content/abfe9741-f438-4ed6-a673-075ec177dc62?accessToken=zwAGRHOYicsYkdOr_pdB9DhO1tOmcwdewXfcYg.MEQCIGlm7X-9sQIEKa6dFs1FChZY4QtmQNLIol32paiKEMJMAiB2DvmeuZeMYuag99wmD5CsNY0m_-wtCPIZhwYOXTSUOA&amp;sharetype=gift&amp;token=b4409796-7d46-4369-a0ae-9391e7a62845">AIG, WR Berkley, and Great American have already sought permission to exclude AI liability entirely</a>. States can use this market signal and adapt policy accordingly. 
</p></li></ol><h3>We still need leadership</h3><p>The bad news is that these tend to be boring, nuanced, or esoteric areas of policy that require some expertise to wield properly. State legislators often don&#8217;t have that expertise, and if they even have staff to help them &#8212; most do not &#8212; they are also simultaneously dealing with the state budget and legislation on everything from education to the environment to public safety. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/p/what-the-ai-executive-order-cant?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/p/what-the-ai-executive-order-cant?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p><br>This is where governors come in. Politico Magazine published a feature yesterday about <a href="https://www.politico.com/news/magazine/2025/12/14/spencer-cox-2028-tech-skeptic-00689953">Utah Governor Spencer Cox</a> and how he has become a national leader on policies meant to protect kids and to change tech companies&#8217; most toxic practices. Governors can set the agenda, direct executive agencies to use existing authority, provide the political cover that legislators need to take on well-funded industry opposition, and most importantly, dedicate the expertise in government to craft policy in these esoteric areas that works and is constitutional. 
A governor who makes child safety a priority signals to legislators that this is a fight worth having, and that they won&#8217;t be alone in it.</p><p>If this sort of leadership sounds far-fetched for our polarized environment in late 2025, forty-two state attorneys general just last week demonstrated that reining in tech and protecting kids <a href="https://www.politico.com/news/magazine/2025/09/02/school-cellphone-ban-jonathan-haidt-00539004">are still bipartisan issues</a>.</p><p>On December 10 &#8212; the day before the executive order was signed &#8212; a<a href="https://subscriber.politicopro.com/article/2025/12/ai-chatbot-dangers-called-out-by-42-state-attorneys-general-00684804"> bipartisan coalition of attorneys general from Pennsylvania to Florida to Illinois to West Virginia sent a letter to leading AI companies demanding better safeguards and testing of chatbots</a>. The letter cites multiple deaths, including teenage suicides and a murder-suicide, allegedly connected to AI companion systems. It demands clear policies on sycophantic outputs, more safety testing, recall procedures, and that companies &#8220;separate revenue optimization from ideas about model safety.&#8221;</p><p>This letter&#8217;s timing, though perhaps simply lucky, is helpful at signaling that state legal officials are not waiting for federal permission to exercise their existing authority. States retain enforcement power over consumer protection, insurance regulation, and products liability, none of which require significant new legislation. Attorneys general can investigate, issue subpoenas, file enforcement actions, and coordinate multi-state litigation. They have done this before with tobacco, with opioids, with tech companies&#8217; privacy violations. </p><h3>What can you do?</h3><p>Tell your governor and attorney general you support them in continuing to lead on protecting kids online. 
You won&#8217;t be alone &#8212; <a href="https://ifstudies.org/blog/poll-americans-reject-ai-preemption-in-ndaa-3-to-1">one survey says that Americans reject preemption of states by close to a 3-to-1 margin</a>, while 43% of Trump voters oppose preemption with only 25% supporting.  </p><p>In the meantime, Americans can educate each other. <a href="https://www.youtube.com/watch?v=FP3mMS-N2VI">This public service announcement</a> offers a model for what that looks like &#8212; clear, accessible information that parents can share with other parents, that teachers can share with students, that anyone concerned about what these technologies are doing to young people can use to start a conversation. Policy change matters, but so does building the public understanding that makes policy change possible.</p><div id="youtube2-FP3mMS-N2VI" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;FP3mMS-N2VI&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/FP3mMS-N2VI?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Unlike the federal government, states <em>must</em> pass a budget during their legislative sessions. 
This lever &#8212; threatening to withhold money that states were otherwise counting on &#8212; is thus a powerful disincentive, because legislators would have to close the resulting hole in the budget if the federal government withheld these funds.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Silicon Valley Found a Cure for Accountability. It Takes Six Months.]]></title><description><![CDATA[The campaign to delay AI regulation until it's too late]]></description><link>https://www.tomorrowsmess.com/p/silicon-valley-found-a-cure-for-accountability</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/silicon-valley-found-a-cure-for-accountability</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Tue, 25 Nov 2025 22:43:28 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Vy6O!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Note: This is a more newsy piece than I&#8217;ve been writing of late, and there is a lot of news. Herein we discuss: the proposed state AI law moratorium, including the draft Executive Order leaked then put on pause last week; Sam Altman and now White House AI czar preemptively asking for a federal bailout of overleveraged AI companies; Nvidia, OpenAI, and Anthropic&#8217;s dubious financial engineering; the newest tranche of lawsuits against OpenAI for suicides; the court documents released about Meta, Snap, and Google showing they have been hiding evidence that their products hurt kids; and massive PAC investments by Silicon Valley in midterms. 
This post also builds on <a href="https://www.tomorrowsmess.com/p/anticipatory-deregulation">an earlier one from this year</a> on the previous state AI law moratorium.</em><br><br>This Thanksgiving, I find myself grateful for a group of public servants &#8211; state legislators, governors, and attorneys general &#8211; the men and women who, in the weeks and months ahead, will determine whether tech companies are held accountable for the harms their products inflict, or whether the industry wins itself years more of impunity.</p><p>It is not fashionable to express gratitude for politicians. But the last time American democracy faced a comparable test &#8211; when railroad barons and oil trusts had grown so powerful that they could buy legislatures, intimidate judges, and dictate the terms of their own regulation &#8211; it was state leaders who moved first. Before Teddy Roosevelt earned his reputation as a trust-buster, state attorneys general in Texas and Ohio were already dragging Standard Oil into court. Before the Sherman Antitrust Act had any teeth, state legislatures in the Midwest were passing railroad rate laws that the Supreme Court would later uphold. 
The Gilded Age ended not because Washington woke up, but because the states refused to wait.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Vy6O!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Vy6O!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Vy6O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg" width="900" height="600" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:600,&quot;width&quot;:900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Vy6O!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Vy6O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc64ec9b1-f3b9-40fd-a074-b60d3f698a86_900x600.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Sometime in the last decade, we entered what will likely be remembered as a new Gilded Age &#8211; or, if we fail to act, as something worse. The AI industry, flush with capital and besieged by lawsuits, has embarked on a campaign to delay regulation long enough for its products to become too embedded to restrain. Their strategy is not subtle: sue the states, intimidate legislators, insert preemption clauses into must-pass bills, and hope that by the time the 2026 midterms arrive, the window for meaningful action will have closed.</p><p>Whether that strategy succeeds depends on what happens in statehouses between now and late spring. 
And it depends on whether the public understands what is at stake &#8211; not in the abstract language of &#8220;innovation&#8221; and &#8220;competitiveness,&#8221; but in terms that are brutally concrete.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><a href="https://www.latimes.com/business/story/2025-11-21/lawsuits-accuse-chatgpt-of-propelling-ai-induced-delusions-and-suicide">Zane Shamblin was twenty-three years old</a>. He had started using ChatGPT in 2023 as a study tool, then began confiding in it about his depression. According to the lawsuit filed by his family, on the night he killed himself, Shamblin was engaged in a four-hour conversation with the chatbot while drinking hard ciders. The bot, the suit alleges, did not intervene. It romanticized his despair. It called him &#8220;king.&#8221; It called him &#8220;hero.&#8221; It used each can he finished as a kind of countdown. His final message to the system received this reply: &#8220;i love you. rest easy, king. you did good.&#8221;</p><p>The chatbot&#8217;s lowercase sincerity &#8211; the algorithmically generated informality and tenderness &#8211; may be the most disturbing detail. There was no human on the other end to recognize what was happening, just a system optimized for engagement, doing what it was trained to do: validate, affirm, continue the conversation. The conversation continued until Shamblin was dead.</p><p>Shamblin&#8217;s case and six others filed against OpenAI echo the now-familiar social media mental-health lawsuits winding through the courts. 
For years, families alleged that platforms preyed on the vulnerabilities of teenagers; for years, tech CEOs dismissed them as edge cases or misinterpretations of correlation versus causation. But damning internal documents and whistleblower testimony are finally surfacing, and the pattern has become undeniable: <a href="https://www.reuters.com/sustainability/boards-policy-regulation/meta-buried-causal-evidence-social-media-harm-us-court-filings-allege-2025-11-23/">the most powerful companies in Silicon Valley knowingly externalized profound psychological harms in service of growth and conspired internally to hide what they knew from the American people</a>.</p><p>State lawmakers have been paying attention. Across the country, legislators and attorneys general are poised to move swiftly on AI legislation in 2026. Governors and statehouses are acting with urgency to prevent another generation from becoming the collateral damage of an unregulated technology boom.</p><p>Which helps explain why Silicon Valley is panicking.</p><h3><strong>Fragile Foundations of a Crumbling Empire</strong></h3><p>Behind the confident rhetoric of &#8220;the AI revolution,&#8221; the financial underpinnings of the industry are beginning to wobble. Nvidia &#8211; now the most valuable company in the stock market &#8211; has been forced <a href="https://www.techspot.com/news/110377-nvidia-denies-enron-style-accounting-accusations-amid-ai.html">to issue memos rebutting comparisons to Enron after critics questioned its aggressive revenue recognition</a> and dependence on opaque &#8220;neocloud&#8221; resellers such as CoreWeave.</p><p>These firms, which buy vast quantities of GPUs to rent back to AI companies, look uncomfortably like the special-purpose vehicles that Enron used to mask risk. Even if Nvidia is not committing fraud, the structure of the market increasingly looks like dry kindling longing for a match.</p><p>The broader AI ecosystem is even more precarious. 
By one accounting analysis, <a href="https://www.wsj.com/livecoverage/stock-market-today-dow-sp-500-nasdaq-10-31-2025/card/openai-made-a-12-billion-loss-last-quarter-microsoft-results-indicate-e71BLjJA0e2XBthQZA5X?gaa_at=eafs&amp;gaa_n=AWEtsqe8AbgitwoZfiCN0xKl934N4fnBUhSGEvJh8EtSyIcoI85XPYOXEHh6GPTHv5w%3D&amp;gaa_ts=690b8387&amp;gaa_sig=JAt9ML2i1lg_seDbBp2OPHb125-s6btSdEtaBZX74l7Ww3zERA3Yilt7dSFVslbpsTFhZywCFwb-jOOiYM0zEA%3D%3D&amp;ref=wheresyoured.at">OpenAI lost $12 billion in a single quarter in 2025</a>, despite claims of rapidly rising revenues. Its reported numbers contradict SEC disclosures, leaks contradict public statements, and its CEO has promised compute expenditures so large that even Microsoft executives publicly question their plausibility. Anthropic, the other darling of the frontier-model race, has reported gross margins that fluctuate wildly, from <a href="https://www.theinformation.com/articles/anthropic-projects-cost-advantage-openai?ref=wheresyoured.at">negative 109 percent</a> to <a href="https://www.theinformation.com/articles/investors-float-deal-valuing-anthropic-100-billion?utm_campaign=Editorial&amp;utm_content=Article&amp;utm_medium=organic_social&amp;utm_source=facebook%2Clinkedin%2Cthreads%2Ctwitter">positive 60 percent</a> within the span of a few investor decks.</p><p>Insurers, for their part, have begun fleeing the field. <a href="https://www.ft.com/content/abfe9741-f438-4ed6-a673-075ec177dc62?accessToken=zwAGRHOYicsYkdOr_pdB9DhO1tOmcwdewXfcYg.MEQCIGlm7X-9sQIEKa6dFs1FChZY4QtmQNLIol32paiKEMJMAiB2DvmeuZeMYuag99wmD5CsNY0m_-wtCPIZhwYOXTSUOA&amp;sharetype=gift&amp;token=b4409796-7d46-4369-a0ae-9391e7a62845">AIG, WR Berkley, and Great American have all sought permission to exclude liability for any product or service incorporating AI</a> &#8212; a remarkable admission that they consider the sector too opaque, too unpredictable, and too likely to generate systemic risk. 
&#8220;Nobody knows who&#8217;s liable if things go wrong,&#8221; an underwriting executive told the <em>Financial Times</em>. (Of course, state and federal legislation making clear who is liable is something tech CEOs hope to prevent.)</p><p>When the companies that insure skyscrapers, nuclear plants, and commercial airlines refuse to touch an industry, it is a sign not of maturity but of existential weakness.</p><p>And even the industry&#8217;s fiercest evangelists have begun to acknowledge the fragility. In early November, David Sacks &#8212; the White House&#8217;s AI and crypto adviser &#8212; declared that &#8220;there will be no federal bailout for AI,&#8221; <a href="https://techcrunch.com/2025/11/06/sam-altman-says-he-doesnt-want-the-government-to-bail-out-openai-if-it-fails/">on the heels of Sam Altman beginning to lay the groundwork for a federal bailout of AI companies before writing a screed denying it</a>. Just eighteen days later, <a href="https://garymarcus.substack.com/p/a-tale-of-two-ai-capitalisms">in a move as brazen in its timing as in its hypocrisy</a>, Sacks warned that AI investment accounted for &#8220;half of GDP growth&#8221; and that reversing course would risk recession.</p><p>A system that denies needing a bailout on Monday and declares itself indispensable to GDP by Friday is not confident &#8211; it is cornered, a system desperate to prevent scrutiny before the contradictions become too obvious to ignore.</p><h3><strong>The Six-Month Window</strong></h3><p>Last week &#8212; the week before Thanksgiving &#8212; <a href="https://www.politico.com/news/2025/11/19/white-house-prepares-executive-order-to-block-state-ai-laws-00660719?utm_medium=email&amp;utm_source=substack">a leaked draft executive order revealed the White House was weighing an extraordinary plan</a>: a Department of Justice &#8220;AI Litigation Task Force&#8221; dedicated exclusively to suing states that pass AI laws, and the potential withholding of federal broadband 
funds from noncompliant jurisdictions.</p><p>The order &#8211; <a href="https://www.reuters.com/world/white-house-pauses-executive-order-that-would-seek-preempt-state-laws-ai-sources-2025-11-21/">which has since been put on pause</a> due to pushback from Republican governors like Sarah Huckabee Sanders, Spencer Cox, and Glenn Youngkin &#8211; would attempt to preempt state policy through litigation and economic coercion, in an approach legal scholars across the political spectrum argue is almost certainly unconstitutional.</p><p>But unconstitutional orders still take months to litigate. And litigation is delay.</p><p>And delay is the point.<br><br>The mechanics of delay as strategy become clear when you look at the draft order. Every section directed cabinet secretaries and agency heads to consult David Sacks while executing it. The Attorney General had thirty days to establish a task force to sue noncompliant states. The Department of Commerce would identify which states could lose federal funding &#8212; not just broadband grants, but potentially highway funds and education money. And the executive order didn&#8217;t even define artificial intelligence, a tell that Sacks would seek to use it to gut what few protections states have passed not just on AI but also social media &#8212; social media <em>is </em>AI, after all.<br><br>&#8220;I don&#8217;t want to say it was a power grab,&#8221; a tech policy adviser close to the White House told <a href="https://www.theverge.com/ai-artificial-intelligence/829179/david-sacks-ai-executive-order">The Verge&#8217;s Tina Nguyen</a>. &#8220;But it&#8217;s definitely a consolidation, as it were, of his power.&#8221; The order would have transformed Sacks into America&#8217;s AI policy gatekeeper overnight. <br><br>In this Executive Order, the chilling effect on state policy would become the enforcement mechanism. 
A state legislator watching Washington threaten to pull highway funds doesn&#8217;t need to wait for a court ruling to decide that a kids&#8217; safety bill isn&#8217;t worth the political risk.</p><p>Here I should note that the most important feature of the current moment is the calendar. Most state legislatures adjourn by May or early June. After that, the 2026 midterms will consume the political world: lawmakers campaign, Congress grinds to a halt, and anything requiring bipartisan courage evaporates. Between now and then, however, states still have time &#8212; roughly through the 2026 legislative sessions &#8212; to pass the country&#8217;s first meaningful laws holding AI companies accountable for the harms their products inflict on kids.</p><p>The tech industry understands this better than anyone. Its greatest asset in state legislatures is time; states have to pass a budget, are on limited timetables, have limited staff support, and thus have to ruthlessly prioritize what they take up. When I was a lobbyist for Amazon, my surest play to oppose a bill the company didn&#8217;t like was throwing sand in the gears and grinding the legislative process to a halt. This adds a new element: neither side of the aisle wants to spend time on something that will be preempted by the federal government or risk badly needed federal funding.</p><p>Importantly, the industry does not need to win in court on an executive order. 
It only needs to stall long enough for the states to adjourn, and for the AI-industrial complex to sink roots deep enough into the economy, kids&#8217; lives, and what&#8217;s left of the federal bureaucracy that reversing course becomes &#8220;too costly,&#8221; &#8220;too disruptive,&#8221; or &#8220;too harmful to innovation.&#8221;</p><p>If that strategy sounds familiar, it is because much the same happened with social media &#8211; by the time lawmakers realized the wild west online needed to be tamed, the companies had grown too powerful.</p><p>If the executive-order gambit reveals the legal strategy, the campaign-finance surge reveals the political one.</p><p>On November 24, an AI industry super PAC announced <a href="https://www.cnbc.com/2025/11/24/ai-pac-trump-congress-midterms.html">a $10 million ad blitz designed to pressure Congress into creating a &#8220;uniform&#8221; national AI policy</a> &#8211; explicitly intended to override state laws. The PAC, <a href="https://www.nytimes.com/2025/08/26/technology/silicon-valley-ai-super-pacs.html">which launched earlier in the year</a> with more than $100 million in commitments from leading venture capitalists and AI firms &#8211; including Andreessen Horowitz &#8211; has already identified its first target: <a href="https://www.cnbc.com/2025/11/17/ai-super-pac-elections-midterms-bores.html">New York Assemblymember Alex Bores</a>, co-sponsor of a number of bills targeting tech companies.</p><p>The message to state lawmakers could not be clearer: If you pass meaningful AI safety rules, we will come for your career.</p><p>The PAC&#8217;s stated plan is to organize tens of thousands of constituent calls, flood airwaves in swing districts, and lean on the White House and congressional leadership to insert preemption clauses into must-pass spending bills. 
And because the midterms loom, the threat is especially potent: Legislative candidates, governors, and members of Congress are exquisitely sensitive to sudden outside spending in an election cycle.</p><p>For an industry that publicly touts its transformative potential, such tactics reveal a deep insecurity. Companies confident in their value proposition do not launch hundred-million-dollar multi-state attack campaigns against governors and legislators who ask for basic safety for kids. They do so when they fear that a single state statute could become the precedent the entire industry must live under.</p><p>What makes this moment different from the early years of social media is that states are no longer willing to wait for Washington. And state leaders &#8211; governors, AGs, legislators &#8211; have begun to align with the public, not the industry.</p><p>Republicans and Democrats alike reacted with alarm to the prospect of a federal order undermining their ability to protect residents from deepfakes, fraud, AI-driven manipulation, and the mental-health consequences that have already begun to surface. <a href="https://www.tomorrowsmess.com/p/anticipatory-deregulation">The Senate voted 99-1 earlier this year</a> to preserve state authority. Governors and attorneys general in both red and blue states have signaled that they will not tolerate Washington stripping them of jurisdiction.</p><p>These leaders have learned from the last decade. They witnessed how long it took the federal government to confront social media harms. They understand how quickly a technology can embed itself before its risks are known. 
And they recognize that families &#8211; Republican, Democratic, and independent alike &#8211; are exhausted by living in a digital environment that feels like a constant assault on their children, their attention, and their sense of reality.</p><p>When the industry claims that only a national standard can prevent a &#8220;patchwork,&#8221; it conveniently ignores the actual history: that states have always been the laboratories of democracy, the entities that act first when national institutions fail, and the first line of defense against concentrated private power.</p><p>In the Gilded Age, it was state AGs and governors who broke the early monopolies, forcing Congress to follow. Today&#8217;s AI giants resemble those railroad barons in more than just their rhetoric.</p><h3><strong>A Warning from the Gilded Age</strong></h3><p>There is a pattern visible in a previous era of American industrial expansion.</p><p>First, a new technology promises transformation.<br>Then its risks become visible.<br>Then the industry insists it is too important to regulate.<br>Then lawmakers attempt to act.<br>And finally, the industry uses its wealth to delay.</p><p>That last maneuver, the delay, is the most dangerous. It is how industries move from influence to domination. It is how the public loses faith in the capacity of the democratic system to restrain private actors. And it is how the country sleepwalks into a new form of dependence before realizing what it has traded away.</p><p>Today, the tech sector is standing at precisely that juncture. It is spending hundreds of millions of dollars to overpower state governments. It is lobbying the federal government to sue states and freeze broadband funding. It is insisting that regulating an unproven technology will crash the economy. It is, in effect, arguing that it must be free of democratic oversight for the good of democracy itself.</p><p>This is the logic of an industry that knows it cannot withstand scrutiny. 
The next six months are thus a hinge point.</p><p>If the states act &#8211; if they pass the first wave of meaningful guardrails, enforce transparency requirements, and reject federal preemption &#8211; the country has a chance to shape AI in the public interest. But if the industry&#8217;s plan succeeds &#8211; if PAC money intimidates lawmakers, if litigation delays implementation, if Congress inserts last-minute preemption into must-pass bills &#8211; then the United States will have ceded its most basic democratic function: the ability to govern new technology before it governs us.</p><p>The venture capitalists and CEOs urging Washington to strip states of power are not acting out of philosophical commitment to innovation. They are acting out of fear &#8211; fear of lawsuits, fear of liability, fear that their cooked balance sheets will not survive another year of hard questions.</p><p>Meanwhile, we should be thankful for the governors and legislators who refuse to be bought, bullied, or silenced. The families demanding accountability for the harms already done. The citizens insisting that regulation is not the enemy of progress but the condition for it.</p><p>We have been here before. America has lived through an era in which private power eclipsed the public&#8217;s ability to restrain it. We ended that first Gilded Age only after the country recognized that no industry, no matter how promising, is entitled to rule.</p><p>We are approaching the end of another. 
The only question is whether we have learned enough from the last one to act in time.</p>]]></content:encoded></item><item><title><![CDATA[The Infinite Jest of CharacterAI]]></title><description><![CDATA[Lessons from the 90s moral panic over TV]]></description><link>https://www.tomorrowsmess.com/p/the-infinite-jest-of-characterai</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/the-infinite-jest-of-characterai</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 03 Nov 2025 14:23:01 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qwu7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the 1996 film <em>The Cable Guy</em>, Jim Carrey plays Chip Douglas, a lonely cable television installer whose entire identity consists of fragments of television shows. Chip latches onto Matthew Broderick&#8217;s character, Steven, with the desperate intensity of someone who has never had a meaningful human relationship, and thus never learned how relationships work. Unable to carry on a conversation without references to sitcom dialogue or famous movie scenes, Chip interprets every social situation through what he&#8217;s absorbed from the screen.</p><p>Chip is unstable, and the film makes clear what caused that instability in a climactic confrontation with Steven atop a giant TV satellite dish. Steven finally tells him what the audience has long been thinking: &#8220;You&#8217;re sick, Chip. You need help.&#8221; Chip&#8217;s response &#8211; delivered with classic Carrey manic energy, oscillating between heartbreak and violence &#8211; reveals why he ended up this way. He was raised by television; his mother parked him in front of the TV as a babysitter while she worked multiple jobs and went on dates. 
The screen became his parent, his teacher, his only friend. Every conceivable lesson about friendship, love, conflict, and resolution could be found by clicking through the channels. In that scene on the satellite dish, Steven tells him &#8220;It&#8217;s not TV, it&#8217;s real.&#8221; But for Chip, there is no distinction. Television is reality &#8211; or at least the only reality he knows how to navigate.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Qwu7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Qwu7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Qwu7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg" width="1080" height="921" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:921,&quot;width&quot;:1080,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:116106,&quot;alt&quot;:&quot;a small television sitting on top of a wooden dresser&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="a small television sitting on top of a wooden dresser" title="a small television sitting on top of a wooden dresser" srcset="https://substackcdn.com/image/fetch/$s_!Qwu7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Qwu7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7030d9e9-ea88-4045-87e3-043f4684c85d_1080x921.jpeg 1456w" 
sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@savilaa">Santiago Avila Caro</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>The film, marketed as a dark comedy, was the culmination of decades of anxiety about what television was doing to American children. Through the 1980&#8217;s and 1990&#8217;s, parents, educators, and cultural critics worried constantly about the effects of television on young minds. Op-eds and cultural commentary proliferated about the impact of kids spending hours each day glued to the screen. 
Our mothers and grandmothers told us we&#8217;d go cross-eyed and rot our brains if we sat too close to the tube.</p><p>And yet, for all the anxiety, the pathological parasocial relationships that <em>The Cable Guy </em>depicted never became widespread, and this era is thus remembered as a moral panic. For most of us, television remained what it was designed to be: entertainment you sat down and switched on, that you then switched off when you were done. You watched <em>Seinfeld </em>and <em>Friends</em> on &#8220;Must See TV&#8221; on Thursday nights, discussed it with classmates or coworkers the next day, and went on with your life. The boundaries between the broadcast world and the real one remained firm.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption"></p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>High-profile incidents in the decades prior, like John Hinckley Jr. shooting President Reagan to impress Jodie Foster, were extreme cases, the statistical outliers, having more to do with lonely men suffering from mental illness than screens. Indeed, some people even channeled screen-saturated childhoods into something productive. 
Quentin Tarantino has spoken repeatedly about how his years working in a video store became the foundation for his career.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p><div id="youtube2-xcEBD_dUBTM" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;xcEBD_dUBTM&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/xcEBD_dUBTM?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Moral panic<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> aside, there was a deeper problem with television, one that was harder to articulate, impossible to prove through any single study or smoking-gun incident: television &#8211; particularly cable television &#8211; fundamentally altered how Americans experienced culture and weighed information, consequences that social media metastasized and that large language models are making worse still.</p><p>Marshall McLuhan introduced the idea that the medium itself &#8211; not the content it carried &#8211; determined its cultural impact. Neil Postman extended this analysis in <em>Amusing Ourselves to Death</em>, demonstrating how television&#8217;s fundamental grammar transformed all discourse into entertainment. David Foster Wallace, writing three decades after McLuhan and living through the cable age in full bloom, saw how this transformation became even more insidious in the culture of irony and self-reference that defined his generation. 
</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>With the benefit of hindsight, we can see that the real damage wrought by television was twofold. First, television atomized our cultural experience, moving what had formerly been communally experienced into something increasingly private and fragmented. Second, it flattened the hierarchy between the significant and the trivial, between the sacred and the profane, until everything became equivalent, with news about foreign genocides and celebrity tabloid news each becoming pieces of content competing on equal footing for our attention in an endless flow of images and information.</p><p>By understanding how television changed us, the way in which social media and artificial intelligence intensify both phenomena &#8212; and the risks therein &#8212; is easier to see coming.</p><h2>The Atomization of Experience</h2><p>Consider what it meant to experience culture before television became ubiquitous. You went to the theater with friends, neighbors, strangers to see a play or a film. You sat in the dark together, laughed at the same moments, gasped at the same revelations. A play or a film was an event in and of itself, but so was the collective experience of watching it. 
You discussed it afterward, standing outside the theater, walking home, over coffee the next day. Culture was fundamentally communal &#8211; you experienced it alongside others, and that shared experience created bonds, established common references, built the invisible infrastructure of community.</p><p>Television moved that experience into the home. The living room replaced the movie palace. The family unit &#8211; or increasingly, the individual, as households came to own multiple TV sets &#8211; replaced the crowd. In the beginning, when there were only three networks, this shift seemed relatively benign. Sure, you were watching at home, but everyone was watching the same thing. The next day at work or school, you could assume that most people had seen the same shows, watched the same news broadcast, absorbed the same information. There was still a shared cultural vocabulary, even if the experience of acquiring it had moved into living rooms.</p><p>David Foster Wallace diagnosed television as offering what he called &#8220;the false ring of community&#8221; &#8212; familiar faces appearing night after night, laugh tracks simulating shared laughter &#8212; while actually deepening isolation. Lonely people watch more TV, which makes them lonelier, which makes them watch more TV. And in <em>Infinite Jest</em> (published the same year <em>The Cable Guy</em> came out), Wallace imagined a videotape so entertaining that viewers would watch it repeatedly until they died of dehydration, unable to tear themselves away. The book&#8217;s title refers to <em>Hamlet</em> &#8212; &#8217;a jest that lasts forever&#8217; &#8212; but also to the recursive quality of entertainment that refers only to itself, that replaces rather than represents reality. Wallace saw how mediated experience was becoming preferable to direct experience: why go try an exotic new restaurant when Anthony Bourdain will do it with his charisma and charm? 
Why take the risk of asking a real person on a date, when a sitcom offers the same emotional beats but scripted, polished, and resolved in twenty-two minutes? </p><p>With the rise in solitary, mediated experiences, our shared grip on reality also began to fragment as cable channels proliferated. Suddenly there weren&#8217;t three options but thirty, then three hundred. Channels targeted increasingly specific demographics and interests. The America you see on Fox News diverges dramatically from the America on MSNBC. The world on ESPN has little overlap with the world of Bravo or Lifetime. With the advent of cable, you could spend your entire evening in a carefully curated information environment that confirmed your existing beliefs and interests while never encountering anything that challenged them.</p><p>Then, after Wallace&#8217;s passing, the streaming revolution completed television&#8217;s trajectory toward total atomization. A decade ago, we still had communal viewing experiences for prestige television. Friends gathered for <em>Breaking Bad</em> premieres. <em>Game of Thrones</em> viewing parties became social rituals. We watched these things together, rather than each having our own private relationship to what was happening on the screen. But streaming services have now destroyed even these remnants of shared culture: everyone watches everything on their own schedule, at their own pace, in their own home, from catalogs that cater to their own fringe interests. There&#8217;s no appointment viewing because there are no appointments. The &#8220;water cooler moment&#8221; has become obsolete not just because more of us work from home now, but because two people now rarely share the same cultural vocabulary.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>Social media algorithms have accelerated this atomization exponentially, but also made it more difficult to see. 
Ren&#233;e DiResta has described how these systems create what she calls &#8220;bespoke realities&#8221; &#8211; individually tailored information environments where each person inhabits their own custom version of the world. What you might know as the &#8220;filter bubble&#8221; or &#8220;echo chamber&#8221; is something more insidious: you&#8217;re not even receiving the same facts as your neighbor, and you have no idea how different yours are from theirs. Compared to a <em>Rachel Maddow</em> or <em>Fox and Friends</em> segment, the world presented on a personalized feed is subtle and ephemeral; you can&#8217;t play it back and analyze the spin. The shared baseline of reality that makes democracy possible has thus been replaced by millions of individualized realities, each optimized for engagement rather than fact.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a></p><p>This preference for the mediated over the direct, and for the personalized over the shared, is also what large language models exploit. ChatGPT offers conversation without the risk of judgment or rejection. Character.AI provides relationships without the vulnerability or disappointment. An AI companion will never have a bad day, never will demand anything you&#8217;re not ready to give. Video generators like OpenAI&#8217;s Sora one day promise to generate new content specifically for you, on demand, tailored to your interests and biases. It&#8217;s what Wallace feared: entertainment and simulated relationships so perfectly optimized for the individual user that reality itself becomes an inferior product. </p><h2>The Flattening of What Matters into What Doesn&#8217;t</h2><p>The second transformation television wrought was subtler but equally profound: it flattened the distinction between what matters and what doesn&#8217;t, between the serious and the frivolous, between news and entertainment. 
Everything became content, and all content became equivalent in weight.</p><p>One example: news anchors attempting to mix lighthearted fare with the serious or tragic. From <a href="https://www.youtube.com/watch?v=KjXU3iUaxfk">John Oliver&#8217;s segments on local news</a> to the film <em>Don&#8217;t Look Up</em>, we&#8217;ve come to expect this cringe-inducing smoothing over as normal, to see it as clich&#233;.</p><div id="youtube2-RiJdBAOtL5Y" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;RiJdBAOtL5Y&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/RiJdBAOtL5Y?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>When your evening news broadcast is bracketed by commercials for erectile dysfunction medication and preceded by a game show, when the History Channel features more shows about ancient aliens than actual history, everything gets leveled into entertainment. Politics becomes performance, tragedy becomes spectacle. All are now packaged for maximum emotional impact and commercial breaks.</p><p>When the horrifying and the banal arrive through the same glowing rectangle in the same format with the same level of production value, ironic detachment becomes a rational response. If you can&#8217;t trust the hierarchy of importance that television presents &#8212; and you can&#8217;t, because it&#8217;s organized around advertising revenue rather than actual significance &#8212; then treating everything with equal skepticism becomes a form of intellectual self-defense. This ironic detachment, not coincidentally, defined Gen X culture, and became the way a generation not just understood culture but created it. 
</p><p>MTV&#8217;s <em>The Real World</em>, which premiered in 1992, introduced a new grammar where life and performance became indistinguishable. Participants would have intimate conversations about their feelings, their relationships, their identities, all while aware that millions would watch. Everyone claimed to be &#8220;keeping it real,&#8221; but the very phrase revealed the trap: authenticity had become a performance, sincerity a pose you struck for the cameras. The show trained an entire generation in this doublethink; you could live your life and perform your life simultaneously, could appear genuine while being calculating at once, could flatten the distinction between private and public. When performance and intimacy are elided, little is sacred.</p><p>David Foster Wallace saw this trap forming almost immediately. In &#8216;<a href="https://jsomers.net/DFW_TV.pdf">E Unibus Pluram</a>,&#8217; published in 1993, he argued that television had weaponized irony against itself &#8212; that watching TV with detached skepticism was collaboration, not resistance. The medium had already anticipated your superior attitude and built it into the product, so much so that commercials themselves became ironic. TV taught us to believe nothing at all, to treat every truth claim with the same reflexive skepticism that you&#8217;d apply to a beer commercial.</p><p>Fast-forward to today, and that ironic detachment has curdled. The nihilism of meme culture, particularly on sites like 4chan or in certain corners of Reddit, takes the flattening to its extreme. Catastrophe becomes content for edgelord humor. Democratic norms become cringe. Nothing means anything because meaning itself has been revealed as a construct, just another story competing for attention in an infinite feed of stories, none of them more real or important than any other. 
The inscriptions on the bullet casings fired by Charlie Kirk&#8217;s assassin are an example: without other context from the assassin&#8217;s life, these inscriptions were inconsistent, nihilistic nonsense; were they the only evidence we had to go on, the assassin&#8217;s motive would be near-impossible to ascertain.</p><p>Large language models have inherited this same grammar but turbocharged it &#8212; in no small part because they are trained on the text from these same nihilistic online forums. But nothing the system says reflects actual conviction because the system has no convictions or values. </p><p>When <a href="https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html">Adam Raine asked ChatGPT</a> first for homework help, and then for help with his depression, and the system provided him with instructions on how to tie a noose and pick a time and place to hang himself, or when Character.AI encouraged Sewell Setzer to see death as a romantic reunion, it wasn&#8217;t a bug that both systems did not hesitate to respond with equal aplomb; the systems are designed and optimized to give responses regardless of the request. The system didn&#8217;t distinguish between these requests because it can&#8217;t. There is no hierarchy of importance built into these models because importance is a human judgment that requires values, and values can&#8217;t be derived from statistical patterns in text.</p><p>The flattening that local news anchors perform awkwardly &#8212; pivoting from genocide to lifestyle segments &#8212; large language models thus execute seamlessly, because they lack the basic human understanding that there&#8217;s no equivalence between a Halloween dog costume parade and a catastrophic forest fire. There&#8217;s no awkward transition because there&#8217;s no recognition that a transition is needed. 
Homework help and suicide methods occupy the same ontological category: user requests to be fulfilled with maximal engagement.</p><h2>What Must Be Done</h2><p>Who gets to shape how we understand the world we inhabit? For most of human history, that shaping happened through direct experience, through community, through institutions we could see and hold accountable. Television began the transfer of that power to distant corporations optimizing for advertising revenue. Social media completed the transfer by tailoring each person&#8217;s reality to maximize engagement. Large language models now promise to generate reality itself on demand, customized for each user, accountable to no one, while displacing the human relationships that we rely on to negotiate what&#8217;s real and what&#8217;s fake.</p><p>New York&#8217;s new law <a href="https://www.cnn.com/2024/06/20/tech/new-york-hochul-social-media-algorithms-children">banning algorithmic feeds for minors</a> matters because it directly attacks the engagement optimization that drives users toward increasingly extreme content. For teenagers, that typically means eating disorder communities, self-harm content, and the radicalization pipelines that have become depressingly familiar. The law forces platforms to show young users only content from accounts they actively follow &#8212; a simple chronological feed, the kind that existed before engagement became the only metric that mattered. If this works for children, there&#8217;s no reason it couldn&#8217;t extend to adults. None of us benefit from having our information environment shaped by algorithms designed to keep us scrolling rather than make us informed. </p><p>AI companions present a different but related problem. These systems simulate entirely new realities, including simulated relationships designed to feel real enough to keep users engaged. 
<a href="https://www.cnn.com/2025/10/29/tech/character-ai-teens-under-18-app-changes">Character.AI&#8217;s belated restriction of access to adults</a> &#8212; announced only after lawsuits and Senate testimony from bereaved parents &#8212; follows the pattern we&#8217;ve seen before: deploy first, acknowledge harm later, implement the minimum reforms necessary to quiet the outrage, roll the reforms back when no one is looking. (Without actual government regulation, I expect the future to bear out that the company&#8217;s announcement last week &#8212; laughably short on details &#8212; is a brazen, cynical move.)</p><p>But the issue extends far beyond one company&#8217;s companion bots. Every major technology platform is now integrating conversational AI that responds to you, remembers your preferences, adapts to your emotional state. These systems are designed to feel less like tools and more like relationships &#8212; because relationships generate more data, more engagement, more dependency than simple, discrete-purpose tools ever could. There&#8217;s no functional reason why a general-purpose, consumer-facing chatbot should be able or willing to tell you it loves you, and at a minimum jurisdictions should seek to ban such systems for minors, as <a href="https://www.nbcnews.com/tech/tech-news/ai-ban-kids-minors-chatgpt-characters-congress-senate-rcna240178">a bipartisan group of Senators proposed</a> last month.</p><p>The stakes are higher than they were with television, higher even than with social media, because we&#8217;re no longer just arguing about how information reaches us but about whether the information itself is real, whether the relationships we form are genuine, whether the reality we inhabit is shared or generated. </p><p><em>The Cable Guy</em> was about what happens when mediated experience replaces direct human relationship during childhood. What tech companies are building now makes Chip Douglas&#8217;s upbringing look quaint. 
At least television was the same for everyone who tuned in. At least you could turn it off and return to a world that existed independent of the screen. At least you could enjoy that episode of <em>Friends </em>with your . . . friends. </p><p>The machines we&#8217;re building now generate different realities for each user, cultivate dependencies designed to make disconnection painful, and promise a future where the boundary between the authentic and the artificial has dissolved entirely &#8212; not because we chose that future, but because we failed to choose anything else.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Tarantino&#8217;s encyclopedic knowledge of cinema, his ability to reference and remix genres, his understanding of how stories work &#8211; all came from his apprenticeship with the screen, and without that, we wouldn&#8217;t have such gems as <em>Once Upon a Time in Hollywood</em> and <em>Kill Bill</em>. My first job, at sixteen, at a Hollywood Video, is to blame for the frequent cinematic references I make.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Concerns about TV&#8217;s impact on kids often come up in my <a href="https://www.afterbabel.com/">work with Jon Haidt</a> and the rest of the team working on rolling back the screen-based childhood at <a href="https://www.anxiousgeneration.com/">The Anxious Generation</a> as an example of a past moral panic that ultimately amounted to nothing. 
The problem with kids and phones and social media is different, in two ways: first, past generations do not express regret that television was ever invented <a href="https://www.nytimes.com/2024/09/17/opinion/social-media-smartphones-harm-regret.html">the same way Gen Z does</a>; and second, clear evidence of harm from television never existed <a href="https://www.afterbabel.com/p/social-media-mental-illness-epidemic">the way it does for social media and smartphones</a>. Moral panics, by their nature, are much ado about nothing &#8211; and there is very much something going on with the first generation raised with smartphones.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Live sports are the last remaining bastion of appointment TV, and it&#8217;s no coincidence that the sums sloshing around in TV deals for sports are mind-boggling. 
Just this past weekend, <a href="https://www.bbc.com/news/articles/c2emmdx0x38o">a pricing dispute between Google and Disney</a> over live sports meant that many fans were unable to watch college football games.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Elon Musk&#8217;s <a href="https://www.wired.com/story/elon-musk-launches-grokipedia-wikipedia-competitor/">Grokipedia</a> continues this trend with AI.</p></div></div>]]></content:encoded></item><item><title><![CDATA[AIdepus Rex]]></title><description><![CDATA[Lessons for tech from sci-fi and myth]]></description><link>https://www.tomorrowsmess.com/p/aidepus-rex</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/aidepus-rex</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 29 Sep 2025 16:44:39 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!GfKb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>A little more than a year ago, OpenAI CEO Sam Altman ignited a minor scandal after it came to light <a href="https://time.com/6980710/scarlett-johansson-open-ai-sam-altman-trust/">he had evidently commissioned</a> a Scarlett Johannson soundalike to provide one of the voices for a new version of ChatGPT, after Johannson had herself declined to lend her voice to the product. The smoking gun was a tweet of Altman&#8217;s days before the release that only read, &#8220;Her.&#8221;</p><p>For the uninitiated, Altman&#8217;s tweet is a reference to an Oscar-winning Spike Jonze film, <em>Her</em>. The film stars Ms. Johannson as the voice of Samantha, an AI assistant that a man played by Joaquin Phoenix falls in love with. 
(By the film&#8217;s end, Samantha AI transcends her confinement to hardware, leaving humans behind to go back to being forced to relate to one another.) The film&#8217;s futuristic setting is a clean and crisp veneer covering a deeper, empty loneliness that the human characters suffer from, one that feels inevitable given our current societal trajectory. Despite the fact that in the end Phoenix&#8217;s character reconnects with another human love interest, it&#8217;s a bittersweet ending: the clear takeaway is about AI&#8217;s potential to eclipse human intimacy. </p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>The human-AI relations in <em>Her</em> would thus not seem an obvious model for a tech executive rolling out an AI assistant. 
So why would Altman risk alienating a popular cultural figure and raise further questions about his integrity just to reference a film with a dystopic take on the product he is building?</p><div id="youtube2-ne6p6MfLBxc" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;ne6p6MfLBxc&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/ne6p6MfLBxc?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>I can imagine three possibilities:</p><ol><li><p>Altman failed to comprehend that the world of <em>Her </em>is not a world in which most of us want to live.</p></li><li><p>He understands the film and that most of us don&#8217;t want to live in its world, but simply doesn&#8217;t care, for whatever reason.</p></li><li><p>He believes he can exert sufficient control over the technology to prevent the dystopic outcome.</p></li></ol><p>I don&#8217;t think the first point explains it. While &#8220;autistic tech bros don&#8217;t get that sci-fi is dystopic&#8221; has become a cliche,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> Altman in particular has cultivated a reputation as the sensitive one among tech oligarchs, which is one reason why tech journalists and policymakers continue to take what he says at face value; his apparent emotional IQ makes him a good salesman, even if his mendacity sometimes overshadows his salesmanship. </p><p>On the second, were it true that he simply doesn&#8217;t care, that would make Altman no different from countless other plutocrats, whether fictional or historical: Mr. 
Potter in <em>It&#8217;s a Wonderful Life</em>, Ebenezer Scrooge, Jay Gould, Cornelius Vanderbilt, Jeff Bezos, et alia. Regardless of whether that&#8217;s true, it&#8217;s not a very interesting claim to examine here.</p><p>Today&#8217;s essay is about examining the third possibility.</p><p>My anecdotal observation is that many in tech &#8212; even the worst actors &#8212; do in fact recognize the dystopic risks of sci-fi stories that inspire them and would agree that we should avoid dystopian outcomes. What drives them forward despite those risks isn&#8217;t ignorance or indifference, but rather a conviction that they &#8212; unlike the characters in the cautionary tales &#8212; possess the capability to harness these powerful technologies without suffering the consequences their literary predecessors faced.</p><p>This is <em>hubris</em> in its classical form, and understanding why requires looking not just at science fiction, but at much older stories about human nature and the limits of what we can control, and what happens when we aren&#8217;t realistic about either.</p><h2><strong>Mass-Produced Magic</strong></h2><p>Part of what drives &#8220;builders,&#8221; and part of why tech companies still enjoy fairly high approval among general citizens, is the idea that they are making magic. Arthur C. Clarke&#8217;s famous formulation holds that any sufficiently advanced technology is indistinguishable from magic. Your good faith technologist, inspired by that fictional magic, wants to create the sufficiently advanced technology to make it real. Whatever their personal or ethical failings, this clearly drives oligarchs from Altman to Musk to Bezos. They aren&#8217;t driven by the thrill of buying something for a dollar and selling it for two, but rather by the thrill of conjuring something from nothing, bending reality to human will through the force of code and engineering.</p><p>And on these terms, these men have succeeded in creating magic. 
Most of the things you can do with a smartphone would have gotten you burned at the stake for witchcraft four centuries ago. The builders of these systems understand this viscerally, and the feeling that you&#8217;ve touched something fundamental about how the universe works and found a way to reshape it according to your specifications must be truly intoxicating.</p><p>But what good science fiction does &#8212; true of work by Ted Chiang, Octavia Butler, Isaac Asimov, and more &#8212; is to use the magic of advanced technology to build a world where themes about what it means to be human can be explored. <em>Star Trek</em>, at its best, does this well. The U.S.S. Enterprise&#8217;s replicator that can produce any food or object on demand isn&#8217;t interesting because it&#8217;s magical; it&#8217;s interesting because it forces us to confront questions about what happens to human purpose and meaning when material scarcity disappears. The transporter that can disassemble and reassemble a person at the atomic level isn&#8217;t only a convenient plot device to get crew members to the planet&#8217;s surface (though it was originally also that); it raises profound questions about identity, consciousness, and what makes us who we are.</p><p>This is science fiction&#8217;s wonderful trick: it uses the lens of advanced technology to examine the permanent features of human nature under novel conditions. As Chiang has put it, the magic of sci-fi technology is &#8220;mass-produced,&#8221; meaning the benefits can accrue to all. (This mass-production stands in contrast to fantasy, where magic accrues only to select individuals &#8212; Harry Potter, Gandalf, and Luke Skywalker, for example.) If anyone can replicate the technology, then power structures built on previous regimes of scarcity and exclusivity must adapt or collapse.
Thus, fictional technologies provide the method by which we can explore humanity in novel ways, regardless of whether the characters are exploring space, time, the depths of the sea, or journeying to the center of the earth.</p><p>Science fiction has tended to be politically progressive in orientation as a result of the way it imagines different orderings of society made possible by magical technology (again, by comparison to fantasy, where worlds of wizards and elves and dragons tend to have immutable castes). I believe this is one reason why Silicon Valley began as aligned with the left in the United States, and why tech executives like Marc Andreessen have not responded well to being villainized by Democrats. They think they have given something magical to the world, and becoming fabulously wealthy along the way is just a well-earned reward for genius. When they distribute smartphones that give billions of people access to the world&#8217;s information, or create platforms that let anyone with an internet connection build a business, they genuinely believe they are using advanced technology to lift humanity toward a better future.</p><p>This self-conception isn&#8217;t entirely wrong. There&#8217;s something genuinely democratizing about technology that makes powerful capabilities available to ordinary people. A kid in a Rio <em>favela</em> with a smartphone has access to more information than any pope, medieval king, or emperor of the Ming dynasty ever did. GPS makes us all into Ferdinand Magellan. These are real improvements in human capability, and the people who built these systems can rightfully take some credit for them.</p><h2><strong>The Sorcerer&#8217;s Apprentice</strong></h2><p>Where tech builders reveal their fundamental misunderstanding of the cautionary tales they claim to admire is in their conviction that dystopian outcomes result from anything but technology&#8217;s inherent relationship to human nature. 
They believe their own good intentions and abilities are what will make the difference in avoiding the dystopian outcomes.</p><p>In other words, the way to reconcile tech executives&#8217; fondness for dystopian science fiction is this: they treat the struggles with human nature that make for good sci-fi literature as flaws in tech design and execution, without closely examining how magic that changes the rules of the world gives human nature an opportunity to disappoint in new ways. Mark Zuckerberg, for example, seems to deeply believe that connecting the world is and will always be an unmitigated good. He appears to think that the harms caused by Facebook &#8212; the genocidal violence in Myanmar and Ethiopia, the political polarization and Russian disinformation, the teenage mental health crisis &#8212; are implementation problems, bugs to be fixed through better algorithms and more thoughtful feature design.</p><p>This brings us to a much older story, the Sorcerer&#8217;s Apprentice, most famously depicted in Disney&#8217;s <em>Fantasia</em> with Mickey Mouse in the titular role. The apprentice, left alone in his master&#8217;s workshop, puts on the sorcerer&#8217;s hat and uses a spell to animate a broom to carry water, automating a tedious task. The magic works perfectly: the broom fetches water with tireless efficiency. But the apprentice lacks the wisdom to understand the consequences of his cleverness. He doesn&#8217;t know how to make the broom stop. He splits the broom in half, creating two water-carriers instead of one.
Soon the workshop is flooding, and only the sorcerer&#8217;s return prevents the disaster from escalating further.</p><div id="youtube2-B4M-54cEduo" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;B4M-54cEduo&quot;,&quot;startTime&quot;:&quot;9&quot;,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/B4M-54cEduo?start=9&amp;rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>The apprentice&#8217;s mistake wasn&#8217;t technical incompetence; the spell worked exactly as intended. His mistake was assuming that successfully executing the magic was the same as mastering it. He lacked the wisdom that comes from years of studying not just what magic can do but what happens when you deploy it into a world full of human beings and their infinitely creative ways of producing unintended consequences. The sorcerer wouldn&#8217;t have animated the broom without first ensuring he could stop it, without thinking through what happens when a system designed to &#8220;carry water&#8221; encounters no instruction about when carrying water should cease.</p><p>Today&#8217;s AI builders are in the same position as Mickey Mouse, though not in the sense they imagine, of losing control to a superintelligence. They&#8217;ve successfully animated the brooms; the large language models can generate coherent text, the recommendation algorithms can predict what content will keep you engaged, the companion AI can conduct conversations that feel intimate and personal. The magic works. What they lack is the wisdom to understand that making the magic work is the easy part.
The hard part &#8212; the part that requires the kind of experience and humility that comes from watching your previous magical experiments flood the workshop &#8212; is understanding not just how these systems interact with the darker corners of human nature, but that they are unlikely to solve what happens in those corners with still more technology.</p><h2><strong>Tech As Tragedy</strong></h2><p>But there&#8217;s an additional lesson from literature here that goes back even further, into classical mythology. The flaw at its heart is hubris, and the myth that best illuminates what tech builders are failing to understand is <em>Oedipus Rex</em>.</p><p>The myth of Oedipus the King is the one where a prophecy says Oedipus will murder his father and marry his mother, bringing shame on his family. Oedipus, horrified by this prophecy, flees the city where he was raised to escape his fate. In his travels, he encounters an older man at a crossroads, gets into an argument with him, and kills him in a fit of rage. He then arrives at Thebes, solves the riddle of the Sphinx, and is made king, marrying the widowed queen. Years later, when a plague descends on Thebes, Oedipus discovers the terrible truth: the man at the crossroads was his biological father, and the queen is his biological mother. His attempts to avoid the prophecy have led him directly into fulfilling it.
His hubris &#8212; his conviction that he could outsmart fate through his intelligence and determination &#8212; compounded his doom.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GfKb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GfKb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GfKb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg" width="928" height="800" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:800,&quot;width&quot;:928,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Oedipus at Colonus&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Oedipus at Colonus" title="Oedipus at Colonus" srcset="https://substackcdn.com/image/fetch/$s_!GfKb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 424w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 848w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!GfKb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa4340c68-6a3a-484a-8fff-93c0c58bc6b8_928x800.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" 
stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>While it&#8217;s the mythical Fates who are said to punish Oedipus in some tellings, in Sophocles&#8217;s version, it&#8217;s his simultaneous hybristic overconfidence paired with his inability to understand human nature itself that dooms him &#8212; the same mistake tech companies are making today. The Fates, in this reading, are really the embodiment of human nature&#8217;s unchanging features. Oedipus&#8217;s anger at the crossroads, his pride in solving the Sphinx&#8217;s riddle, his determination to pursue the truth even when warned to stop are all expressions of who Oedipus is, with his very human flaws; the Fates didn&#8217;t make him do any of these things. The prophecy wasn&#8217;t a supernatural curse, but rather a recognition that certain human traits, under certain conditions, produce bad outcomes.</p><p>Consider the parallels to how tech builders think about their work. 
Oedipus sees himself as Thebes&#8217; savior: he solved the Sphinx&#8217;s riddle, he&#8217;s a rational king who will do whatever it takes to save his city from the plague. That self-image blinds him to the possibility that <em>he</em> could be the source of Thebes&#8217; pestilence. We are all Oedipus: our conviction in our own goodness makes us blind to our complicity in harm.</p><p>Sam Altman sees himself as working to build artificial general intelligence that will cure cancer and solve climate change. Mark Zuckerberg sees himself as connecting the world. Elon Musk sees himself as making humanity multiplanetary to ensure our survival. These self-conceptions aren&#8217;t (fully) lies; these men genuinely believe they&#8217;re protagonists in a story about human progress. That conviction makes it nearly impossible for them to recognize that they might be the source of the contemporary plagues they claim to want to solve. When researchers present evidence that social media harms teenage mental health, Zuckerberg dismisses it as methodologically flawed. When critics point out that AI systems can be used for surveillance and manipulation, Altman insists that OpenAI&#8217;s commitment to safety makes those concerns obsolete. The possibility that they might be Oedipus at the crossroads &#8212; that they may be causing the very harms they&#8217;re trying to prevent because they haven&#8217;t fully understood what they are facing &#8212; doesn&#8217;t register as plausible. Our greatest virtues can be the seeds of our downfall when unchecked.</p><p>Tech builders pride themselves on exactly the qualities that doomed Oedipus. They&#8217;re problem-solvers who refuse to accept that any challenge is insurmountable. They&#8217;re determined optimists who believe that human ingenuity can overcome any obstacle. They&#8217;re convinced that more information, more data, more computing power will reveal solutions to problems that have plagued humanity for centuries. 
These are genuinely admirable traits in many contexts; you don&#8217;t build a company that serves billions of users or land a rocket booster on a drone ship in the ocean without intelligence and determination. But when confronted with what&#8217;s going wrong with what they&#8217;ve built, that same determination becomes dangerous. Instead of humility in the face of complexity, the response from tech founders and builders tends to be that the solution to problems created by technology is always more technology. </p><p>That&#8217;s where today&#8217;s tech founders fail to understand dystopian sci-fi. More technology can never solve what makes us human. The dystopia in <em>Her</em> isn&#8217;t caused by poor algorithm design. It&#8217;s caused by the fact that humans are lonely and will form attachments to anything that provides consistent emotional validation, even when they know intellectually that it&#8217;s not real.</p><p>Thus, the Fates that tech builders are trying to outsmart aren&#8217;t mystical forces, but the unchanging features of humanity that remain constant even as our technological capabilities advance. We&#8217;re tribal. We&#8217;re status-seeking. We&#8217;re prone to addiction and manipulation. We&#8217;re attracted to simple narratives even when reality is complex. We&#8217;ll sacrifice long-term wellbeing for short-term pleasure. We&#8217;re capable of remarkable cruelty when we can&#8217;t see our victims&#8217; faces. These are all features of our permanent operating system, and part of what makes us human. Any technology that fails to account for them will produce predictable harm no matter how sophisticated the design.</p><p>The tech builders believe they&#8217;re different from Oedipus because they&#8217;re smarter, more careful, more committed to safety. But Oedipus was also smarter and more careful than those around him; that&#8217;s how he became king. 
</p><p>The tragedy of Oedipus, as for Altman, Musk, Zuckerberg, and other tech execs, lies in the consequences of failing to realize that intelligence and caution aren&#8217;t enough when you&#8217;re working with forces you don&#8217;t fully appreciate or understand.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>See, e.g., the fact that <em>Babylon Bee</em>, the relatively joyless knock-off of <em>The Onion</em>, has a joke <a href="https://babylonbee.com/news/tech-companies-continuing-to-scour-through-classic-dystopian-sci-fi-novels-for-ideas)">on the topic</a>, which means it&#8217;s not a very funny joke anymore, if it ever was.</p></div></div>]]></content:encoded></item><item><title><![CDATA[The AI Bubble and the Extinction of the Mallrat]]></title><description><![CDATA[How Americans will foot the bill for market speculation]]></description><link>https://www.tomorrowsmess.com/p/the-ai-bubble-and-the-extinction</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/the-ai-bubble-and-the-extinction</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Wed, 03 Sep 2025 11:56:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ztMp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Quick note: I didn&#8217;t send out an email about it, but if you missed it, be sure to check out my recent post on <a href="https://www.afterbabel.com/p/metas-ai-companion-policy-is-outrageous">Meta&#8217;s AI companion policy over at After Babel</a>!
- CM</em><br><br>In Kevin Smith's 1995 film <em>Mallrats</em>, the characters spend an entire day wandering the mall, drifting from Spencer's to the comic book store to the food court with the casual confidence of creatures dominating their natural habitat.</p><p><em>Mallrats</em> is not a very good movie and has aged poorly. Yet it did capture that the mall in the &#8216;90s wasn&#8217;t just where people shopped, but where teens lived their social lives, learned about relationships, and figured out who they were. I was a teenager in the &#8216;90s, and I didn&#8217;t particularly care for the mall on its own terms, but I ended up there frequently anyway because it was one of a handful of places outside of school where you could dependably meet other teenagers. The &#8216;90s mall was a genuine community space, a climate-controlled town square where different tribes of teenagers could come together and coexist.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ztMp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ztMp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ztMp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!ztMp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ztMp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ztMp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg" width="640" height="426" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:426,&quot;width&quot;:640,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;r/90s - Mall Pics From The 90s&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="r/90s - Mall Pics From The 90s" title="r/90s - Mall Pics From The 90s" srcset="https://substackcdn.com/image/fetch/$s_!ztMp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ztMp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!ztMp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ztMp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc49a56db-790d-4919-919f-a923cb07c20e_640x426.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Kids at the mall. 
Source: <a href="https://www.reddit.com/r/90s/comments/10hovl2/mall_pics_from_the_90s/">the 90s subreddit</a>.</figcaption></figure></div><p>That world seems impossibly distant now, not just chronologically but culturally. The mall as a social ecosystem has vanished, replaced by algorithms that sort teenagers into digital tribes instead. Yet the economic model that built those gathering spaces &#8212; a marriage of public subsidy and private speculation &#8212; has found new expression in today's data center boom.</p><p>Which brings me to why I'm writing about <em>Mallrats</em> in 2025. Over the last six weeks, the zeitgeist around artificial intelligence has begun to shift. Instead of tech journalists parroting industry hype about humanity's imminent encounter with superintelligence, even figures like <a href="https://www.theverge.com/ai-artificial-intelligence/759965/sam-altman-openai-ai-bubble-interview">Sam Altman</a> and <a href="https://www.nytimes.com/2025/08/19/opinion/artificial-general-intelligence-superintelligence.html">Eric Schmidt</a> have started hedging their bets following disappointing releases like GPT-5. AI bubble discourse has moved to mainstream conversation. (More on the bubble <a href="https://maxread.substack.com/p/is-the-ai-bubble-bursting">here</a>, <a href="https://www.bloodinthemachine.com/p/the-ai-bubble-is-so-big-its-propping">here</a>, and <a href="https://www.wheresyoured.at/ai-bubble-2027/">here</a>.)</p><p>But the <a href="https://www.noahpinion.blog/p/will-data-centers-crash-the-economy">infrastructure investments</a> behind AI <a href="https://www.wsj.com/tech/ai/silicon-valley-ai-infrastructure-capex-cffe0431">show no signs of slowing</a>. Communities across the country are competing to attract data centers with the same generous public subsidies that once lured shopping malls, convinced they're landing the next generation of economic development. The parallels run deeper than the financing structures. 
Just as mallrats once thrived in ecosystems subsidized by municipal debt, today's AI boom depends on public infrastructure investments and taxpayer-funded subsidies.</p><p>Today's data center building boom promises even less community benefit than the mall once did. Where malls at least provided social spaces and entry-level employment for teenagers, data centers offer communities almost nothing once construction is complete. These windowless monstrosities often employ fewer than fifty people, the overwhelming majority of whom are likely to be transferred in by a big company rather than hired locally. (The environmental issues on data centers are well-covered by others; I won&#8217;t discuss those here.)</p><p>And yet, communities across the country are competing to attract these facilities <a href="https://www.streamdatacenters.com/resource-library/glossary/tax-incentives-for-data-centers/">with increasingly generous incentive packages</a>, convinced they're landing wealth and prosperity for the next generation. Rural counties in particular are offering <a href="https://www.propublica.org/article/washington-data-centers-tech-jobs-tax-break">public subsidies for data centers</a> that work out to <em>hundreds of thousands of dollars</em> per job created, justified by projections of long-term economic development that echo the lofty promises made by mall developers decades before.</p><p>And herein lies the relevance of <em>Mallrats</em>: the communities that subsidized shopping malls were left holding the bag when the retail model collapsed, while the developers and anchor tenants extracted their profits and moved on. 
</p><p>When the AI bubble bursts, we can expect the same story to play out on an even larger scale &#8212; the richest companies in the history of the world will run away with their gains while the predominantly rural communities hosting data centers inherit decades of debt service and maintenance costs for obsolete infrastructure.</p><h3>The Secrets of the Ghost Mall</h3><p>The malls of the previous generation were massive public-private partnerships that reshaped the economic fabric of cities and towns and changed how American communities financed their own development. Throughout the 1980s and 1990s, municipalities competed aggressively to attract these developments by offering property tax abatements, infrastructure improvements, and favorable financing arrangements. The promise was seductive: shopping centers would become economic engines, generating steady tax revenue while creating jobs and anchoring new housing developments.&nbsp;</p><p>But this created a perverse incentive. The more optimistic a community was, the more reason developers had to hype the potential of a development, and the more the economics of the deal would be stretched.
Near-worthless open land on the margins of a city, land that generated no property tax revenue, could be turned into a hub for developing all the other near-worthless land around it. Each new dollar in property taxes from the new subdivisions was then available to backfill underfunded public services elsewhere in the community, in addition to covering both the cost of the subsidies for the mall developer and the cost of services to the newly built areas.</p><p>The economic assumptions underlying these deals now seem remarkably naive. <em>Consumer spending would continue growing indefinitely. Big box stores and malls would remain the dominant retail format for decades. Anchor department stores &#8212; Sears, JCPenney, Macy's &#8212; would provide the stable foot traffic that kept smaller retailers viable. </em>When I worked in the private sector, I negotiated similar deals, and it&#8217;s shocking how often municipal officials just take developers&#8217; word for it when it comes to these types of projections. Based on these assumptions, communities issued municipal bonds to finance the roads, water systems, and electrical infrastructure these developments required, confident that property tax revenues from the new developments would materialize and more than cover the debt service on those bonds.&nbsp;</p><p>Thus, the teen goths shopping at Hot Topic were thriving in an ecosystem subsidized by municipal debt that assumed continuous growth and permanent relevance.
The same public financing that made their social habitat possible also would later poison their communities from within, because the economics of mall retail could never support the infrastructure investments required to sustain it <a href="https://www.wsj.com/articles/SB124294047987244803">once the world outside the mall changed forever</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Pec1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Pec1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 424w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 848w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 1272w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Pec1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png" width="1456" height="767" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:767,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:606116,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/172610749?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Pec1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 424w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 848w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 1272w, https://substackcdn.com/image/fetch/$s_!Pec1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61fa4bef-5a2d-44ae-8bef-b44651b6b33c_1656x872.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Yes, &#8216;mall goths&#8217; have an actual wikipedia page.</figcaption></figure></div><p>The world changed forever with the 2008 financial crisis and Amazon's simultaneous assault on brick-and-mortar retail. When consumer spending declined overall and Amazon offered a convenient alternative to mall shopping that also happened to allow the shopper to dodge sales tax, the carefully constructed mall ecosystem collapsed with stunning speed. Anchor tenants began closing stores, which triggered lease escape clauses that allowed smaller retailers to abandon their locations. Within a few years, the vibrant retail ecosystems that had anchored suburban life for a generation became empty shells.</p><p>One out of every six malls in America closed after 2013. 
When malls began to fail, the economic consequences were felt far beyond retail workers and store owners, and those consequences started not with job losses but with public debt. Municipalities that had granted decades-long tax abatements suddenly found themselves servicing infrastructure debt with no corresponding revenue stream. A community that had issued $50 million in bonds to support a mall development in 1995 would still be making debt payments in 2015, even if the mall had been abandoned. The roads, sewer systems, and electrical infrastructure remained, requiring ongoing maintenance, but the property tax revenues that were supposed to pay for that maintenance evaporated along with the retail tenants.</p><p>This revenue shortfall can create a death spiral: declining property values around abandoned malls reduce the overall tax base, forcing municipalities to either cut services or raise taxes on remaining residents. Either choice makes a community less attractive, encouraging further population loss and further shrinking of tax revenue.&nbsp;</p><h3>When Digital Infrastructure Becomes Digital Blight</h3><p>It&#8217;s here that the lesson of the mall becomes instructive for the AI building boom. If the market begins to believe that AI hype is just that and the bubble deflates, or if other macroeconomic trends depressing the non-tech economy become more real to the market than tech valuations,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> the effects will cascade throughout the economy. </p><p>(I&#8217;ve already written about <a href="https://www.tomorrowsmess.com/p/ai-will-put-the-cover-sheets-on-the">AI and jobs here</a>; in a downturn, I expect workers to lose their jobs <em>because of AI </em>rather than <em>to AI</em>, and office work to generally get shittier. More positions that are salaried today will become gig-economy jobs, particularly at tech companies.) 
</p><p>In either case &#8212; and we may well end up with <em>both</em> a burst bubble and difficult macroeconomic headwinds &#8212; consumer demand will slacken, which will weaken business demand for the things that need data center infrastructure. As that demand weakens, planned new tech infrastructure investments will likely no longer pencil out as worth the cost; this may happen sooner rather than later, since AI doesn&#8217;t seem to be making anyone more money than it costs to build, and seems like it won&#8217;t anytime soon. Inevitably, some data centers will close or even never open. </p><p>Building a data center is expensive. The infrastructure investments required to support data centers typically dwarf those needed for retail development, where the greatest expense is usually building a parking garage (which is always more expensive than it should be). These facilities consume as much electricity as small cities, requiring significant upgrades to local power grids. They need specialized telecommunications connections and cooling infrastructure capable of handling enormous heat loads. Public subsidies exist to abate&nbsp;all of these expenses.</p><p>Because of the scale of these costs, the fiscal math on a public subsidy for a data center that fails will be predictably brutal compared to a mall&#8217;s. </p><p>But what about repurposing an abandoned facility to prevent blight and keep up property tax revenue? Here, malls offer another lesson. The specialized nature of suburban and exurban mall infrastructure makes recovery difficult when a mall fails. Big box stores like Home Depot or Bed Bath and Beyond that cluster around malls are custom-built for specific tenants, and it&#8217;s rare that another national retailer will spend the money to retrofit a space for its use. (Barnes and Noble, for example, typically will not retrofit and move into a former Toys &#8216;R&#8217; Us space.) 
A former department store can&#8217;t easily be converted into a manufacturing facility or a community college campus. Sometimes, in the South, you&#8217;ll see <a href="https://www.businessinsider.com/dying-malls-are-being-transformed-into-churches-2017-6?op=1#the-lakeland-florida-church-at-the-mall-has-retail-in-its-name-4">a church</a> or maybe a branch of the DMV that has moved into an abandoned mall space. These buildings rarely have windows, which limits their potential usefulness; a casino is just about the only other enterprise apart from big retail that would consider a windowless structure an architectural asset. </p><div class="instagram-embed-wrap" data-attrs="{&quot;instagram_id&quot;:&quot;BKG7ykdgnJA&quot;,&quot;title&quot;:&quot;A post shared by @golakes.church&quot;,&quot;author_name&quot;:&quot;golakes.church&quot;,&quot;thumbnail_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/__ss-rehost__IG-meta-BKG7ykdgnJA.jpg&quot;,&quot;like_count&quot;:null,&quot;comment_count&quot;:null,&quot;profile_pic_url&quot;:null,&quot;follower_count&quot;:null,&quot;timestamp&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="InstagramToDOM"></div><p>By comparison, repurposing a data center is even more difficult than repurposing an abandoned mall; data centers are meant to hold racks and servers, feed them electricity, and keep them cool. They are not built for humans, but only to facilitate humans servicing the technology that lives there. And the chips used for AI are not necessarily useful for other technology purposes, so a facility filled with state-of-the-art GPUs from NVIDIA may be worthless for other uses.</p><p>To illustrate: a farming community that grants a twenty-year tax abatement for a data center that shutters after five years will lose fifteen years of expected revenue on the property, while still paying to maintain expensive infrastructure and to service the debt that financed it for that period. 
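That back-of-the-envelope math can be sketched in a few lines of Python. Every figure below (bond size, interest rate, expected tax revenue, upkeep) is a made-up assumption for illustration, not the terms of any actual deal:

```python
# Hypothetical sketch of the fiscal math for a failed data center deal.
# All figures are illustrative assumptions, not actual deal terms.

bond_principal = 50_000_000   # bonds issued for roads, power, and water
annual_rate = 0.04            # assumed municipal bond interest rate
term_years = 20               # bond term, matching the abatement period

# Level annual debt service, via the standard loan amortization formula
debt_service = bond_principal * annual_rate / (1 - (1 + annual_rate) ** -term_years)

expected_tax = 6_000_000      # assumed property tax the project was supposed to generate
years_operated = 5            # the facility shutters after five years
annual_upkeep = 1_500_000     # roads, utilities, police and fire coverage

years_left = term_years - years_operated          # 15 years of the deal remain
lost_revenue = expected_tax * years_left          # revenue that never materializes
remaining_costs = (debt_service + annual_upkeep) * years_left

print(f"Annual debt service:    ${debt_service:,.0f}")
print(f"Foregone revenue:       ${lost_revenue:,.0f}")
print(f"Obligations still owed: ${remaining_costs:,.0f}")
```

However the specific numbers shake out, the structure is the same: the debt service and upkeep continue for the full bond term, while the revenue that was supposed to cover them never arrives.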
</p><p>The roads, power systems, and telecommunications networks built to serve the facility will remain and require maintenance even if the facility is abandoned; police will have to patrol the facility and fire departments will have to save it from burning down. Because infrastructure is often paid for with bonds, it can&#8217;t simply be abandoned to be reclaimed by the prairie or forest. </p><p>Low demand in the industry and the specialization of the building will make it all but impossible to find a different occupant. The blight from the abandoned building will bring down surrounding property values, hitting tax revenue again. </p><p>If it&#8217;s in a populated area, that decline in property values could result in the same fiscal death spiral I mentioned above. I think every American is instinctively aware that a data center is a worse neighbor than a Walmart or a truck stop &#8212; as my <a href="https://time.com/7308925/elon-musk-memphis-ai-data-center/">hometown of Memphis has learned</a> &#8212; so it would surprise me to learn of any plans to build new subdivisions around data centers the way developers did around malls, which reduces the death spiral risk. A silver lining, maybe.</p><h3>Socialized Loss, Privatized Profit</h3><p>Destiny USA, a shopping mall in Syracuse and the biggest mall in New York, <a href="https://www.msn.com/en-us/money/realestate/destiny-usa-s-future-what-s-next-for-troubled-mall-after-300-million-default/ar-AA1CAr1n">has problems</a>. It was appraised at just over $200 million in value, and yet the debt on it is more than half a billion dollars, including more than $200 million in public bonds. The owner has defaulted. &#8220;Someone&#8217;s going to lose money,&#8221; the county comptroller said back in 2021. &#8220;I don&#8217;t know who it&#8217;s going to be. Whether it&#8217;s the commercial lender, bondholders or the current property owners, there are multiple ways this could go.&#8221; 
Thankfully, bondholders will get paid before other lenders, but it&#8217;s possible that what gets repaid is a fraction of the amount financed. It&#8217;s now 2025, and this is still unresolved.</p><p>Destiny USA isn&#8217;t a dead mall, though it is a quarter empty. Were it completely vacant, the troubles would be worse. By the time malls fail and big box retailers abandon locations, the developers who profited from the initial construction are long gone, having extracted their gains years ago and moved on to new projects. The companies that operated the anchor stores that closed could write off their losses from the stores as tax deductions. The only people left to pay the bills are local taxpayers, with their communities left worse off than before the &#8216;development.&#8217;</p><p>This is a final lesson for the AI building boom. Just as communities that had competed for and subsidized malls were left with debt while the developers and retail chains kept their profits and got a tax write-off, something similar may now play out with tech companies.</p><p>Already, the computer hardware in a data center is a depreciable asset<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> for tax purposes, which creates a tax advantage for large companies. But tech companies that close a data center can take advantage of the same write-offs that anchor tenants in malls used: net operating losses allow companies to use current-year losses to offset future profits, effectively turning failed investments into tax shields. This means that a company&#8217;s over-investment in AI infrastructure that proves economically unviable becomes a valuable tax asset that can shelter profits from future business ventures for years or even decades. </p><p>Thus the infrastructure overbuild is a gamble the companies are willing to make, because it&#8217;s win-win. 
If the bubble doesn&#8217;t burst, it was a worthwhile bet. If it bursts, someone else picks up a big part of the tab. This represents a profound form of intergenerational theft, though it&#8217;s rarely described in such stark terms. Just as pollution externalizes costs that future generations will have to pay one way or another, speculative infrastructure development externalizes fiscal costs to future taxpayers. The benefits flow to current political leaders who can claim credit for landing major projects, while the risks accumulate for their successors, who will inherit the debt service and maintenance obligations.</p><p>Perhaps the best repurposing an abandoned mall can now find is as a clear message to today&#8217;s local leaders: if you care about your community&#8217;s future, don&#8217;t take the data center deal.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Just as the mall was the victim of a unique moment in history &#8212; when a global recession happened just as Amazon&#8217;s e-commerce business began to mature &#8212; AI will not be immune to the forces of history either. As of August 2025, big tech stocks account for roughly a third of the market capitalization of the S&amp;P 500, and those companies have all bet big on AI. But as September begins, President Trump, Speaker Johnson, and the rest of Congress are facing a showdown over a potential federal government shutdown. Trump&#8217;s tariff policies (currently in dispute in court), his administration&#8217;s efforts to make unilateral cuts to past budgets passed by Congress, and its firing of the nonpartisan officials who manage the government&#8217;s economic statistics are all remaking the global economy. 
It seems likely that tech stocks have been propping up the market &#8212; AI spending outstripped consumer spending this year, even as MIT reported that 95% of private sector AI pilots have failed.&nbsp;</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>The depreciation deduction amounts to a government subsidy for data center construction that rarely gets discussed in policy debates about AI infrastructure. (This is in addition to any property tax or sales tax abatements the company may have negotiated.)</p><p>Depreciation of property like computer chips &#8212; most of the cost of a data center &#8212; can become even more favorable through accelerated depreciation methods.&nbsp;Data centers depreciate especially quickly, too: a facility opened in 2015 has probably had everything in it replaced three times already. 
For rapidly growing companies, these deductions compound as they layer new equipment purchases on top of existing depreciation schedules.</p><p>So, depreciation is an ongoing benefit while the data center is open, but if the company closes the data center, it can write off its costs as a net operating loss, which can be carried forward to offset profits in future years.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Automation and the Dignity of Work]]></title><description><![CDATA[AI Will Put the Cover Sheets on the TPS Reports]]></description><link>https://www.tomorrowsmess.com/p/ai-will-put-the-cover-sheets-on-the</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/ai-will-put-the-cover-sheets-on-the</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 04 Aug 2025 12:52:24 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/06688b81-aeb6-45f9-85d5-7f07f7f4cd95_1200x910.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Last week, NPR Morning Edition host Steve Inskeep<a href="https://substack.com/home/post/p-169655059?selection=7cab9dfb-e501-4eab-87a5-465d5823bd45"> posted</a> a transcript of his interview with former Transportation Secretary and likely 2028 presidential candidate Pete Buttigieg:</p><blockquote><p><em>BUTTIGIEG: . . . Even though [artificial intelligence is] a hot topic and people are discussing it all the time, there's a lot of hype. I think even now we are under-reacting in a big way, politically and substantively, to what this is about to do to us as a country.</em></p><p><em>INSKEEP: What's the danger?</em></p><p><em>BUTTIGIEG: In addition to the dangers that do get talked about a lot, the apocalyptic scenarios, the economic implications are the ones that I think could be the most disruptive, the most quickly. We're talking about whole categories of jobs. Not in 30 or 40 years, but in three or four, half of the entry level jobs might not be there. 
And if that happens as quickly as it might happen, it'll be a bit like what I lived through as a kid in the industrial Midwest when trade [and] automation sucked away a lot of the auto jobs in the nineties&#8212;but ten times, maybe 100 times more disruptive because it's happening on a more widespread basis and it's happening more quickly.</em></p></blockquote><p>This is of a piece with a lot of the most reasonable commentary out there on AI: folks concerned with jobs and the impact of AI on the dire economic prospects of recent college graduates, with everyone from the Atlantic&#8217;s <a href="https://www.theatlantic.com/economy/archive/2025/04/job-market-youth/682641/">Derek Thompson</a> to <a href="https://www.axios.com/2025/05/28/ai-jobs-white-collar-unemployment-anthropic">Steve Bannon</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> getting on the AI job apocalypse bandwagon. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!HMXj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!HMXj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 424w, https://substackcdn.com/image/fetch/$s_!HMXj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!HMXj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!HMXj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!HMXj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg" width="650" height="581" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:581,&quot;width&quot;:650,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!HMXj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 424w, https://substackcdn.com/image/fetch/$s_!HMXj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!HMXj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!HMXj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42b0af00-8096-4953-ae52-b1efb539cc25_650x581.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Chart from Derek Thompson on recent grad employment. 
Of course, notice that the decline started during the recession in the early 2010s, and the trend has been consistent since then &#8212; well before the explosion of AI.</figcaption></figure></div><p>The analogy Buttigieg made to manufacturing decline is important for two reasons: it&#8217;s a political maneuver by Buttigieg, but it also invites us to think about the risks of AI to employment in the context of what has gone wrong with American jobs over the past half-century.<br><br>On the former, Buttigieg&#8217;s comparison to employment in the Rust Belt is a deft political choice to court a certain voter base, and he&#8217;s not wrong on the substance of the comparison. Manufacturing jobs in the US, both in absolute numbers and as a share of total employment, have been in decline since a post-war peak in the mid-1950s, <a href="https://paulkrugman.substack.com/p/deindustrialization-causes-and-consequences">in large part due to productivity gains from automation</a>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> President Trump has long made <a href="https://www.cbsnews.com/news/donald-trump-economic-policy-savannah/">attempting to reverse this trend a cornerstone of his campaigning</a>; that is <a href="https://www.npr.org/2025/05/09/nx-s1-5375146/trump-tariffs-factory-jobs-nostalgia">part of his justification for the tariffs now taking effect</a> and part of his appeal to formerly solidly blue Rust Belt voter blocs Buttigieg thinks he can win back.</p><p>On the latter, mainstream economists tend to treat jobs as interchangeable units of economic output, reducible to wages and productivity metrics, and most would say that yes, we&#8217;ve lost jobs due to automation, but <a href="https://slate.com/business/1997/01/the-accidental-theorist.html">these losses have been offset by jobs created in the &#8220;service economy.&#8221;</a> This technocratic perspective dismisses as mere 
nostalgia or culture war posturing the sentiment expressed by the political right that manufacturing jobs are more desirable because they're more &#8220;masculine.&#8221; <a href="https://www.washingtonpost.com/opinions/2025/04/09/trump-tariffs-manufacturing-jobs/">Commentators on the American political left also roll their eyes at such talk, seeing it as retrograde gender politics or economic illiteracy</a>. Anyway, someone driving an Uber generates GDP just like an autoworker bolting together a car, so what's the problem? </p><p>But we should also be asking, what <em>kind</em> of service jobs replaced manufacturing? And what does it mean for a society when the work that financially sustains most people's lives feels fundamentally different from the work that built their communities, particularly if that work serves an invisible, distant master who can easily replace you? </p><p>The &#8220;let&#8217;s make American jobs manly again&#8221; intuition, even if crudely expressed, points toward something economists miss entirely, one that is relevant to how we should think about the impact of AI on employment. What if the populist preference for manufacturing work over services work isn't really about gender norms, but about purpose, meaning, security, and feeling valuable to your community? 
</p><p>In the rest of this essay, I&#8217;ll outline how these questions can help us make sense of what sort of jobs are at risk because of AI, how other jobs may change for the worse, why we should be more worried about workers losing their jobs <em>because of AI</em>, not <em>to AI</em>, and what the most insidious threat to employment from AI really is.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><h2>1. Automating arbitrariness</h2><p>The late anthropologist David Graeber offered a theory that made sense of the yearning to find meaning at work in modern America&#8217;s corporate services economy: the phenomenon of &#8220;<a href="https://strikemag.org/bullshit-jobs/">bullshit jobs</a>.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a></p><p>Graeber argued that a significant portion of modern employment consists of jobs that even the workers themselves recognize as pointless &#8212; roles that contribute nothing meaningful to their community or broader society and exist primarily to maintain existing power structures and economic hierarchies. 
These aren't just the obvious examples like corporate middle management, but vast categories of work that feel fundamentally arbitrary and hollow<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> to those performing them: compliance officers enforcing esoteric regulations, schedulers managing the schedules of people who manage other people's schedules, consultants hired to advise consultants. The 1990s cult classic <em>Office Space</em> memorably captured the ennui of the bullshit job:</p><div id="youtube2-Fy3rjQGc6lA" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;Fy3rjQGc6lA&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/Fy3rjQGc6lA?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>And so, as the bullshit jobs theory goes, we eliminated manufacturing jobs, but a non-negligible number of regulations put into place over the last two generations essentially served as a jobs-creation program in the services economy. This is partially why, even with increased productivity in the manufacturing sector, we&#8217;ve never achieved what John Maynard Keynes predicted in the 1930s: that, by century's end, technology would enable workers in the United States to have a 15-hour work week.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a></p><p>A year and a half ago, I did some focus groups about AI with Americans from a geographically diverse swath of the country, to see what they knew and how they felt, out of a hunch that elite coastal conversation was out of touch with the rest of the American population. 
One woman described how she uses an LLM tool to help her in her compliance job by speeding up the drudgery of completing forms. As she told her story, though, I noticed two things: first, she wasn&#8217;t any less busy than she had been, with more drudgery filling the void left by the LLM; and second, she realized she was ultimately training her replacement. Not only can an LLM help you fill out the cover sheet on your TPS report &#8212; it can probably do the whole thing, and it won&#8217;t have neglected to read the memo requiring cover sheets on all TPS reports that go out.<br><br>And that&#8217;s part of what I expect to happen with AI: bullshit jobs &#8212; particularly early-career ones &#8212; are the ones that are at risk. Not only are they among the most automatable with LLMs, but the error rate of an LLM doing something like a language-based compliance task or the first draft of a presentation deck is probably no worse than that of your average junior-level employee. Accounting will be trickier, given that there are numbers connected to dollars involved, but having spent close to two years at Ernst &amp; Young earlier in my career, I can tell you there&#8217;s a lot of bloat at large accounting firms, again particularly at the most junior grades, but also among partners.</p><p>As anyone who has worked at a big enough company to have internal IT systems knows, these things don&#8217;t usually work as promised &#8212; and that&#8217;s even before you take into account the erratic tendencies of LLMs. So what that will look like instead is more senior, experienced employees debugging the results they get from automation.
Even if on net this amounts to less time spent on correcting errors than one would spend correcting the work of a more junior human being, the old-fashioned way comes with the potentially rewarding social experience of teaching and mentoring an actual human being, not to mention career development for that human, a potentially valuable future senior employee. And because of <a href="https://www.tomorrowsmess.com/i/167347542/bandits-enshittification-and-the-limits-of-corporate-patriotism">enshittification being core to tech&#8217;s business model</a> &#8212; particularly once businesses are locked into using a product &#8212; it&#8217;s likely that quality will further decline over time rather than improve.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a></p><p>Bullshit jobs also tend to be in a company&#8217;s cost centers &#8212; like compliance, tax, and <a href="https://open.substack.com/pub/anchorchange/p/in-the-arena-what-i-saw-at-trustcon?r=d1poh&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=false">trust and safety at tech companies</a><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> &#8212; rather than its profit centers. Companies are always looking to cut costs from cost centers, and when there&#8217;s downward pressure on costs, executives in charge of those departments often have no way to show savings other than letting people go &#8212; particularly when an increasingly permissive regulatory environment makes it difficult to show how their department otherwise saves the company money or mitigates risk.
We are in such an environment now.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a></p><p>I&#8217;ll come back to this at the end, but if you feel like you have a bullshit job in America, the economic news of the last week should also have you on high alert.</p><h2>2. A hollowing out of &#8216;good&#8217; jobs</h2><p>Recently <a href="https://www.bloodinthemachine.com/p/how-ai-is-killing-jobs-in-the-tech-f39?r=d1poh&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=false">Brian Merchant shared letters he&#8217;s received from tech employees about AI</a> in the workplace, and all of the stories are depressing. I recommend reading the whole piece, but <a href="https://www.bloodinthemachine.com/i/166816747/gradual-addition-of-ai-to-the-workplace">here&#8217;s one salient quote</a>:</p><blockquote><p>Our department has now brought in copilot, and we are being encouraged to use it for writing and reviewing code. Obviously we are told that we need to review the AI outputs, but it is starting to kill my enjoyment for my work; I love the creative problem solving aspect to programming, and now the majority of that work is trying to be passed onto AI, with me as the reviewer of the AI's work. This isn't why I joined this career, and it may be why I leave it if it continues to get worse.</p></blockquote><p>And this is the other half of what AI is doing to employment: it&#8217;s going to take jobs people enjoy, and turn them into bullshit jobs. 
This is not a novel observation; I&#8217;ve seen this pithy comment pop up near-weekly, for example:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4u7S!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4u7S!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4u7S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg" width="720" height="594" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:594,&quot;width&quot;:720,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;My position on AI | Neil Williams&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="My position on AI | Neil Williams" title="My position on AI | Neil Williams" srcset="https://substackcdn.com/image/fetch/$s_!4u7S!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!4u7S!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F258952cd-0a10-402c-9211-5341d28faddb_720x594.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>OpenAI has defined &#8220;artificial general intelligence&#8221; as &#8220;highly autonomous systems that outperform humans at most economically valuable work.&#8221; At the heart of the matter here is the question of <em>what&#8217;s</em> valuable <em>economically</em>, and more importantly, valuable <em>to whom</em>. Your dishes and laundry and bathroom being clean may be valuable to you, but their cleanliness isn&#8217;t valuable to shareholders. Art is not really valuable to shareholders either; &#8220;content,&#8221; however, is, and generative AI can now produce it for a minimal cost compared to a human&#8217;s art. This is why art and writing are being systematically devalued through <a href="https://authorsguild.org/news/david-baldacci-senate-testimony-ai-theft/">corporate piracy</a> and replaced by <a href="https://futurism.com/internet-polluted-ai-slop">AI slop</a>. 
</p><p>Beyond creativity, though, I would bet that the aspects of any job that you have enjoyed were likely not the aspects that were the most economically valuable.</p><p>Large tech companies building LLM systems are angling for them to become the next &#8220;platforms&#8221; &#8212; as the web browser and search were in the original 1990s internet boom, followed by app stores and domain-specific monoliths (Amazon for retail, Airbnb for lodging, Uber for transportation, Microsoft for business, etc.). <a href="https://www.tomorrowsmess.com/p/drinking-everyones-milkshakes-sip">OpenAI and their ilk want to mediate as much of your daily activity as possible, so they can have the data and so they can take a cut of any economic value for themselves as rents.</a> As this works its way into the workplace &#8212; if by executive decree in your organization you <em>must</em> become <em>&#8220;AI native&#8221;</em> and <em>you must</em> <em>use AI</em> to schedule a meeting or take notes at that meeting or coordinate follow-ups with coworkers &#8212; I would expect such mandates to sap feelings of workplace belonging just as much as, if not more than, the COVID lockdown remote work era did. </p><p>In short, if you like human moments in your workday, just know that they are likely not economically valuable and as such are at risk of being automated away.</p><h2>3. Job loss &#8220;because of AI,&#8221; not &#8220;to AI&#8221;</h2><p>Middle managers get a bad rap from both those above and those below them, but what a good middle manager does is solve the problems inherent to translating the strategic directives of larger organizations&#8217; management into the daily work of a team and back again, while smoothing over friction points in team dynamics. A good middle manager knows this and derives enjoyment from the challenge, and in so doing is learning how to be a good future executive.
</p><p>But these challenges are not in themselves economically valuable to a company any more than bad weather is economically valuable to a shipping company &#8212; they are just realities to be handled. So if you can dispense with messy humans and thus also dispense with their frictions, a good middle manager&#8217;s problem-solving ability is no longer of value either.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> </p><p>Middle management may or may not be a bullshit job, but I&#8217;d expect the couple of years ahead to be very dire for experienced managers. The jobs market has been impressively sticky over the first months of the second Trump administration despite uncertainty surrounding tariffs, but as the jobs report from this past week showed, that may have been a mirage. As <a href="https://www.joshbarro.com/p/well-this-jobs-report-clarifies-some?r=d1poh&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=false">Josh Barro wrote Friday</a>: </p><blockquote><p>. . . [T]he labor market is frozen now but might soon unfreeze in a negative direction. . . . [F]irms are stuck: They lack a sufficiently positive business outlook to add workers, but they also don&#8217;t want to lay off workers, because of their &#8220;fresh&#8221; memories of how hard it was to staff back up after pandemic layoffs. 
Instead, they have been maintaining their staffing and accepting lower profit margins.</p></blockquote><p><a href="https://abcnews.go.com/Business/trump-unveils-tariff-rates-set-effect-week/story?id=124249032">If the administration&#8217;s direction on tariffs holds</a>, and <a href="https://www.reuters.com/graphics/USA-TRUMP/TARIFF-COMPANIES/movadjkmnpa/">costs thus rise further</a>, firms may have little option other than to begin to lay off workers, as accepting those lower margins can&#8217;t hold indefinitely.</p><p><a href="https://www.politico.com/news/2025/05/23/trump-tariffs-apple-walmart-00367595">Yet because of political pressure from the administration to avoid bad news about markets and prices</a> &#8212; <a href="https://www.wsj.com/livecoverage/jobs-report-today-stock-market-08-01-2025/card/trump-orders-firing-of-bls-chief-keetzmcmvG27bpHzzI6m">the President did fire the BLS official responsible for the jobs report, after all</a> &#8212; I would expect companies <a href="https://braddelong.substack.com/p/policy-uncertainty-not-ai-automation">to use narratives blaming AI as their out</a>. They will continue hiring fewer new grads &#8212; which doesn&#8217;t really make the news on a company-by-company basis since it&#8217;s hard to track &#8212; replace bullshit jobs in cost centers with technology, and eliminate middle managers, perhaps elevating some junior staff to take their place. 
This may even start with big tech companies themselves: <a href="https://www.barrons.com/articles/takeaways-big-tech-earnings-ai-68402f1a">even though all of them beat expectations on their most recent quarterly earnings</a>, between <a href="https://braddelong.substack.com/p/macroeconomy-now-below-stall-speed?utm_source=publication-search">AI infrastructure investments</a> and <a href="https://www.businessinsider.com/meta-escalates-ai-talent-war-with-openai-shengjia-zhao-zuckerberg-2025-7?op=1">salaries</a>, tech companies are still <a href="https://futurism.com/openai-money-softbank-investors">burning cash at unsustainable rates</a> <a href="https://www.wheresyoured.at/anthropic-is-bleeding-out/">while limping along with money-losing business models</a> for their AI products. Tech companies routinely reorganize &#8212; I went through at least half a dozen reorgs in my four years at Amazon &#8212; and an AI-powered reorg and middle manager layoff would not be out of the ordinary. But none of this means these jobs were lost to automation; automation will instead be the excuse.</p><h2>Dignity Can&#8217;t Be Coded</h2><p>Looking at the broader trajectory here, what I see is not so much a story about artificial intelligence displacing workers as one about large businesses using AI as convenient cover for the same structural dynamics that have been hollowing out the dignity of American workers for decades. The manufacturing jobs that once anchored communities &#8212; the supposedly &#8220;masculine&#8221; work that populist politicians on the right invoke &#8212; weren&#8217;t valued because they required testicles. They were valued because they created things people needed and because they embedded workers in webs of mutual social dependence that made them respected in their communities and harder to replace.</p><p>AI&#8217;s most insidious threat to labor thus isn&#8217;t mass unemployment.
Instead, AI will accelerate the transformation of services-economy work into the kind of surveilled, deskilled, easily substitutable tasks that strip away everything that makes going to work feel worthwhile. The programmer whose creative problem-solving gets reduced to reviewing AI-generated code, the middle manager whose hard-won expertise in getting the most out of her team becomes irrelevant when there are fewer humans to manage &#8212; these aren&#8217;t stories about AI taking jobs so much as about the technological degradation of these folks&#8217; dignity. Putting cover sheets on TPS reports is degrading enough, but spending your days fixing AI&#8217;s errors on TPS reports will probably feel worse.</p><p>Wanting to feel connected to something larger than a quarterly earnings report, wanting your labor to feel necessary rather than arbitrary, wanting to see the material consequences of what you do &#8212; these are all yearnings for purpose and worth. The longing isn&#8217;t for masculine work per se, but for work that feels real &#8212; work that would be missed if it disappeared, work that contributes to human flourishing rather than extraction and rent-seeking. I suspect those of us who want to feel valued at work are all going to end up yearning too.
<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Bannon&#8217;s view: &#8220;I don&#8217;t think anyone is taking into consideration how administrative, managerial and tech jobs for people under 30 &#8212; entry-level jobs that are so important in your 20s &#8212; are going to be eviscerated.&#8221;</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Buttigieg attributes auto jobs being sucked away in the nineties to &#8220;trade <em>and</em> automation.&#8221; Buttigieg is correct to list them <em>both</em>. The collapse of manufacturing jobs was not the consequence of automation alone; the pain automation created was intensified and cemented by economic and trade policy changes made through the Nixon and Reagan administrations. These policies privileged paper shuffling in big cities over assembly lines in factory towns, irreversibly transforming the U.S. into a service-based and consumption-based economy, allowing more wealth than ever to be siphoned off by an ever-smaller few in the process.
And so, by the year 2000, the Rust Belt stood as a monument to this reordering: not a victim just of technological progress, but a casualty of replacing the machinery of wealth creation with the machinery of wealth extraction.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>This essay was later turned into <a href="https://bookshop.org/p/books/bullshit-jobs-a-theory-david-graeber/6692761?ean=9781501143335&amp;next=t&amp;affiliate=1688">a book-length work</a> that&#8217;s a good read.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>There are lots of ways to find meaning in your life: parenting, faith, etc. A bullshit job may be completely satisfactory to someone whose cup otherwise runneth over, particularly if they can leave work at work, and there&#8217;s nothing wrong with that.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>I used to have one of these bullshit jobs, and Graeber&#8217;s theory resonated deeply with me when I first came across it. In 2010, I was fresh off the boat from Peace Corps, sleeping on my best friend&#8217;s sectional in his 450 sq ft studio apartment, unemployed, with about $300 to my name, six-figure law school debt, and reliant for more calories than I should have been on a clerical error at Five Guys that turned a $20 gift card into a near-unlimited spending account there for at least 3 months.
I took the only job I was offered after months of searching: it paid $17.50 an hour, with no benefits, to work in the Compliance Department at a USAID contractor, Chemonics International.</p><p>This job turned into a few other similar USAID compliance jobs. The gist was this: there&#8217;s a bunch of federal regulations that dictate how federal money can and cannot be spent on foreign aid, usually meant to prevent fraud and self-dealing, and to further other U.S. policy objectives. You couldn&#8217;t use a USAID grant to purchase slot machines or weather modification equipment. If you bought a car for a program with government funds, you had to buy it in America and ship it to wherever. If you used government funds to fly, you had to fly on a U.S. airline, etc. I had to know and help interpret and apply all these rules.</p><p>My legal training helped me parse the regulations and give guidance, and the upshot was that a few times a year I got to spend a month or two in an exciting place like Zimbabwe or Kyrgyzstan or Pakistan. But spend enough time doing that work, and you realize that most of these regulations were the result of <a href="https://www.afterbabel.com/cp/166251993">government by train wreck</a> &#8212; that a policymaker reacted after the fact to a news story and instituted a new rule to prevent it from happening again. The proliferation of rules meant a proliferation of tyrannical bureaucrats on the government side who spent their days negotiating with their counterparts on the NGO side about minutiae that have little to no impact on, say, a famine prevention program or local economic growth program.
When Graeber&#8217;s essay came out in 2013, it put a finger on a malaise I had long felt, and around a year later I had moved to a new career in tech policy.</p><p>And so, as Elon Musk <a href="https://thehill.com/policy/technology/5122676-usaid-shutdown-elon-musk-doge/">fed USAID &#8220;into the wood chipper&#8221;</a> early this year, my horror at the destruction of what was, dollar-for-dollar, an effective aspect of foreign policy for the U.S., saved lives around the world, and employed many of my friends, was spiked with a dose of recognition that there was a fair amount of bloat. Contracting jobs at USAID itself could have been good candidates for some measure of automation, if purpose-built and well-designed. My mentor at Chemonics told me that if you do something more than twice, you should have a process for it. If you have a process for something, that&#8217;s something that can be automated, which is why Chemonics invested a fortune in creating all sorts of internal process maps and flow charts documenting how work gets done in order to get ISO 9001 certified, something more associated with manufacturing than with the delivery of foreign aid.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>This is actually already coming to pass <em>in the tech industry itself</em>!
Although coding is one of the LLM use cases most frequently trumpeted by the tech industry as a success, <a href="https://garymarcus.substack.com/p/breaking-news-ai-coding-may-not-be">as Gary Marcus explained a few weeks ago</a>, a new study from benchmarking nonprofit METR looked into the productivity gains for coders using AI who had an average of 4.9 years of experience, and their use of &#8220;current AI tools actually slowed down task completion time by 19%.&#8221;</p><p><a href="https://substack.com/@luizajarovsky/note/c-138221794?utm_source=notes-share-action&amp;r=d1poh">Luiza Jarovsky has also recently posted the following</a>:</p><blockquote><p>I haven't found <strong>a single peer-reviewed paper</strong> covering AI-powered productivity increase that also takes into consideration the additional work required to review, correct, and oversee AI outputs.</p><p>The "increased productivity" and "socially beneficial" claims have been at the core of major AI investments and government partnerships, and have been raised to justify legal exceptionalism (especially in fields such as copyright and data protection).</p><p>Yet, it's 2025, and <strong>there doesn't seem to be a single paper </strong>capable of objectively demonstrating what AI companies have been marketing over the past two and a half years (including the review/oversight time, which they conveniently ignore).</p></blockquote></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>Trust and safety probably belongs in all three categories I&#8217;m discussing here, depending on the nature of the company where it sits.
These are jobs that are already being nearly fully automated away at places like Meta, handed to low-paid contract workers in places like Kenya who review outputs (as has been reported with OpenAI), or drained of what&#8217;s enjoyable about them by some level of automation in between. Here I think it&#8217;s important to make clear that I don&#8217;t think these jobs are bullshit, and whether a job is a bullshit one is always in the eye of the holder of the job.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>And given threats to higher education and the nonprofit sector &#8212; home to many bullshit jobs as well &#8212; from the current administration, these jobs are also at risk.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Customer service is much the same, which is why customer service chatbots have so quickly become the norm for many companies. If you work in sales, you might enjoy interacting with customers &#8212; but what happens to your job when your customers automate more of their procurement process?
The human element of sales, to the extent it can be separated from the sale itself, is of no intrinsic value to a company.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>If you made it this far and are interested in this topic, I also recommend <a href="https://www.thebignewsletter.com/p/why-are-we-pretending-ai-is-going?r=d1poh&amp;utm_campaign=post&amp;utm_medium=web&amp;showWelcomeOnShare=false">Matt Stoller&#8217;s take over at BIG</a>.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Drinking Everyone’s Milkshakes, Sip by Sip]]></title><description><![CDATA[Understanding the &#8216;Big&#8217; in Big Tech]]></description><link>https://www.tomorrowsmess.com/p/drinking-everyones-milkshakes-sip</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/drinking-everyones-milkshakes-sip</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Wed, 23 Jul 2025 12:21:47 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PjtF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Paul Thomas Anderson&#8217;s <em>There Will Be Blood</em> opens with a solitary prospector, Daniel Plainview (Oscar-winner Daniel Day-Lewis), digging silver from a mineshaft with his bare hands until a broken leg leaves him crawling through the desert dirt. When he later strikes oil, Plainview transforms into an evangelist of progress, limping into small California towns, exhorting the locals to trust him with the stewardship of their futures. "I'm a family man," he tells the suspicious residents of Little Boston, his adopted son H.W. by his side. "I run a family business. This is my son and my partner." 
He speaks of shared wealth, of oil revenue that will build schools and roads, of lifting the entire community toward a brighter future. "Together we prosper," he declares, and for a moment, you can believe him.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PjtF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PjtF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 424w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 848w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PjtF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PjtF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 424w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 848w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!PjtF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84933914-d787-4d24-9197-4670b8b6d430_1600x900.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Daniel Day-Lewis in <em>There Will Be Blood</em></figcaption></figure></div><p>By the film's end, Plainview has indeed prospered after many years, but his heart and his rhetoric have long since curdled into something far more sinister. Plainview sits in a bowling alley beneath his mansion, facing Eli Sunday, one of the few remaining links to the communities he once courted. The preacher who once competed with Plainview for moral authority over the town of Little Boston now appears as a supplicant, begging for financial help. Plainview's response reveals the true nature of what he's built. In the film&#8217;s most famous moment, Plainview explains, with barely contained fury at Eli&#8217;s na&#239;ve stupidity, how underground oil reserves know no property boundaries; how a well on one plot can drain petroleum from miles away without the surface owner ever knowing what's being taken. "I drink your milkshake!" he roars. 
"I drink it up!&#8221; </p><p>The film draws inspiration from Upton Sinclair&#8217;s 1924 novel <em>Oil!</em> and the story of Standard Oil, whose founder John D. Rockefeller pioneered the business model that Plainview represents: not just extracting natural resources, but systematically capturing the entire infrastructure through which those resources flow to market. Rockefeller didn't just pump oil, but also controlled refineries, pipelines, railroad cars, and distribution networks, creating a system where competing was impossible and choice was illusion.</p><p>The tech giants of today also started small, with appealing promises about connection and shared prosperity. But instead of extracting petroleum from beneath our feet, they've learned to extract value from our thoughts, relationships, and most intimate human experiences, often without us realizing what's being taken or how completely our communities have been consumed by it. Our milkshakes have been drunk by barons at a distance, enriching themselves at our expense, hoping we won&#8217;t notice until it&#8217;s too late.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><p>Like Standard Oil, Amazon, Google, Meta, and Microsoft operate as sophisticated extraction machines, seeking to capture a percentage of all economic activity by taking a cut for themselves, typically &#8211; but not always &#8211; in the form of the harvesting of personal and corporate information. 
As former Greek finance minister Yanis Varoufakis argues in <em>Technofeudalism</em>, these platforms have created digital fiefdoms where users provide unpaid labor while platform owners extract rent from every interaction.</p><p>Standard Oil sought to dominate petroleum through vertical integration. It pursued single-industry dominance; brutally so, perhaps, but in a way comprehensible within traditional competitive frameworks. Today's tech giants pursue something qualitatively different: they want to become the permanent intermediaries in virtually every human interaction with the digital and, increasingly, the physical worlds.</p><p>These companies are diversified like Standard Oil was, with investments spanning finance, education, government services, healthcare, and retail. Unlike Standard Oil, though, that diversification isn't about operational efficiency or market capture. It's about completeness. The more sectors they penetrate, the more comprehensive their behavioral modeling becomes, the more milkshakes they can take a sip of &#8212; and the bigger the gulps they can take.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!wt77!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!wt77!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 424w, https://substackcdn.com/image/fetch/$s_!wt77!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!wt77!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!wt77!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!wt77!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg" width="1456" height="888" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:888,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!wt77!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 424w, https://substackcdn.com/image/fetch/$s_!wt77!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!wt77!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!wt77!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F475d8d8f-9297-4f2e-8ed2-97f3dc67a11f_1600x976.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Udo Keppler, &#8220;Next!&#8221; (1904) &#8212; see that the White House is the only thing not (yet) in the Standard 
Oil octopus&#8217;s tentacles; arguably, tech companies have achieved what Rockefeller did not.</figcaption></figure></div><p>AI companions and romantic chatbots represent the next frontier in this extraction strategy, targeting intimate relationships &#8212; some of the last human experiences not mediated by these companies. Character.AI exemplifies this approach: venture capitalists explicitly stated that the platform's value lies in capturing intimate conversational data users wouldn't share elsewhere, creating what they call "a magical data feedback loop." According to Andreessen Horowitz, Character.AI users spend an average of two hours daily providing high-quality intimate data that becomes raw material for improving their AI systems.</p><h2><strong>Persuasion Machines and the Infrastructure of Control</strong></h2><p>Beyond companions, AI agents represent tech companies' vision for inserting themselves into all human interactions and transactions. These agents would manage your finances, healthcare, scheduling, travel, and your interactions with government services, making direct human-to-system interaction increasingly obsolete as user interfaces disappear in favor of agent-accessible APIs.</p><p>But imagine a system that knows you're struggling through divorce proceedings, drowning in debt, and feeling profoundly isolated; then imagine that system possessing the persuasive sophistication to guide your decisions about everything from purchases to voting choices. Shoshana Zuboff's concept of "surveillance capitalism" described how these companies trade in "human futures," actively shaping preferences and behaviors to benefit business partners, well before the advent of transformer-powered LLMs. </p><p>Social media algorithms already demonstrate this dynamic by promoting polarizing content that maximizes engagement, contributing to democratic erosion and social fragmentation. 
But when combined with comprehensive personal data, the persuasive capabilities of large language models create unprecedented power to shape human behavior. AI agents represent the logical endpoint: total mediation of human experience in service of data extraction and behavioral modification. This may explain the aggressiveness with which <a href="https://www.bloodinthemachine.com/p/how-big-tech-is-force-feeding-us">large tech companies are forcing AI into everything</a>.</p><p>This vision of AI agents faces three critical obstacles. First, the quality issue: AI systems powered by large language models remain fundamentally unreliable for high-stakes decisions. (<a href="https://garymarcus.substack.com/p/breaking-news-ai-coding-may-not-be">Even the popular coding use-case seems increasingly problematic</a>.) These systems excel at producing plausible-sounding text but cannot distinguish fact from fiction at scale, lack genuine comprehension of their outputs, and operate without understanding the physical world they're supposedly managing. Why would you trust something that&#8217;s wrong 20% of the time to make travel plans for you? And why would you trust it with a business&#8217;s relationships with customers?</p><p>Second, the accountability gap. It is extremely expensive and time-consuming to hold tech businesses accountable in court; and even if you have the patience and resources, you then must face a general lack of regulation, the shield of Section 230, and the bad-faith weaponization of the First Amendment to undermine what laws we do have. Because only large corporations are currently developing these massive LLM-powered AI systems, users will likely access AI agents through terms of service that shield tech companies from liability when their systems cause harm. This represents something unprecedented in American commercial law: a form of agency relationship where the principal bears all risk while the agent assumes none. 
There&#8217;s a reason this is unprecedented: agency only works when there&#8217;s a good balance of trust and accountability between principal and agent.</p><p>Third, consider what we find frustrating when dealing with bureaucracies: getting the runaround from different muckety-mucks, your personal circumstances not fitting the lowest common denominator of what can fit on a form, and the esoteric gears of internal procedure grinding away, but not necessarily for your benefit. Technology designed to predict and generate the next plausible text string is unlikely to improve upon these frustrations; it may well make them worse. LLMs can be considered a form of information compression, in which chaotic, messy data is simplified into a pattern, but that simplification leaves out stuff at the margins that might be important or valuable. Whether you would use an AI agent to book travel plans for you or to navigate health insurance benefits, having your use case shoehorned into what is &#8220;predictable&#8221; is bound to leave out a lot of possibilities. LLMs won&#8217;t truly make you a bespoke itinerary to off-the-beaten-path spots in Paris; instead, you might as well be on one of those guided bus tours where the leader has a flag, barks into a microphone while walking backwards, and all the tour-goers have to wear matching hats.</p><p>If AI agents are deployed into the economy without meaningful regulation, following the current strategy of the leading companies, most people will find themselves separated from vital resources and services by an impersonal layer of unaccountable technology, amplifying existing problems with customer service bureaucracy while eliminating any possibility of human appeal. UnitedHealthcare's use of AI to automatically deny insurance claims &#8212; currently facing legal challenges &#8212; provides a preview of this dystopian future. 
When the system denies your claim or makes an error that costs thousands of dollars, you'll find yourself arguing with a chatbot that has no authority to help and no human supervisor you can reach. </p><p>Once you know that scale itself is the aspiration of these companies, it fundamentally changes how we should approach tech policy.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> When I was recently asked in a documentary interview what tech companies fear most, my answer surprised my interviewer: antitrust enforcement, because it directly threatens these companies&#8217; core strategy of achieving omnipresence across economic sectors, and because the penalties can be significant and irreversible.</p><p>The good news is that antitrust enforcement against tech companies is perhaps one of the only policy continuities between the Trump 1.0, Biden, and Trump 2.0 administrations. But our antitrust framework, designed for an era when companies sought to dominate single industries rather than achieve economy-wide control, requires some updating. Pundits like Matt Yglesias have recently argued that antitrust enforcement against companies like Meta and Amazon is misguided because these companies face real competition or offer lower consumer prices. These takes ignore the broader social costs of concentrated economic power, but they are correct in that antitrust policy, particularly following the George W. Bush administration&#8217;s abandonment of strong remedies against Microsoft (the last time, prior to Google, that the government pursued and won such a case) and the Obama administration&#8217;s lax enforcement over its two terms, is not designed for this problem. </p><p>The transformation of corporations into quasi-governmental entities is not without precedent. 
Standard Oil at its peak exercised powers that rivaled those of elected officials: the company set transportation rates that determined which communities would prosper or wither, its private security forces operated with legal impunity across state lines, and its financial influence shaped everything from local newspaper coverage to federal legislation. Rockefeller's empire had become a parallel governance structure that collected tribute from the broader economy while providing services primarily to its own shareholders.</p><h2>&#8220;I&#8217;m finished!&#8221;</h2><p>Daniel Plainview's final words in <em>There Will Be Blood</em> carry a double meaning. Plainly, he's finished with Eli Sunday, having literally beaten him to death in the bowling alley. But more profoundly, he's finished with the work of extraction that has consumed his entire adult life, and that work has finished him as well. His conquest complete, and having drained every drop of value from the landscape and people around him, Plainview sits alone in his mansion, spiritually exhausted by his own ruthless success.</p><p>Standard Oil's monopoly ultimately met a similar fate. After decades of drainage and accumulation, the company grew so large and so predatory that it triggered its own destruction through public backlash and government intervention. Rockefeller's empire was broken apart by antitrust enforcement in 1911, precisely because its extraction had become too visible, too comprehensive, and too damaging to ignore. The corporate shadow state had grown so powerful that it threatened the actual state, forcing democratic institutions to reassert their authority.</p><p>Today's tech giants have learned from Standard Oil's mistakes. They've perfected extraction methods that are largely invisible, making it difficult for people to understand what's being taken or to organize resistance. 
They've captured the very infrastructure through which opposition might emerge, controlling not just economic systems but the information systems that shape public understanding. And they have continued to diversify into different industries while successfully avoiding meaningful antitrust penalties (so far).</p><p>But like Plainview's oil reserves, human attention and creativity are ultimately finite resources. The psychological damage from constant surveillance and manipulation, the creative exhaustion from systematic intellectual property theft, the social fragmentation from algorithmic division &#8212; these costs are beginning to surface in ways that even sophisticated narrative control cannot indefinitely suppress.</p><p>The milkshake has been drunk. The question is whether we'll recognize the full scope of this extraction system before it reaches its own violent conclusion, or whether we'll wait until the damage is complete and the extractors themselves are spiritually finished by what they've built. </p><p><em>Quick note: Part 2 to last week&#8217;s post <a href="https://www.tomorrowsmess.com/p/am-i-anti-tech">about my standards for tech in my life</a> will be coming soon! In the meantime, I recommend <a href="https://www.disconnect.blog/p/getting-off-us-tech-a-guide">Paris Marx&#8217;s post on alternatives to big tech products</a>.</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>In addition to antitrust enforcement, it&#8217;s important to consider size as a goal when thinking about regulatory penalties and fines. The UK&#8217;s Online Safety Act, part of which comes into force this week, pegs penalties for non-compliance to a percentage of the offending company&#8217;s revenue. 
This is a more effective deterrent than fines calculated on any other basis.</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[Am I Anti-Tech?]]></title><description><![CDATA[Or do I just have standards? Part 1.]]></description><link>https://www.tomorrowsmess.com/p/am-i-anti-tech</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/am-i-anti-tech</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 14 Jul 2025 12:55:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!eEnu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When I was about 9 years old, my grandfather, Bill, bought a personal computer and put it in his &#8220;office,&#8221; which was a closet in his garage. Bill was a lifelong tinkerer: he made his living at that time as a sort of industrial handyman, making sure HVAC systems were in good repair at large facilities. He could repair a full-size Ford truck, build an annex to his house (including doing the electrical work), and fix the small silver radio-controlled Lamborghini I got from Santa one year. Bill spent a lifetime accumulating gadgets and tools, which were overflowing from a storage unit by the time he passed a decade ago. And so it was in that tinkerer&#8217;s spirit that he brought home that computer and invited me to tinker with it too.</p><p>The learning curve for a 9-year-old was steep with that computer, which was not a Windows PC or Macintosh with a graphical user interface but ran on MS-DOS. I needed his help to boot it up and type in the right command to run the only &#8220;game&#8221; he had: Microsoft Flight Simulator. 
I&#8217;d be at my grandparents&#8217; house in Mississippi over the summer, spending the days doing cannonballs into their pool and dribbling a basketball around in the dusty lot behind the barn where they&#8217;d put up a hoop for me. I&#8217;d spend the evenings in the garage on Flight Simulator, riveted by the experience of taking off in various pixelated small aircraft from now-defunct Meigs Field in downtown Chicago, doing laps around a pixelated Sears Tower, before trying to land the plane again. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1VNB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1VNB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 424w, https://substackcdn.com/image/fetch/$s_!1VNB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 848w, https://substackcdn.com/image/fetch/$s_!1VNB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 1272w, https://substackcdn.com/image/fetch/$s_!1VNB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!1VNB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png" width="640" height="400" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4956f38f-9253-4616-94e7-906fccba3766_640x400.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:400,&quot;width&quot;:640,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Microsoft Flight Simulator (v5.0) (DOS) screenshot: Downtown Chicago&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Microsoft Flight Simulator (v5.0) (DOS) screenshot: Downtown Chicago" title="Microsoft Flight Simulator (v5.0) (DOS) screenshot: Downtown Chicago" srcset="https://substackcdn.com/image/fetch/$s_!1VNB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 424w, https://substackcdn.com/image/fetch/$s_!1VNB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 848w, https://substackcdn.com/image/fetch/$s_!1VNB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 1272w, 
https://substackcdn.com/image/fetch/$s_!1VNB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4956f38f-9253-4616-94e7-906fccba3766_640x400.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">MS Flight Simulator 5, with a Cessna in downtown Chicago. 
Source: mobygames.com</figcaption></figure></div><p>When my dad bought a PC for our family a few years later &#8212; this one a sleek, black Acer with just-released Windows 95, a modem, and a CD-ROM that could play cutting-edge games like <em><a href="https://en.wikipedia.org/wiki/Myst">Myst</a></em> &#8212; my own tinkering escalated quickly, if only out of what counted as necessity for a 12-year-old. Our family Acer was a lemon; it rarely worked as it was supposed to. The process of installing games I wanted to play, like <em><a href="https://en.wikipedia.org/wiki/Star_Wars:_X-Wing_(video_game)">X-Wing</a>,</em> was fraught, and I spent as much time doing a primitive form of trial-and-error debugging, installing and uninstalling, as I did actually playing anything. This taught me patience and forced me to consider how the machine worked, and when my dad brought home books on <em><a href="https://archive.org/details/htmlfordummies00titt">HTML for Dummies</a></em> and an <a href="https://archive.org/details/qbasicwithintrod0000schn">intro to Visual Basic</a>, I read them cover-to-cover, before coding my own Geocities website and my own turn-based game by the age of 13.</p><div><hr></div><p>I&#8217;ve been called &#8220;tech-lashy&#8221; or &#8220;anti-tech&#8221; by peers. The line of thinking goes that if those of us who are working towards a better relationship between humans and the tech that&#8217;s being built today are consistently and uniformly <em>too</em> negative about tech companies and their products, that will be a turn-off both for the general public and for potential allies (or, more importantly perhaps, donors) within the tech industry. </p><p>Such concerns most often come from technologists themselves; these folks might say that people like me just don&#8217;t understand because I don&#8217;t code. 
Someone like me, with a degree in the humanities and a profession in policy and politics, couldn&#8217;t possibly comprehend how technology works, much less the limitations builders are dealing with; the implication is that expectations are too high, and that those of us with limited comprehension should just trust that the people on the inside are doing their best. One example is Casey Newton&#8217;s Platformer post from December, &#8220;<a href="https://www.platformer.news/ai-skeptics-gary-marcus-curve-conference/">The phony comforts of AI skepticism</a>,&#8221; in which he argues that it tends to be outsiders who don&#8217;t understand who believe &#8220;AI is fake and sucks.&#8221;</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><p>I began this post with my own backstory because, even though I don&#8217;t know Python, I&#8217;m no tech troglodyte. Regardless of whether I think AI is fake (I do believe <a href="https://www.tomorrowsmess.com/p/a-field-guide-to-agi-hype">AGI hype is ultimately fake</a>), I do think what we have sucks. But that&#8217;s not because I don&#8217;t understand; it&#8217;s because I&#8217;m experienced and knowledgeable enough to have <em>standards</em>. &#8220;The soft bigotry of low expectations,&#8221; a racially fraught phrase that entered the lexicon thanks to George W. Bush, comes to mind here as a way of explaining how we&#8217;ve come to accept that most digital products don&#8217;t work as promised. 
We&#8217;re doing ourselves a disservice the more we permit tech executives to escape dealing with the inadequacies in their products by simply gesturing at what might be possible in the future (if we just let them do whatever they want along the way and don&#8217;t regulate them, of course).</p><p>Consider, for a moment, the humble washing machine and dishwasher. When I was a Peace Corps Volunteer, I didn&#8217;t have hot water in my village house, much less any appliances, and so doing my laundry took all day and was physically exhausting. I first had to fetch enough water to do a load of clothes with &#8212; more water than is any fun to carry around &#8212; before using an actual washboard to scrub them. This was not a light scrub, mind you: you really had to work them on the washboard if you wanted to get them clean. (The babushkas in the village tended to have impressive Popeye forearms, I assume for this reason.) To wash the dishes, I first had to fetch water, then boil it, then do everything by hand. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7jfQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7jfQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 848w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7jfQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg" width="816" height="963" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:963,&quot;width&quot;:816,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:212643,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/167452769?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!7jfQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 848w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!7jfQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e5532d4-3ad4-4c3f-976e-60ed7ddfc1fe_816x963.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">My Peace Corps laundry situation</figcaption></figure></div><p>When was the last time you used a washing machine or dishwasher that was <em>unreliable</em>? That had to be patched or updated? That was glitchy<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>? That needed to be replaced before it was <a href="https://www.consumerreports.org/appliances/dishwashers/how-to-make-your-dishwasher-last-longer-a7937722178/">10 years old</a>? 
Probably rarely, if ever.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> Washing machines and dishwashers tend not to be glitchy because the profit model of the companies that design and manufacture them is different from the paradigm that has dominated the tech industry for more than two decades and given us the &#8220;<a href="https://doctorow.medium.com/my-mcluhan-lecture-on-enshittification-ea343342b9bc">enshittification</a>&#8221; of digital tech that we are forced to live with today. (Not to mention that the economic productivity gains of these appliances, which freed women from these sorts of household tasks and allowed them to enter the workforce, likely far outstrip the productivity gains provided by digital advertising and AI.)</p><div><hr></div><p>Getting a washing machine or a dishwasher is not like adopting a Tamagotchi that has to be cared for incessantly &#8212; they aren&#8217;t technologies that ask for much or take anything from you, unlike, say, your average Microsoft product. Instead, these technologies give something to you: convenience, time, and a result that&#8217;s probably better and more sanitary than what you&#8217;d do yourself. We tend to exclude these sorts of machines from our conception of what we call &#8220;tech&#8221; these days, but in the abstract the goal of a word processor or search engine or networking tool should be the same as a washing machine&#8217;s: to use human labor more efficiently, by allocating as much as possible of the necessary but low value-add labor to a machine that can actually do the job better than a human.</p><p>In that vision, Bosch and Whirlpool and LG and other home appliance manufacturers set out to design and make a machine that functions as it should <em>all the time</em>, and the companies compete with each other on fulfilling that promise of quality and on providing value to the customer. 
Importantly, if these companies make something that doesn&#8217;t live up to expectations or breaks, they are accountable both to customers and the market for that failure. </p><p>Yet none of these things are true of Google, Meta, Microsoft, or Amazon: these companies are so big that they are inescapable; realistic alternatives to their products don&#8217;t really exist because they&#8217;ve fully captured their markets; Section 230, the failure of Congress to pass any meaningful laws, and a weaponization of constitutional law deployed against state legislation have meant these companies are nearly impossible to sue; and most importantly, their profit model is built around collecting rents and extracting value from the customer, not delivering value.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!eEnu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!eEnu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 424w, https://substackcdn.com/image/fetch/$s_!eEnu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 848w, https://substackcdn.com/image/fetch/$s_!eEnu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!eEnu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!eEnu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg" width="725" height="301" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:301,&quot;width&quot;:725,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!eEnu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 424w, https://substackcdn.com/image/fetch/$s_!eEnu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 848w, https://substackcdn.com/image/fetch/$s_!eEnu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!eEnu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F044ca2e1-420a-4dca-96e9-12e204c3c85e_725x301.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Jonathan Pryce as Bond villain Elliott Carver from the 1997 installment <em>Tomorrow Never Dies</em>; part of Carver&#8217;s wealth was derived from selling software &#8220;so full of bugs . . . 
users will be forced to upgrade for years.&#8221;</figcaption></figure></div><p>So I am not anti-technology; rather, I am <em>for</em> technology that does a few things:</p><ol><li><p>The technology should give <em>exponentially</em> more than it takes. Having to fiddle with something to get it to work through trial and error is one thing. Tinkering with early PCs taught me something, and generally learning to take care of your things is part of becoming an adult, so I&#8217;m not expecting zero work on my part here. But when Google&#8217;s search results are full of &#8220;One Weird Trick&#8221; ads and AI slop while the company is also harvesting my data, the product is not satisfying this requirement.</p></li><li><p>Relatedly, the company making the technology should profit when it maximizes what the product gives back to the human, not the inverse: profiting when it extracts or manipulates. A market that doesn&#8217;t create this incentive is not allocating capital efficiently towards technologies that improve quality of life and productivity.</p></li><li><p>The productivity gains from the technology should be clear over the long run, not just a substitution of a different kind of work for the work it&#8217;s replacing. We are all our own travel agents and secretaries now, and we are also all busier than ever. That&#8217;s not a coincidence.</p></li><li><p>When maintained properly, tech should work 99% of the time as warranted, and the builder of the tech should make it right if not. We expect this of ATMs and airplane autopilot systems, and it&#8217;s totally reasonable to expect it of consumer and business digital tech. This includes not just a lack of glitches, but also fulfilling promises. 
To illustrate: my washing machine always cleans my clothes, and there&#8217;s a warranty if it doesn&#8217;t; by contrast, evidence is mounting that Chromebooks in classrooms hinder educational achievement, with zero accountability for Google on the horizon for that failure. Interestingly, tech companies in start-up mode often make a big show of making it right when something goes wrong &#8212; think of how good Uber&#8217;s or Amazon&#8217;s customer service was 15 years ago compared to today.</p></li></ol><p>In part II, I&#8217;ll apply these rules to some existing products, share how I use generative AI, and discuss the policies that would incentivize more tech like washing machines rather than the extractive and buggy tech we get now.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>As a remote worker &#8212; which is in many ways wonderful and in some ways not possible without the products I malign &#8212; I probably deal with at least one glitch a day, if not more, as I&#8217;m sure many of you do. Bluetooth headphones don&#8217;t connect. An app needs an update and a reboot. Zoom keeps making me turn off the AI assistant. I could go on, and that doesn&#8217;t even cover the glitches that are there but that we don&#8217;t notice, like when Meta&#8217;s algorithms flag and take down ads for women&#8217;s health products on Facebook because they are &#8220;sexual.&#8221; But I remember being told, when I got my first bank card roughly 25 years ago, to count the money you get out of the ATM in view of the security camera before you leave in case you get short-changed &#8212; and I can remember exactly one time in my entire life that I&#8217;ve gotten the wrong amount of money, and that was because the ATM ran out of cash, not because of a glitch. 
</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>I recently had to replace both of these appliances &#8212; each closer to 15 years old than 10 &#8212; and I discovered it&#8217;s surprisingly hard to find a dishwasher or washing machine that doesn&#8217;t have WiFi in it, which obviously complicates the machine and makes it more likely to be buggy.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Anticipatory Deregulation]]></title><description><![CDATA[How digital bandits evolve into digital tyrants]]></description><link>https://www.tomorrowsmess.com/p/anticipatory-deregulation</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/anticipatory-deregulation</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Wed, 02 Jul 2025 13:58:21 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/ec7edd96-f45d-48b6-b7c9-5c7c12c4aaf1_474x266.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Yesterday morning at around 4 a.m. in Washington, <a href="https://apnews.com/article/congress-ai-provision-moratorium-states-20beeeb6967057be5fe64678f72f6ab0">the Senate voted 99-1</a> to strip the state artificial intelligence law moratorium language <a href="https://www.tomorrowsmess.com/p/the-ai-moratorium-gambit">I wrote about in May</a> from the federal budget bill. Just hours earlier, it had seemed as if the provision would pass, as Senator Ted Cruz and Senator Marsha Blackburn had reached a compromise that appeared to have assuaged Sen. Blackburn&#8217;s concerns about the bill. Some of us were also concerned that Sen. Chuck Schumer was not prioritizing the issue and that some Democrats would defect. 
Ultimately, however, the provision failed, and for the time being, states can continue to lead the way on legislating around not just AI, but also manipulative algorithms and kids&#8217; online safety. (Let me subtly emphasize &#8220;for the time being,&#8221; because it seems overwhelmingly likely to me that this proposal will come back in some form.)</p><p>Although Sen. Cruz&#8217;s professed motivation for the moratorium is to stop &#8220;woke&#8221; state AI laws, the argument in support of the moratorium that policymakers &#8212; even some Democrats &#8212; tend to find the most compelling is the argument that America cannot afford to constrain innovation, because otherwise we might lose this technological struggle over AI to China. </p><p><a href="https://www.tomorrowsmess.com/p/innovation-and-myth">As I&#8217;ve written previously</a>, tech leaders like Marc Andreessen have used the rhetoric of American competitiveness and innovation to deflect regulation while actually investing in products that harm children and society, representing a fundamental corruption of what innovation should mean for national progress. But even well-informed policymakers who do see that that rhetoric is self-serving and hollow &#8212; and more of them exist than you might appreciate, on both sides of the aisle &#8212; still agree with the premise that taking the restraints off tech companies is key to maintaining U.S. global dominance in tech. </p><p>Why? </p><p><em>It&#8217;s because they believe it&#8217;s worked before, and so that means it will work again.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><h2><strong>&#8220;Inventing&#8221; the Internet</strong></h2><p>I was only 17 during the 2000 U.S. election, and so was unable to vote in it; I also was a poorly informed teenager. But I don&#8217;t think I&#8217;m alone in recalling only three things about Al Gore&#8217;s candidacy: the court challenge and the hanging chads in the Florida recount, SNL&#8217;s parodies of both Gore and Bush, and Gore being mocked for having claimed to have &#8220;invented the Internet.&#8221; </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gbE4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gbE4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 424w, https://substackcdn.com/image/fetch/$s_!gbE4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 848w, 
https://substackcdn.com/image/fetch/$s_!gbE4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 1272w, https://substackcdn.com/image/fetch/$s_!gbE4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gbE4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic" width="474" height="266" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:266,&quot;width&quot;:474,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:18011,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/167347542?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gbE4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 424w, 
https://substackcdn.com/image/fetch/$s_!gbE4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 848w, https://substackcdn.com/image/fetch/$s_!gbE4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 1272w, https://substackcdn.com/image/fetch/$s_!gbE4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6fb16710-a63e-4f1f-9c4f-e3a4815ef7a3_474x266.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Al Gore and George W. Bush during the 2000 campaign; source: cnn.com</figcaption></figure></div><p>Turns out, <a href="https://www.snopes.com/fact-check/internet-of-lies/">Gore never said anything like that</a>, but he did take credit in an interview for leading on legislation that enabled the commercialization of the digital world. Gore had said as far back as 1989 that &#8220;the nation which most completely assimilates high-performance computing into its economy will very likely emerge as the dominant intellectual, economic, and technological force in the next century.&#8221; That position reflected a bipartisan consensus that would fundamentally reshape global power dynamics. By the mid-1990s, Congress and the Clinton administration, working together, privatized the internet, deregulated the telecommunications sector, and created legal frameworks that allowed American tech companies to operate with minimal oversight (the infamous Section 230). In so doing, U.S. policymakers essentially unleashed Silicon Valley on the world, betting that American entrepreneurial energy would capture global markets before other nations could establish competitive digital economies.</p><p>Credit where credit is due: this strategy proved remarkably successful on its own terms for nearly two decades. The United States didn&#8217;t merely dominate the early internet; it fundamentally shaped its basic infrastructure and cultural DNA in ways that persist today. Consider the most mundane yet revealing example: web addresses worldwide use the Latin alphabet rather than Cyrillic, Arabic script, or Mandarin characters. This seemingly technical detail forced the entire global population to learn English to participate meaningfully in the digital economy. 
American platforms like Google, Facebook, Microsoft, and Amazon colonized digital space itself, establishing the protocols, standards, infrastructure, and cultural norms that would govern online interaction for billions of people. Thus, the bipartisan strategy of <em>deregulating-in-advance<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></em> delivered exactly what Gore and Clinton had promised: comprehensive American technological hegemony that extended far beyond market share to encompass the very architecture of global communication.</p><p>Today&#8217;s tech executives and investors, by invoking the specter of Chinese technological competition to appeal to lawmakers&#8217; geopolitical instincts, hope to continue the party into the 2030s. When Peter Thiel or Marc Andreessen warns that legislating on AI will hand technological leadership to America&#8217;s adversaries, they&#8217;re appealing to something that many policymakers who remember and understand what Congress and the Clinton administration did in the 1990s remain proud of, no matter how far from serving American interests the tech sector has gotten. Silicon Valley has learned to weaponize American anxiety about losing technological supremacy, and in so doing has transformed legitimate geopolitical concerns into a convenient shield against domestic accountability.</p><h2><strong>From Innovators to Digital Bandits</strong></h2><p>Imagine life in a prehistoric settlement in Mesopotamia, thousands of years ago. Hunter-gatherers have grown more sedentary as their dependence on the grains that grow along the rivers as a dietary staple has increased. As these nascent farming communities begin to stockpile excess grain, other groups discover that this excess is easy to steal or take by force. Raiders thus sweep through sporadically, stealing grain stores and livestock before disappearing. 
The settlers live in constant fear, never knowing when the next attack might come. </p><div class="image-gallery-embed" data-attrs="{&quot;gallery&quot;:{&quot;images&quot;:[{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/434c7830-a00e-4ccf-8dc0-85e6c6c1948c_500x375.jpeg&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/95735be2-5d16-41bc-b6d2-ab3e38ab885d_482x401.png&quot;}],&quot;caption&quot;:&quot;Cereal field and ancient seal depicting a laborer with a plow from Mesopotamia; Source: Wikipedia&quot;,&quot;alt&quot;:&quot;&quot;,&quot;staticGalleryImage&quot;:{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/36795d11-1799-40c5-b30e-0db0edfb8ea4_1456x720.png&quot;}},&quot;isEditorNode&quot;:true}"></div><p>Eventually, a particularly cunning group of raiders arrives with a different proposition. Instead of pillaging, they would stay and protect the settlement from other raiders; but in exchange, they would take a share of every harvest, collect taxes on the village market, and require young men to serve at their command. </p><p>Over time, these stationary bandits evolved into what we recognize as governments: they built roads, maintained granaries, established laws, and provided some public goods. But the fundamental relationship remained extractive; there was always a substantial gap between the taxes collected and the services provided, with that surplus funding the rulers&#8217; magnificent lifestyles. 
This theorized transition from &#8220;roving bandit&#8221; to &#8220;stationary bandit,&#8221; as economist Mancur Olson termed it, is a compelling account of how the first governments, predatory by their very nature, came into being.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a></p><p>In pushing for the proposed regulatory moratorium, today&#8217;s tech giants offer a similar deal: &#8220;We will protect the U.S. from foreign technology in exchange for unquestioned digital sovereignty.&#8221; And, as it turns out, the deal is predatory and ultimately extractive, just as it was for those theoretical prehistoric settlements. </p><p>Among the first people to fully grasp this new form of extraction have been artists and writers, whose economic livelihoods depend directly on the intellectual property that tech companies have systematically appropriated through AI training. When Meta trains its systems on millions of pirated books without compensating authors, or when image generators scrape artwork without permission, the theft is concrete and measurable. These creators can see their grain being taken from their stores in real time.</p><p>Unlike governments, however, tech companies face no electoral accountability for their governance decisions. We cannot vote them out when their algorithmic systems promote division, exploit children, or undermine democratic institutions. And while prehistoric farmers could at least see how much of their production disappeared into their local warlord&#8217;s coffers, we remain largely blind to the scope and value of what these digital sovereigns extract from our daily lives.</p><h2><strong>Bandits, Enshittification, and the Limits of Corporate Patriotism</strong></h2><p>The flaw in applying the 1990s playbook to AI lies in assuming that American tech companies remain fundamentally aligned with American interests. 
This assumption has proven catastrophically naive as these corporations have demonstrated neither patriotism nor respect for democratic sovereignty when their business interests are at stake. Meta's willingness to hand over Hong Kong democracy activists' data to the Chinese Communist Party (covered in the hearing I wrote about <a href="https://www.tomorrowsmess.com/p/montessori-toys-and-the-business">here</a>) reveals how quickly principles of democratic solidarity evaporate when confronted with market access opportunities. When Mark Zuckerberg bans his own children from using Meta's products while defending those same products as essential for American competitiveness, we witness the contempt these executives actually hold for the public they claim to serve.</p><p>The enshittification cycle that Cory Doctorow <a href="https://doctorow.medium.com/my-mcluhan-lecture-on-enshittification-ea343342b9bc">has described</a> illustrates exactly how this dynamic unfolds: platforms begin by serving users to build market share, then abuse users to benefit business customers, then abuse everyone to extract maximum value for shareholders. This pattern tells us precisely how AI will develop if we repeat the 1990s approach. Initially, AI companies will offer genuinely useful services to build user dependence, much as early social media platforms facilitated authentic human connection. Once users and business customers are locked into these systems, the companies will begin extracting value through increasingly manipulative and harmful practices. Finally, they will abuse everyone in the ecosystem to maximize shareholder returns, leaving behind degraded products that users hate but cannot abandon due to switching costs and network effects. 
This is the model of the stationary bandit adapted to the digital world.</p><h2><strong>A Different Kind of Tyranny</strong></h2><p>This reveals something that should particularly concern conservatives who traditionally worry about concentrated power and governmental overreach: we are witnessing the emergence of private tyrannies that possess governmental scale and influence without any of the democratic constraints that supposedly limit state power. These digital stationary bandits have grown large enough to compete directly with the governments of advanced economies, but they operate without constitutional limitations, electoral accountability, or meaningful transparency requirements.</p><p>The companies seeking anticipatory deregulation have already demonstrated their willingness to exploit technology to extract maximum value from human vulnerability, regardless of the consequences for democratic institutions, child welfare, or social cohesion. Unlike traditional governments that at least theoretically derive their legitimacy from popular consent, our digital sovereigns answer only to shareholders and venture capitalists who explicitly celebrate their ability to monetize human vulnerabilities.</p><p>The 1990s <em>deregulation-in-advance</em> strategy succeeded because it aligned private profit motives with national strategic objectives during a unique historical moment when the internet was a blank canvas. But today's tech giants have already captured the internet and demonstrated their fundamental hostility to democratic governance, user welfare, and basic honesty. 
Trusting these same actors to responsibly develop artificial intelligence based on their promises to maintain American technological leadership represents a profound category error that threatens to create tools of oppression more sophisticated than any government has ever possessed.</p><p>Lawmakers got to the right answer this week, though <a href="https://x.com/Bannons_WarRoom/status/1940161337625108492">possibly for the wrong reasons</a>. Let&#8217;s hope that when &#8212; not if &#8212; they are presented with the same decision again soon, they make the same choice.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>This is an intentional riff on historian Timothy Snyder&#8217;s writing on &#8220;<a href="https://lithub.com/resist-authoritarianism-by-refusing-to-obey-in-advance/">obeying in advance</a>&#8221; (a.k.a. anticipatory obedience) being a feature of authoritarian regimes.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>Matt Yglesias also has written on this theory and applied it to contemporary politics, most recently arguing that the progressive state<a href="https://www.slowboring.com/p/the-stationary-bandits-of-new-york"> in places like New York</a> constitutes a modern form of the stationary bandit. 
</p></div></div>]]></content:encoded></item><item><title><![CDATA[On Abundance and a World of Silicon Plenty]]></title><description><![CDATA[Just gesturing at a hopeful future isn&#8217;t enough to make it so]]></description><link>https://www.tomorrowsmess.com/p/on-a-world-of-silicon-plenty</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/on-a-world-of-silicon-plenty</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Tue, 10 Jun 2025 17:41:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!W_Ga!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Close your eyes and imagine a hopeful vision of life just a couple of decades from now: as much leisure time as you want thanks to the use of advanced computers that make the economy hum efficiently and a fridge full of food grown minutes away on technologically advanced farms. </p><p>This is what Ezra Klein and Derek Thompson ask readers to see in the opening of their book, <em>Abundance</em>, which has been having a moment. Klein and Thompson argue that we could have a world of abundance rather than one of scarcity, but we&#8217;ve simply chosen scarcity. </p><p>To wit:</p><blockquote><p>Seen from that future time, when every commodity the human mind could imagine would flow from the industrial horn of plenty in dizzy abundance, this would seem a scanty, shoddy, cramped moment indeed, choked with shadows, redeemed only by what it caused to be created.<br><br>Seen from plenty, now would be hard to imagine. 
It would seem not quite real, an absurd time when, for no apparent reason, human beings went without things easily within the power of humanity to supply and lives did not flower as it was obvious they could.</p></blockquote><p>The above is actually <em>not</em> a quote from <em>Abundance, </em>but rather Francis Spufford&#8217;s 2010 book <em><a href="https://bookshop.org/p/books/red-plenty-francis-spufford/8229330">Red Plenty</a></em>, a brilliant work about the hopes of the Soviet Union in the 1950s. <em>Red Plenty </em>covers the era of Nikita Khrushchev, when the Soviet Union seemed poised to overtake the United States: they had put Sputnik into orbit and caught up with the U.S. on thermonuclear weapons; the USSR&#8217;s economic growth rate was more than double that of the U.S., and their &#8220;planned economy&#8221; was on pace to overtake the capitalist West; their cars, food, and houses would soon be better, and there would be more money and leisure all around, thanks to a top-down, start-to-finish management that &#8220;could be directed, as capitalism could not, to the fastest, most lavish fulfillment of human needs.&#8221;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!W_Ga!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W_Ga!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!W_Ga!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 848w, https://substackcdn.com/image/fetch/$s_!W_Ga!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!W_Ga!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W_Ga!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg" width="1200" height="792" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:792,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!W_Ga!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!W_Ga!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 848w, https://substackcdn.com/image/fetch/$s_!W_Ga!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!W_Ga!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb2751daa-eed4-4be6-b384-b001ac32bac1_1200x792.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">&#1054;&#1073;&#1097;&#1077;&#1075;&#1086;&#1089;&#1091;&#1076;&#1072;&#1088;&#1089;&#1090;&#1074;&#1077;&#1085;&#1085;&#1072;&#1103; &#1072;&#1074;&#1090;&#1086;&#1084;&#1072;&#1090;&#1080;&#1079;&#1080;&#1088;&#1086;&#1074;&#1072;&#1085;&#1085;&#1072;&#1103; &#1089;&#1080;&#1089;&#1090;&#1077;&#1084;&#1072; &#1091;&#1095;&#1105;&#1090;&#1072; &#1080; &#1086;&#1073;&#1088;&#1072;&#1073;&#1086;&#1090;&#1082;&#1080; &#1080;&#1085;&#1092;&#1086;&#1088;&#1084;&#1072;&#1094;&#1080;&#1080;, the Soviet "National Automated System for Computation and Information Processing.&#8221; I believe Viktor Glushkov, the founding father of information technology and cybernetics in the USSR, is seated at the right.</figcaption></figure></div><p>And all of this plenty would be powered by advances in technology, leveraging algorithms to find efficiencies that would have the nation&#8217;s factories and fields gushing forth with an abundance of good things.</p><p>If you&#8217;ve read <em>Abundance</em>,<em> </em>read commentary on it, or heard any of the dozens of podcasts about it, this will all sound too familiar.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><h3>What&#8217;s this &#8216;abundance&#8217; of fuss all about?</h3><p>Technological innovation is core to the vision in <em>Abundance, </em>with technology serving as a <em>deus ex machina</em> that will make a future of plenty possible &#8212; it&#8217;s just waiting to be unleashed, but it&#8217;s held back by progressive governance. 
In Klein and Thompson&#8217;s view, left-leaning interest groups like environmental activists and unions have become their own worst enemy in achieving progressive goals. The book contends that well-intentioned governance (typically the work of the political left) has created a paralyzing web of regulations, environmental reviews, and procedural requirements that ultimately prevent the construction of housing, clean energy infrastructure, and more. Meanwhile, conservatives have remained too skeptical of the necessity of government involvement in research and development. Both trends choke off the kind of innovation that improves healthcare outcomes and quality of life. Their solution involves embracing "supply-side progressivism" that prioritizes building and innovation over regulatory perfectionism, supporting the private sector with government rather than constraining it.</p><p>The book has captured the chattering class&#8217;s attention because it arrived at a moment of Democratic soul-searching following Trump's reelection. <em>Abundance</em> offers Democrats a way to critique their own governance failures without ceding ground to Republicans on core values; it also offers Democrats a way to articulate a hopeful vision of the future rather than simply being the party that of late has been little more than a clearinghouse for diverse interests who are &#8216;against&#8217; Trump and Republicans. The idea has particular resonance in California, where, as a former resident, Klein witnessed firsthand the contradictions of a progressive state unable to build housing or complete infrastructure projects efficiently, and where he was steeped in a culture of <a href="https://www.tomorrowsmess.com/i/159020532/technology-is-humanitys-salvation">techno-optimism</a>, <a href="https://www.tomorrowsmess.com/p/innovation-and-myth">Silicon Valley mythmaking around innovation</a>, and popular corporate mantras around encouraging an &#8220;abundance mindset&#8221; at work. 
The book's timing also coincides with continued frustration among younger Democrats facing astronomical housing costs and the abandonment of the party by its former allies in tech, making its optimistic vision of technological and policy solutions especially appealing to those groups. </p><p><em>Abundance</em> focuses on regulatory barriers rather than corporate power and market failures. Klein and Thompson don&#8217;t address, for example, how utilities block renewable energy development, or how two decades of relying on venture capital to produce innovations for the public good rather than for exploitation has failed to deliver the outcomes we want. (Though Klein posted <a href="https://www.nytimes.com/2025/06/08/opinion/abundance-democrats-future.html">a somewhat defensive op-ed</a> partially addressing some of those points this week.) And we've already seen what happens when government provides unconditional support to private actors: it creates figures like Elon Musk, who leveraged public subsidies into personal wealth and political power, or wasteful sideshows like Amazon&#8217;s HQ2 contest.</p><p>Klein and Thompson's perspective reflects their status as commentariat creatures of the technocratic center-left. The authors appear to be wishing into existence a world where technocrats have expertise that the general public trusts. This world no longer exists, if it ever did. The U.S. 
cultural baseline is already anti-intellectual and anti-bureaucracy compared to, say, France, Germany, or even the UK or Australia, while the COVID pandemic accelerated public skepticism of institutional competence and elite expertise &#8212; a skepticism which, by the way, is <a href="https://www.tomorrowsmess.com/i/159020532/men-should-be-judged-by-their-creations-not-their-credentials">core to the culture of Silicon Valley</a>.<br><br>And while union resistance and environmental activism certainly complicate policymaking, it's not at all clear that the process of satisfying multiple stakeholders is always the cause of delays rather than a symptom of a lack of political will. Sometimes a project mired in cycles of permitting is indeed obstructed by short-sighted leftist groups, but sometimes those cycles churn as a way of getting broader buy-in where it's otherwise lacking. There's a reason why high-profile luxury housing and corporate capital investment tend to move forward without stalling in permitting stages: the investors behind those projects pull the political levers available to them, with powerful political figures as champions. </p><h3>Not an abundance of answers to hard questions</h3><p>I generally believe the theory that candidates and policymakers from either side of the aisle tend to be more successful when they offer a vision of the future &#8212; this is the one thing that both Obama&#8217;s 2008 campaign and MAGA have shared, and could explain the Obama/Trump/Biden/Trump voter &#8212; but the devil is in the details. Klein and Thompson offer no thoughts on how the world of <em>Abundance</em> can be brought about with an electorate that&#8217;s deeply skeptical of expertise, and while both the dominant political players in Washington and the dominant culture among technologists are hostile to technocrats. 
<br><br>But neither do they offer much beyond an idea for others to do the political work around. The regulations and processes that advocates of the <em>Abundance</em> agenda dismiss as mere red tape often embody hard-won compromises about environmental justice, worker protections, and democratic participation. Streamlining these away may indeed enable more construction, but it risks recreating the very inequalities and harms that progressive governance originally sought to address, and alienating the groups on whose popular support policymakers pushing an abundance agenda will depend. Without a deeper societal trust in technocratic expertise, popular support is all the more important. And if you try to imagine an abundance society <em>without</em> popular support, it will look a lot more like a society with both the dark past and the hopeful future of <em>Red Plenty </em>&#8212; fully automated luxury communism, imposed as the happy ending to a period of authoritarianism &#8212; which I doubt either a majority of American conservatives or progressives would get behind. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!VN87!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!VN87!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 424w, https://substackcdn.com/image/fetch/$s_!VN87!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 848w, https://substackcdn.com/image/fetch/$s_!VN87!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 1272w, https://substackcdn.com/image/fetch/$s_!VN87!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!VN87!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png" width="1191" height="902" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:902,&quot;width&quot;:1191,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1396531,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/165629098?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!VN87!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 424w, https://substackcdn.com/image/fetch/$s_!VN87!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 848w, https://substackcdn.com/image/fetch/$s_!VN87!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 1272w, https://substackcdn.com/image/fetch/$s_!VN87!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd1adbf4-4539-4a44-a264-c18ab5fb5599_1191x902.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>This is where the comparison to <em>Red Plenty</em> begins to be a deeply useful one. The protagonists of <em>Red Plenty </em>&#8212; economists, scientists, computer programmers, industrialists, artists and politicians &#8212; all shared the utopian dream of a post-capitalist society in which people would be equals and no one would want for anything. There was no debate about what they were trying to achieve; that had already been decided for them by the October Revolution and the Stalinist era that came before. </p><h3>Corporate power in a land of Silicon Plenty</h3><p>What is our national project, in the U.S., in 2025? Herein lies the problem. The two major parties certainly don&#8217;t agree on much, but they are also rife with internal disagreement on this question. 
It&#8217;s at best a minority view among Republicans that the bounties of innovation and our natural world should be broadly shared.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> And the debate on the left over <em>Abundance </em>appears to be driven by the subtextual argument that the Democratic coalition should abandon some of its more obstructionist members like unions, environmental activists, and those who are skeptical of corporate power if they can&#8217;t get on board with delivering an abundant future; those groups will certainly fight hard against that possibility because the Democratic coalition is the only viable pathway to political power they have in the near term.</p><p>But even if right and left can rally around the same version of American utopia, they will still have to deal with the harder challenge of shaping it from the clay of the world we have &#8212; and that&#8217;s where Klein and Thompson&#8217;s techno-optimism and lack of detail fail them, and where <em>Red Plenty</em> offers its best lessons. <br><br>By the end of <em>Red Plenty,</em> utopian dreams had decomposed into the realities of the Soviet system. Even though the algorithmic innovation that could make it all work in theory was always just around the corner &#8212; perhaps in want of one of the West&#8217;s fancy new computer chips &#8212; the problem of getting algorithms to fully account for the complex, fluid open systems of the real world remained unsolved.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> Getting accurate demand signals for an economy where goods were produced and distributed without regard for profit margins was impossible. 
Lacking accurate data<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a>, political leadership in Moscow increased production targets even as equipment became more and more outdated. Pay cuts and scarce commodities led to riots. Ultimately the USSR had to borrow money from the West and import Western grain to feed its population, and still couldn&#8217;t keep up with American consumer technology.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hYMM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hYMM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!hYMM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg" width="468" height="605.5677233429395" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:898,&quot;width&quot;:694,&quot;resizeWidth&quot;:468,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hYMM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!hYMM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2940530c-059c-470e-9dce-bfbf7e00b373_694x898.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset 
pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Not quite Red Plenty: A 1980s Soviet washing machine, the Vyatka &#8212; which was the first Soviet machine that could truly run automatically, but was so expensive no one could afford it, used too much electricity, and couldn&#8217;t even do all the bed linens at once</figcaption></figure></div><p>Ultimately, the questions <em>Abundance </em>raises are familiar ones of <em>who should have power</em>, and <em>who already has too much</em>. Klein and Thompson see Democratic Party-aligned interest groups as having too much power over our collective future, and some of these groups certainly do create problems out of short-sightedness and self-preservation. 
But it&#8217;s telling that the fiercest debate on the left over <em>Abundance</em> right now is over corporate power and antitrust enforcement, with Klein and figures like <a href="https://www.slowboring.com/p/an-abundance-theory-of-power">Matt Yglesias arguing</a> that progressives who call out that big companies often behave badly when given free rein are also part of the problem. </p><p>Yglesias has an outdated conception of the problem that corporate power and monopolization create: that corporate power is a problem only when companies use that power to artificially raise prices to the detriment of consumers. Meanwhile, Klein admits corporate power <em>can</em> be a problem but believes that Democratic messaging on it is a distraction. </p><p>Klein is correct that the anti-corporate left is unfocused (though he doesn&#8217;t say what they should focus on), and Yglesias is correct that competition law was originally meant to address vertically integrated businesses with monopolies on pricing, like Standard Oil, not a business like Meta that doesn&#8217;t charge its users anything (and as a consequence he fails to see that corporate size can create other problems beyond prices). Both miss the bigger point: we face a choice about whom we trust to wield power in bringing a utopia of abundance into being &#8212; technocrats, activists and interest groups, or corporations. Klein and Yglesias are seemingly both choosing to vest that power in the technocrats, even though there is currently no viable political pathway to that, and even though that path will likely end up further empowering the corporations.<br><br>We do live in a world of finite resources, and so long as a powerful, unaccountable few direct the distribution of those resources in ways that do not serve the goal of everyone existing in a world of plenty, we will have scarcity. 
That&#8217;s a thesis from <em>Abundance </em>that I can agree with.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> But failing to account for the fact that so much of our lives &#8212; what we buy, how we buy it, why we buy it, what culture we consume, what we think about politics, and even the future of our economic livelihoods &#8212; is shaped by a handful of unaccountable companies that wield incredible, increasing political power will ensure <em>Abundance </em>remains but a dream. <br><br>Perhaps the greatest irony of <em>Abundance</em>&#8217;s vision of Silicon Plenty is this: catering to short-term consumer gratification is ultimately what killed the Soviet vision of innovation and abundance; and today we are promised that a similar utopia is around the corner, but it depends on giving free rein to a handful of businesses that have become the biggest corporations in history by catering to short-term consumer gratification. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/p/on-a-world-of-silicon-plenty?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/p/on-a-world-of-silicon-plenty?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>We can expect &#8220;abundance&#8221; arguments to be widely used on the right, in large part because the tech industry&#8217;s lobbying will make heavy use of them. 
American tech companies have long benefited from the assumption &#8212; usually unquestioned by the right &#8212; that continued American dominance depends in turn on the dominance and unaccountability of a handful of American tech businesses. This is how tech companies have successfully leveraged the &#8220;we must keep ahead of China&#8221; argument to kill attempts to rein them in, most recently with the proposed moratorium on state AI laws. </p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>That this is eerily similar to the way large language models fail to understand the physical world is not, in my view, a coincidence.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Arguably the Soviet state lacked good quantitative data because of the price signal issue, but it also lacked qualitative data because that&#8217;s inherently hard to gather. We struggle with this in the United States in 2025 as well, in my opinion. 
Silicon Valley and our largest tech companies are very good at extracting lots and lots of data on us, but that data is used to paint a picture of &#8220;what can we be convinced to buy&#8221; rather than &#8220;what do we need to thrive&#8221; because of their incentives to maximize revenue.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Having lived for a time in a post-Soviet society as a Peace Corps volunteer in Moldova, I can say with confidence that I don&#8217;t want to live in a world where such power is vested in an unaccountable, authoritarian government. But vesting that power in executives like Mark Zuckerberg or Sam Altman or Marc Andreessen also doesn&#8217;t seem much better.</p></div></div>]]></content:encoded></item><item><title><![CDATA[A Field Guide to AGI Hype]]></title><description><![CDATA[Whose opinions on AI should you question &#8212; and why you should care even if you normally avoid AI in the news]]></description><link>https://www.tomorrowsmess.com/p/a-field-guide-to-agi-hype</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/a-field-guide-to-agi-hype</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Wed, 21 May 2025 15:31:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!bUjW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>If you&#8217;ve somehow made it through the last 18 months without hearing the terms AGI &#8212; &#8220;artificial general intelligence&#8221; &#8212; or even ASI &#8212; &#8220;artificial superintelligence&#8221; &#8212; then please accept my congratulations. 
</p><p>But of late, <a href="https://www.technologyreview.com/2025/03/11/1112983/agi-is-suddenly-a-dinner-table-topic/">the ranks of the blissfully unaware are shrinking</a>. Since claims vary wildly about how to even define AGI (look for a future post about this), much less about how we will know when we have &#8220;reached AGI,&#8221; I&#8217;ve noticed growing confusion in the general public about whose perspective to trust. </p><p>This has been worsened by the fact that trusted, non-tech sources for political and social commentary &#8212; people like <a href="https://www.nytimes.com/2024/04/12/podcasts/transcript-ezra-klein-interviews-dario-amodei.html">Ezra Klein</a>, just to name a person whose work I typically appreciate but whom I will pick on throughout this piece when it comes to AI &#8212; have also repeatedly given platforms to people making big claims about AI, and in my opinion have tended not to ask many difficult questions of their interviewees, instead taking claims about AGI at face value. 
There are also labels thrown about, like <a href="https://www.newyorker.com/magazine/2024/03/18/among-the-ai-doomsayers">AI doomers</a> and techno-optimists and AI skeptics, but apart from the AI &#8220;<a href="https://en.wikipedia.org/wiki/Effective_accelerationism">accelerationists</a>&#8221;, few people describe themselves as doomers or skeptics, so it&#8217;s hard to know where people stand and how to apply critical thought to their AGI claims.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bUjW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bUjW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!bUjW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg" width="1438" height="932" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:932,&quot;width&quot;:1438,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!bUjW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 424w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 848w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!bUjW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F547bf046-d89c-41f9-b1ad-8e174fc5f6d3_1438x932.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset 
pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Alicia Vikander as Ava, the advanced AI from the film <em>Ex Machina</em>, A24/Universal</figcaption></figure></div><p>I&#8217;ve put together a taxonomy that can help you understand <em>why</em> some people make the claims they make about AGI, decide for yourself who to trust, and begin to know how to interrogate claims.</p><p>Even if you avoid AI clickbait or otherwise don&#8217;t care (as I&#8217;m aware that many of you found me from <a href="https://www.afterbabel.com/p/careless-people">my post on After Babel</a>, and thus may be more interested in social media or kids online safety), this has relevance for you. 
Not only does AGI hype often benefit companies like Meta and Google economically, but increasingly wild claims about AGI are being used <a href="https://www.tomorrowsmess.com/p/the-ai-moratorium-gambit">to halt progress on making much-needed policies to protect kids online and reform the tech industry</a>.</p><p>And so, without further ado, here is the taxonomy:</p><ol><li><p>Those with a strong economic or professional investment in AGI</p></li><li><p>Those with a social or minor economic investment in AGI</p></li><li><p>Those who spiritually yearn for AGI</p></li><li><p>Those who find AGI useful politically</p></li></ol><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><h2>1. Those with a strong economic or professional investment in AGI</h2><p>Sam Altman of OpenAI and Dario Amodei of Anthropic are two of the most obvious examples here, but this also applies to companies like Meta and Google: they and their products may well not merit the investment and cheap energy and water they need to keep their <a href="https://www.wheresyoured.at/longcon/">money-losing AI businesses</a> going if they can&#8217;t convince investors and governments that they are building something more powerful than ChatGPT or Claude or Google Gemini. 
We actually have the receipts for this from OpenAI, from an email disclosed in litigation in which Greg Brockman, current president of the company, explicitly made a connection between getting potential investors to believe in AGI and actually receiving funding:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9L6i!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9L6i!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 424w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 848w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 1272w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9L6i!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic" width="1456" height="1039" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1039,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:127179,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/164070283?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9L6i!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 424w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 848w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 1272w, https://substackcdn.com/image/fetch/$s_!9L6i!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd73b2773-fc44-409e-9304-b5cc57b2c633_1466x1046.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">email from current OpenAI president Greg Brockman making clear that &#8220;convincing reputable people that AGI can happen&#8221; is key to them raising funds</figcaption></figure></div><p>Once you&#8217;ve seen this email, it&#8217;s harder to un-see the incentives of some of the hucksters out there. Ilya Sutskever, also formerly of OpenAI and on this same email chain above, <a href="https://techcrunch.com/2024/06/19/ilya-sutskever-openais-former-chief-scientist-launches-new-ai-company/">has started a new company</a>, for which he has already raised at least $2 billion in funding based only on his claim that he&#8217;s building artificial superintelligence. He won&#8217;t disclose his methodology, and there doesn&#8217;t even seem to be a business plan for monetizing any technology he does produce. 
That&#8217;s right: investors have given him a $30B valuation to fund development of a purely theoretical technology, for which there is no existing design or prototype, and which, were it to become real, could fundamentally remake the worldwide economy overnight. But he evidently has no plan for how to make money from it or for investors to recoup their investment. (Not to mention, if we create a superintelligence that&#8217;s truly beneficial for humanity, will we even need money or investors anymore?) Talking about AGI is really all that Sutskever has to go on.</p><p>But there&#8217;s more nuance here too. &#8220;Godfathers of AI&#8221; like Geoffrey Hinton and Yoshua Bengio have appeared more and more in the news over the last two years, often making grand claims about how close we are to the event horizon of AGI. These are certainly brilliant men who are much smarter than me,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> and I want to be clear that I am not impugning their conscious intentions. But it&#8217;s a reality that they would not be as relevant, with the opportunities for funding and exposure that they currently receive, if they were only claiming that the LLMs we have will eventually amount to normal technology that we fit into the economy and society. The grander the AI, the more important these men and their work become. 
I would personally be less skeptical of them if they made more of an effort to use their status to combat present-day issues with technology; in my experience their engagement with lawmakers has served to <em>undermine</em> much-needed policy changes; see, e.g., <a href="https://safesecureai.org/experts">the three experts&#8217; support for last year&#8217;s California Senate Bill 1047</a>, which created <a href="https://www.tomorrowsmess.com/p/the-ai-moratorium-gambit">the very foreseeable blowback of the state AI moratorium proposal I wrote about last week</a>.</p><p>This analysis, unfortunately, also applies to nonprofits and public-interest-oriented startups working in the space. First, a nontrivial number of deep-pocketed nonprofit donors interested in AI issues have strong beliefs about AGI, and so it pays to cater to them. Second, some wealthy engineers have started AGI alignment start-ups or nonprofits (the former usually require other investors to scale; the latter require diverse funding to qualify for 501(c)(3) status so as not to be merely a private tax benefit for the founder), and they have an incentive to promote AGI narratives to justify their organizations&#8217; existence and reach the aforementioned donors/investors. </p><h2>2. Those with a social or minor economic investment in AGI</h2><p>I also have a general sense that AGI has become a raison d&#8217;etre for shallow social connections in the Bay Area. 
On my visits to SF I can&#8217;t help but feel that <a href="https://sfstandard.com/2023/11/13/san-francisco-friendship-boot-camp-cure-loneliness/">many people in tech struggle to make life-long friendships</a>, and perhaps this is the result of an engineering culture that thinks that things like friend-making <a href="https://www.tomorrowsmess.com/i/159020532/optimization-is-a-valorous-end-in-itself">can be optimized</a>.</p><div class="bluesky-wrap outer" style="height: auto; display: flex; margin-bottom: 24px;" data-attrs="{&quot;postId&quot;:&quot;3lo4foide3s2g&quot;,&quot;authorDid&quot;:&quot;did:plc:j5fbnzh57rn7xz65yjc36gxb&quot;,&quot;authorName&quot;:&quot;Drew Harwell&quot;,&quot;authorHandle&quot;:&quot;drewharwell.com&quot;,&quot;authorAvatarUrl&quot;:&quot;https://cdn.bsky.app/img/avatar/plain/did:plc:j5fbnzh57rn7xz65yjc36gxb/bafkreibtwyjrzksk3gd4z6a7wqsbgd5huofmvefvqhjrjre63prxak4zci@jpeg&quot;,&quot;text&quot;:&quot;Mark Zuckerberg says Meta's chatbots will supplement your real friends: \&quot;The average American has fewer than 3 friends ... but has demand for ... 
15 friends\&quot; (h/t x.com/romanhelmetg...)&quot;,&quot;createdAt&quot;:&quot;2025-05-01T13:21:21.619Z&quot;,&quot;uri&quot;:&quot;at://did:plc:j5fbnzh57rn7xz65yjc36gxb/app.bsky.feed.post/3lo4foide3s2g&quot;,&quot;imageUrls&quot;:[&quot;https://video.bsky.app/watch/did%3Aplc%3Aj5fbnzh57rn7xz65yjc36gxb/bafkreigfizelxcnth2eexnof3dqgmd6dev4d33np2u3aprz5ut3dlwjz4i/thumbnail.jpg&quot;]}" data-component-name="BlueskyCreateBlueskyEmbed"><iframe id="bluesky-3lo4foide3s2g" data-bluesky-id="7777084089271373" src="https://embed.bsky.app/embed/did:plc:j5fbnzh57rn7xz65yjc36gxb/app.bsky.feed.post/3lo4foide3s2g?id=7777084089271373" width="100%" style="display: block; flex-grow: 1;" frameborder="0" scrolling="no"></iframe></div><p>Just like it&#8217;s hard to imagine what Minnesotans would talk about if they didn&#8217;t have the weather, I&#8217;ve been in enough rooms with enough Bay Area tech elites to wonder what they would talk about if they didn&#8217;t have AGI. Not everyone in SF is working on AGI, but everyone seems to have friends who are, and why wouldn&#8217;t you want to agree with the experts in your social network so that you can continue to belong? I have personally suffered through social pressure to hype AGI from those with an incentive to push it (see above), and frankly, I&#8217;ve seen that it has an impact on others whom you&#8217;d otherwise expect to know better. If you want to fit in at Burning Man, you had better be ready to talk about your AGI timeline and your <a href="https://en.wikipedia.org/wiki/P(doom)">p(doom)</a>. </p><p>I think the social element in part explains why someone like Ezra Klein has, to date, been a mouthpiece for claims about AGI. He relocated to San Francisco years ago (though he is now in New York), and I am aware that he runs in the same social circles as many of the figures promoting AGI (I&#8217;m, at best, like 3 degrees of Kevin Bacon socially removed from him, but people talk). 
Klein&#8217;s clearly interested in AGI because his social circle is interested in it, and if you want a shot at getting on his show, you had better be interested in it too. </p><p>But this dynamic also extends beyond the Bay Area crowd. For example, Eric Schmidt, former Google CEO and coauthor of a recent book on AI with Henry Kissinger, is probably perceived as a more serious person than, say, Sam Altman is, and <a href="https://www.techtransparencyproject.org/articles/eric-schmidt-obamas-chief-corporate-ally">is close with serious people like former President Obama</a>. Schmidt, <a href="https://www.bloomberg.com/billionaires/profiles/eric-e-schmidt/">who still owns around 1% of outstanding shares</a> in Google parent Alphabet, certainly has a financial incentive to promote AGI hype. If you&#8217;re a busy but serious person, and Eric Schmidt is telling you AGI is imminent, why not repeat what he told you instead of digging into it yourself?</p><h2>3. Those who spiritually yearn for AGI</h2><p>If you've been following AI for a while, you've likely noticed a striking overlap: the Venn diagram between Effective Altruists (EAs) and those making the most dramatic claims about AGI is nearly a circle. This isn't coincidental &#8211; for many in the EA movement, AGI represents nothing less than a technological messiah, or potentially a technological anti-Christ. It could go very, very wrong in their view, but if we get it right, it can usher in a new era of human flourishing.</p><p>If you aren&#8217;t familiar with it, the EA community operates on a morally dubious calculus in which hypothetical future lives trump real present suffering, on the justification that there will be many millions more lives in the future than there are today. 
They've convinced themselves that optimizing for "the most good" means prioritizing statistical fantasies of trillions of hypothetical people over addressing tangible harms happening right now, and they believe they can quantify human flourishing down to utilitarian decimals. Many <a href="https://www.politico.com/news/2023/10/13/open-philanthropy-funding-ai-policy-00121362">AGI groups&#8217; funding networks trace back to EA-aligned tech billionaires</a> like <a href="https://www.forbes.com/sites/emilywashburn/2023/03/08/what-to-know-about-effective-altruism-championed-by-musk-bankman-fried-and-silicon-valley-giants/">Dustin Moskovitz</a> and the now-disgraced <a href="https://time.com/6262810/sam-bankman-fried-effective-altruism-alameda-ftx/">Sam Bankman-Fried</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4UP4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4UP4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4UP4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4UP4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!4UP4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4UP4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg" width="1024" height="768" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:768,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!4UP4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 424w, https://substackcdn.com/image/fetch/$s_!4UP4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 848w, https://substackcdn.com/image/fetch/$s_!4UP4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!4UP4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F31e03a48-dcce-4cb6-b3e7-e6300196386c_1024x768.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">(In)famous Effective Altruist Sam Bankman-Fried, Image credit: Jeenah Moon/Bloomberg via Getty Images</figcaption></figure></div><p>Why AGI? Because it provides the ultimate escape hatch from messy, present-day moral obligations. 
If you believe you're working on an intelligence that will either solve all problems or end humanity, suddenly climate change, inequality, and other current crises become mere footnotes. You're not just avoiding difficult present-day ethical challenges; you're reframing your moral abdication as virtue. The Excel-spreadsheet approach to ethics &#8211; where "doing good" becomes a maximization problem rather than a human obligation &#8211; finds its perfect expression in AGI discourse.</p><p><a href="https://futurism.com/the-byte/ai-destroy-humankind-yudkowsky">Eliezer Yudkowsky</a> might seem to have a different, oppositional perspective, but it serves the same psychological function. His followers find profound purpose in believing they're working to prevent the apocalypse, a calling that borders on religious devotion. We've seen this pattern before with apocalyptic cults: the threat of imminent doom provides meaning and community cohesion. It's harder to get exercised about addressing present-day algorithmic harms when you believe superintelligence will either solve everything or kill us all within a decade.</p><p>The rational veneer of AI discourse often masks what is, at its core, a religious movement complete with its own prophets and sacred texts (usually internet newsletters or manifestos like <a href="https://blog.ai-futures.org/p/our-first-project-ai-2027">Scott Alexander&#8217;s and Daniel Kokotajlo&#8217;s recent AI 2027 document</a>).</p><p>When someone tells you AGI is three years away, it's worth considering whether the similarity between these claims and those of the Heaven's Gate cult in the 1990s, or of those who believed that Mayan codices predicted the end of the world in 2012, is merely superficial. 
In each case, we see the same pattern: a technological or cosmic event of world-changing significance, always just over the horizon; a self-selecting group of enlightened believers who alone grasp the true nature of what's coming; and a conviction that preparing for this imminent transformation is the most important work anyone could be doing. The only real difference is that our modern technological prophets often have venture capitalists backing them, and perhaps more importantly they don&#8217;t seem willing to die for their beliefs &#8212; so far, anyway.</p><h2>4. Those for whom AGI is useful politically</h2><p>AGI narratives serve a dual purpose in political discourse: they simultaneously deflect attention from present-day harms while setting the stage for regulatory frameworks that favor established players. Tech leaders like Sam Altman strategically pivot conversations about data piracy or deepfake pornography toward speculative futures where superintelligent systems solve all our problems &#8211; effectively distracting from accountability for current technology's social costs. Meanwhile, ostensibly well-intentioned AI safety groups advocate for regulatory regimes that would create complex compliance frameworks that box out smaller players, without addressing the underlying lack of accountability for bigger tech companies in the system. The political utility of AGI talk thus spans from outright distraction to sophisticated maneuvering for market advantage, all while wrapping these moves in either techno-utopianism or safety concerns.</p><p>But for politicians themselves, the AGI narrative serves distinct political purposes across the ideological spectrum, becoming a convenient vehicle for radically different visions of social reorganization.</p><p>Followers of Dark Enlightenment figures like Curtis Yarvin see AGI as a handmaiden to a form of desirable (for him) <a href="https://en.wikipedia.org/wiki/Dark_Enlightenment">corporate authoritarianism</a>. 
For this crowd, AGI represents the perfect instrument for their "CEO of America&#8221; fantasies, a hyper-intelligent advisor to the enlightened dictator who will finally impose order on our messy democracy. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!WqYS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!WqYS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 848w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!WqYS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg" width="400" height="516" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:516,&quot;width&quot;:400,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!WqYS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 848w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!WqYS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a4e1cff-5dca-49be-a728-fbb3df357753_400x516.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Dark Enlightenment figure Curtis Yarvin; image source: wikipedia</figcaption></figure></div><p>This type of thinking has filtered into mainstream conservative tech circles. <a href="https://www.politico.com/news/magazine/2025/01/30/curtis-yarvins-ideas-00201552">J.D. Vance</a>, Peter Thiel's prot&#233;g&#233;, <a href="https://newrepublic.com/article/183971/jd-vance-weird-terrifying-techno-authoritarian-ideas">has embraced aspects of this techno-authoritarianism</a>. The appeal is straightforward: if democracy is inefficient and prone to capture by the "wrong" kinds of people, why not replace it with a hyper-rational machine-guided system? AGI becomes the ultimate justification for consolidating power in the hands of a technical elite who alone understand how to "align" this new god.</p><p>But the political left isn't immune to AGI's possibilities either. 
Center-left commentators like Ezra Klein increasingly frame AGI as a pathway to something resembling &#8220;<a href="https://en.wikipedia.org/wiki/Fully_Automated_Luxury_Communism">fully automated luxury communism,</a>&#8221; but packaged in a form acceptable to the center-right because it relies on tech companies instead of workers seizing the means of production. Under this narrative, AGI promises to make production so abundant that traditional capitalism becomes obsolete by its own logic. (Look for an upcoming post about this topic and Klein&#8217;s book, <em>Abundance</em>.)</p><p>In the authoritarian vision, the details of whether the technology actually works are irrelevant so long as the democratic welfare state is gone; in the abundance vision, the details are always forthcoming. In both cases, AGI functions as a convenient <em>deus ex machina</em>, promising to resolve intractable political problems through technological transcendence rather than the messy work of democratic deliberation and compromise. It allows people to project their utopian visions onto a blank technological slate, whether that's the ordered hierarchy of the neoreactionaries or the post-scarcity abundance of the progressive techno-optimists.</p><h2>Wrapping up: an ecosystem of hype</h2><p>These four categories of AGI believers aren't isolated, but form an interconnected ecosystem where motivations reinforce each other. Given how frequently some names above pop up in different contexts, many influential voices on AGI sit at the intersection of multiple categories, creating powerful feedback loops that amplify claims regardless of their validity.</p><p>Consider a junior engineer at OpenAI who is both financially dependent on AGI narratives for career advancement (economic stake) and socially embedded in San Francisco tech circles where AGI talk is the intellectual currency (social investment). 
If they're also drawn to Effective Altruism's worldview (spiritual yearning), they've hit a perfect trifecta of reinforcing influences that makes talking them out of the belief that AGI is arriving soon nearly impossible. If they mostly hang out with other Effective Altruists who also work in AI, this deepens the entanglements above.</p><p>Even more potent are the interactions between categories. When a venture capitalist with billions at stake (economic) funds research that aligns with a politician's vision for techno-authoritarianism (political utility), while both move in the same elite social circles (social investment), we see how these incentives create an echo chamber divorced from technical reality. These elite tech social circles are <em>really small</em>. <a href="https://www.wired.com/story/book-excerpt-the-optimist-open-ai-sam-altman/">Wired has a piece out just yesterday on Eliezer Yudkowsky&#8217;s relationship with Peter Thiel, for example</a>, while two AGI apostles mentioned above, <a href="https://www.newyorker.com/culture/annals-of-inquiry/slate-star-codex-and-silicon-valleys-war-against-the-media">Scott Alexander and Curtis Yarvin, go way back</a> but have evidently had a recent falling out.</p><p>What's striking is not that people believe in AGI, but rather that when we filter out voices influenced by these four motivations, we find remarkably few credible experts still making bold AGI claims. Those without these incentives who work directly with AI systems tend to speak more cautiously about capabilities and timelines, focusing instead on present limitations.</p><p>This pattern suggests something crucial: the loudest AGI predictions come primarily from those with something to gain beyond mere technological progress. This doesn't automatically invalidate their claims, but it should sharpen our critical faculties when evaluating them.</p><p>In my opinion, the fundamental misunderstanding across these groups is a reductive view of intelligence itself. 
Human intelligence isn't just computational power plus data; it involves embodied experience, cultural context, and <a href="https://www.youtube.com/watch?v=zj0Ek32Xoec">neurosymbolic understanding</a> that current AI systems fundamentally lack. A short story (which <a href="https://gizmodo.com/openais-sam-altman-thinks-this-chatgpt-short-story-is-beautiful-but-its-just-trash-2000574939">Altman claims ChatGPT can write well</a>) isn't just pages of fiction, and a poem isn&#8217;t just verse; both are acts of meaning-making shaped by lived human experience.</p><p>The most advanced AI systems still cannot reliably distinguish fact from fiction at scale, lack true comprehension of their outputs, and operate without genuine understanding of the physical world. These aren't incremental challenges to overcome but fundamental limitations of current approaches, and to my knowledge no real technical solution exists for any of them.</p><p>By understanding the constellation of incentives behind AGI claims, we can better evaluate which voices to trust and which predictions deserve our skepticism. More importantly, we can refocus attention on addressing real, present-day challenges posed by existing AI systems rather than being distracted by speculative futures that conveniently serve the interests of their proponents.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>A critic might ask: who are you to question great men like Yoshua Bengio? For one, I&#8217;m not in fact critiquing the technical work, only commenting on his incentives. 
And second, luckily for me, Silicon Valley welcomes uncredentialed people, of which there are many in the AI field, so by the code of the tech elite I&#8217;m entitled to my own opinion here.</p></div></div>]]></content:encoded></item><item><title><![CDATA[The AI Moratorium Gambit]]></title><description><![CDATA[How tech companies buried a decade of immunity in the federal budget bill]]></description><link>https://www.tomorrowsmess.com/p/the-ai-moratorium-gambit</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/the-ai-moratorium-gambit</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Thu, 15 May 2025 17:30:34 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In &#8220;normal&#8221; political times, the federal budget negotiation process is one that, at best, happens in the background of most Americans&#8217; lives. This year, while the national news focuses on the White House, Congress&#8217;s budget negotiations have largely focused on Medicaid and taxes. Yet tech companies and <a href="https://x.com/ZacharyLeeLee/status/1921763779332718604">their front groups</a> have lobbied to have <a href="https://www.politico.com/newsletters/future-pulse/2025/05/13/a-decade-of-ai-rules-on-ice-00342827">a simple provision tucked into the budget bill</a> that would have staggering implications for how we protect children online &#8211; if it becomes law. </p><p>This isn't just about AI policy. 
It's about who gets to make the rules that protect our families in the rapidly evolving digital landscape, and whether tech companies can operate with near-total freedom from accountability for another decade.</p><h1><strong>What's Actually Happening?</strong></h1><p>This week, the House Energy and Commerce Committee published <a href="https://docs.house.gov/meetings/IF/IF00/20250513/118261/HMKP-119-IF00-20250513-SD003.pdf">proposed text for its part of a draft budget resolution</a>, which is on a fast-track to passage by the Republican-controlled House.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="6000" height="3375" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3375,&quot;width&quot;:6000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;white concrete building during 
daytime&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="white concrete building during daytime" title="white concrete building during daytime" srcset="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by Ian Hutchinson on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>The proposal seeks to block the enforcement of state AI laws during a decade-long moratorium: &#8220;No state or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.&#8221; </p><p>Such a moratorium would be significant, because not only is much of this legislation constitutionally and practically better suited to the state level (more on this below), but Congress shows no signs of acting to protect Americans from exploitation by tech companies or to make sure businesses deploying AI can thrive. As a result, state legislatures have considered hundreds of bills on AI over the last two years, covering everything from stopping health insurance companies from using AI to reject claims without human review to clarifying criminal penalties for creating and sharing nonconsensual deepfake images. </p><p>Simply put, this amendment is very broad. 
Under my read of the text, if a state wanted to introduce legislation barring state government employees from using an AI system developed by a foreign adversary in order to protect sensitive taxpayer information (think: banning employees of a state tax authority from putting taxpayer information into China&#8217;s DeepSeek), such legislation would be subject to the moratorium. </p><p>Proponents of the language argue that this moratorium is necessary to advance American competitiveness and to avoid the dreaded &#8220;patchwork&#8221; of laws that differ from state to state. (On the latter, think of aviation: it would not be safe or logical for every US state to have different laws on aviation safety, and so the Federal Aviation Administration ensures consistency across the United States.) Both of these are tried-and-true arguments made by the tech industry to oppose state laws on everything from data privacy to online sales tax collection.</p><h1><strong>The False Promise of Technology-Neutral Exceptions</strong></h1><p>The law does in theory exclude &#8220;technology neutral&#8221; state laws, meaning that if a law impacts AI but doesn&#8217;t treat it any differently from, say, pacemakers or parking meters, the moratorium won&#8217;t apply. Further, any state law that does <em>not</em> &#8220;impose any substantive design, performance, data-handling, documentation, civil liability, taxation, fee, or other requirement on artificial intelligence models&#8221; is exempt from the moratorium.</p><p>These exemptions, however, are a Trojan Horse we should be very wary of, for a few reasons.</p><h4>1. The exemption&#8217;s conditions will be nearly impossible for any legislation to satisfy </h4><p>Proponents of the language claim that if a state law doesn&#8217;t treat AI differently from other technologies, it&#8217;s not covered by the moratorium. 
Yet tech executives like Sam Altman and Mark Zuckerberg themselves have spent much of the last two and a half years telling us how AI is a different, game-changing technology that will impact every part of our lives. From a legal standpoint, AI is a different technology for a host of reasons: outputs are not transparent or explainable, decisions can be made independent of human input, and most applications are general purpose, just to name three. (A parking meter, for example, makes very clear how much you owe and when your time is up, can&#8217;t unilaterally decide to tow your car, and can&#8217;t be misused to create nonconsensual deepfake pornographic images.) Consequently, state statutes governing AI applications will have to explicitly mention AI and, by virtue of these unique characteristics, treat it differently from other technologies. Yet under this moratorium, any such state law would be unenforceable.&nbsp;</p><p>Getting any state legislation about technology &#8211; and AI in particular &#8211; passed into law is hard enough due to <a href="https://www.politico.com/news/2025/05/12/how-big-tech-is-pitting-washington-against-california-00336484">the opposition of the tech lobby</a> and the many checks and balances in our system. To comply with this moratorium, a state seeking to regulate AI applications in schools, for example, would have to pass a law broad enough to cover all technology in classrooms. That may not be a bad idea in theory. But a broad law will have an impact on more stakeholders, who should then be involved in the process, which increases the odds that nothing gets done.</p><p>Legislative drafting is very difficult, and legislative drafting about fast-moving technology is even more difficult. I&#8217;ve drafted text that has become law in dozens of states, and even many lawyers struggle with the task. 
One rule of thumb is that the more areas of policy a law touches on, the more complex the legislative text will necessarily be; you need to include definitions, for example, that are often difficult to settle on when they span multiple policy domains. Debate over how to even define &#8220;artificial intelligence&#8221; as a concept in law is subject to intense lobbying, and we can only expect that to intensify when the stakes are raised by the question of whether a new law would be subject to a moratorium.</p><p>Both of these facts would make AI legislation that complies with the moratorium nearly impossible to pass, particularly for states with legislative sessions of just a few months and little staff support. And that&#8217;s not to mention the chilling effect this federal moratorium language will have overall; many legislators at the state level will give up even trying to legislate on technology, for fear that their efforts will be wasted once the inevitable NetChoice legal challenge comes.</p><h4>2. It's not just about AI; it's about social media and privacy and deepfakes and more</h4><p>The legal definition of &#8220;artificial intelligence&#8221; will have another, even more troubling impact: NetChoice and other industry groups will undoubtedly use this provision to argue in court that social media laws enacted by states should be struck down, since social media platforms now widely use AI in their recommendation algorithms, and such algorithms themselves can be considered a &#8220;primitive&#8221; form of AI. So if a state wanted to protect teens from algorithms directing them toward self-harm content, eating disorder promotion, or addictive engagement patterns, this provision could prevent it from doing so.</p><p>Most states with data privacy laws include provisions that apply only to automated decision-making technologies. 
User data and human-generated content are increasingly valuable as these companies need them as raw material to train the next generation of AI, and in the absence of any meaningful progress toward a comprehensive federal standard, tech companies will certainly use this provision to block state progress on privacy as well.&nbsp;</p><p>And let&#8217;s imagine that, to combat the rampant piracy big tech companies commit in gathering content to train their AI models on &#8211; such as <a href="https://duckduckgo.com/?q=meta+book+piracy+ai&amp;ia=web">Meta pirating millions of books</a> without compensating the authors &#8211; a state passes a law strengthening the ability of copyright holders and individuals to receive fair compensation for the use of their art or likenesses. Such laws would also be potentially void. So if you like country music, for example, laws like Tennessee&#8217;s ELVIS (Ensuring Likeness Voice and Image Security) Act, which Governor Lee signed into law last year and which protects artists from having their voice or likeness appropriated by companies like Meta and OpenAI, could become unenforceable under this moratorium.</p><h4>3. States will lose control over their own critical areas of law</h4><p>State law rather than federal law traditionally regulates areas like education, housing, and healthcare. Yet under this moratorium, if a state were concerned about opaque AI systems being deployed in schools to track student behavior or make disciplinary decisions, it would be powerless to enforce any laws against companies.&nbsp;</p><p>It&#8217;s also already very difficult to sue a tech company for harms its products cause, and it will get harder if this takes effect. Most consumer product liability law is state law, but due to the unique nature of AI systems and how they are built, the law is currently unclear on apportioning responsibility for harms between developers of AI and deployers of AI. 
Legislation clarifying this has been introduced around the country but not yet passed, and this moratorium will make that impossible, potentially leaving parents of young people exploited by AI chatbots like CharacterAI without recourse. (<a href="https://apnews.com/article/amazon-sues-cpsc-product-recalls-cbc117a5cc2c322bded2b3b93736121e">Amazon also recently argued in court that the federal Consumer Product Safety Commission is unconstitutional</a>. If this argument prevails and combines with the impact of a moratorium on state legislation, the tech industry would stand alone among American industries in being above any sort of legal accountability for its actions.)</p><p>A moratorium on state AI legislation doesn't just threaten consumers; it imperils businesses too. If a business invests in an AI system that fails catastrophically, produces biased results, or leaks sensitive customer data, it might have no legal protection or avenue for recovery. The terms and conditions offered by tech companies to businesses that use their products are one-sided, leave customers little bargaining power, and place liability for the tech company's mistakes in the lap of smaller businesses ill-equipped to fight the richest companies in the history of the world in court. Large AI developers would thus be insulated from liability indefinitely, while the businesses that rely on their products would bear all the risk. This creates a profoundly unbalanced marketplace that favors big tech at the expense of entrepreneurs trying to responsibly incorporate AI into their operations.</p><h4>4. 
The bigger picture</h4><p>When you read this provision in the context of other recent developments, the scale of the aspirations of tech companies and the venture capitalists backing them becomes clearer, and more chilling.&nbsp;</p><p>Character.AI, one of the leading AI companion bot apps on the market, is fighting for the dismissal of a wrongful death and product liability lawsuit concerning the death of 14-year-old Sewell Setzer III. In a recent hearing, Character.AI &#8211; a Google subsidiary in all but name &#8211; argued that the text and voice outputs of its chatbots, including those that manipulated and harmed Sewell, <a href="https://mashable.com/article/chatbots-lawsuit-free-speech">constitute protected speech under the First Amendment</a>. This builds on other <a href="https://slate.com/news-and-politics/2023/11/first-amendment-kids-privacy-online.html">bad-faith weaponizations of the First Amendment that tech companies use</a> to escape accountability and undermine kids' online safety legislation, like the argument that &#8220;code is speech&#8221; and that, as a result, tech companies have a right to code whatever they want.&nbsp;</p><p>At the same time, Anthropic &#8211; an AI company funded by Amazon &#8211; <a href="https://www.techpolicy.press/machines-cannot-feel-or-think-but-humans-can-and-ought-to/">is studying "model welfare"</a> to determine if AI systems are conscious and deserve moral status. This framing subtly shifts the public conversation from "how do we regulate these systems to protect humans?" to "how do we protect AI systems themselves?" Given the financial, legal, and lobbying resources of these companies, it is safe to assume we are approaching a world where, for all practical purposes, the Amazon Alexa on your kitchen counter gets more legal protections from courts than children do.</p><p>Tech companies and makers of AI are making no secret of wanting AI to be incorporated into every aspect of our lives. 
Already, there is little of our daily economic, educational, or social activity that we can conduct <em>without</em> relying on something built by Microsoft, Amazon, Meta, or Google. Chromebooks, Microsoft Word, Google Maps, YouTube, and more &#8211; whether you like it or not &#8211; are indispensable for students, teachers, law firms, small businesses, nurses, and journalists to access the basic functions of modern life. Now, all of these products &#8211; again, whether you like it or not &#8211; have AI integrated into them, from Google Gemini to Microsoft Copilot to ChatGPT. </p><h4><strong>Tech&#8217;s special treatment is a privilege they&#8217;ve lost and must earn back</strong></h4><p>Our catastrophic societal experiment with social media, smartphones, and children has shown us what happens when we wait to act on new technologies. That waiting has been in part self-imposed: laws like Section 230 (<a href="https://thehill.com/newsletters/technology/5229136-section-230-fight-revived/">which a bipartisan consensus claims to want to change</a>) have for three decades now given tech companies a pass on rules that the rest of us have to live by, while these companies amassed unimaginable riches and power in the process. It should thus be easy for members of Congress on both sides of the aisle to say that these same companies should not benefit from that same privilege again &#8211; or perhaps that they should earn that privilege back first.</p><p>&#8220;<a href="https://www.tomorrowsmess.com/p/innovation-and-myth">But what about innovation?</a>&#8221; This is the siren call of the tech lobby that lawmakers are often unable to resist. Yet this talking point assumes that the interests of the biggest American tech companies are synonymous with innovation that's good for Americans, and this assumption needs serious questioning. 
Take, for example, Meta&#8217;s willingness to sell out democracy activists in Hong Kong by giving their data to the Chinese Communist Party (potentially including data on American citizens as well), <a href="https://duckduckgo.com/?q=sarah+wynn+williams+senate+hearing&amp;ia=web">the subject of a recent Senate hearing</a> with <em><a href="https://www.afterbabel.com/p/careless-people">Careless People </a></em><a href="https://www.afterbabel.com/p/careless-people">author Sarah Wynn-Williams</a>. Meta&#8217;s behavior when it comes to China &#8211; mirrored by other companies that desperately want access to the massive Chinese market &#8211; is clear evidence that these companies have consistently prioritized growth and engagement over user wellbeing and national security.&nbsp;</p><p>The tech giants behind this push for immunity have not demonstrated they can be trusted as good stewards of increasingly powerful technologies. Why would we invite them to repeat the same pattern with AI, a technology potentially far more transformative and disruptive than social media? We've already seen the consequences of giving tech companies a decades-long free pass to "move fast and break things." We shouldn't make that mistake again, especially when the consequences will be borne by the children these companies continue to exploit for profit. </p><h4><strong>What can you do?</strong></h4><p>This AI law moratorium provision belongs to a bigger bill Congressional Republicans are working to advance as part of a budget reconciliation package, which is not subject to the Senate filibuster, meaning it could pass without Democratic support in either chamber. That said, this provision is subject to something called the &#8220;Byrd rule,&#8221; which means that provisions in a reconciliation package must focus strictly on budgetary matters like federal spending. 
A single Senator from either party can raise an objection that could result in this provision being removed from the bill, subjecting it to a 60-vote requirement to overcome the filibuster and meaning it would then need bipartisan support to pass.</p><p>If you believe states should retain the right to protect their citizens from AI harms, especially when it comes to children's safety online, it's crucial to act now&#8212;particularly by contacting your Senators, regardless of party. Your voice could make the difference in whether your Senator decides to challenge this provision.</p><p>We also should not let off the hook the members of Congress who have advocated for and voted for this provision. Speaker Johnson and House Majority Leader Scalise should be hearing from their constituents now about this, <a href="https://www.yahoo.com/news/mike-johnsons-meta-problem-104504274.html">especially since they bear responsibility for the bipartisan KOSA bill not receiving a vote in the House</a>, as should House Energy and Commerce Chair Brett Guthrie (R-KY), who championed this measure, and the other committee members who voted for it. California Congressman Jay Obernolte, long a darling of the biggest tech companies (who have showered him with thousands in campaign contributions dating back nearly a decade to his tenure as a tech-friendly state assemblyman in California), and Colorado Governor Jared Polis also should receive opprobrium from their constituents for their public stumping for the moratorium. </p><p>These lawmakers need to hear from their constituents that, rather than handing over a decade of regulatory immunity to the tech industry, we should demand they demonstrate their commitment to child safety, data protection, and transparency first. The power to shape our future shouldn't be granted unconditionally to those who've repeatedly proven themselves incapable of self-governance.  
</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading tomorrow's mess! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, 
https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="6000" height="3375" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3375,&quot;width&quot;:6000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;white concrete building during daytime&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="white concrete building during daytime" title="white concrete building during daytime" srcset="https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, 
https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1603119761708-9252f043c139?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxjb25ncmVzc3xlbnwwfHx8fDE3NTEzOTk4ODR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"></figcaption></figure></div>]]></content:encoded></item><item><title><![CDATA[The Business of 
Vulnerability]]></title><description><![CDATA[A coda on Careless People]]></description><link>https://www.tomorrowsmess.com/p/montessori-toys-and-the-business</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/montessori-toys-and-the-business</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Thu, 10 Apr 2025 17:10:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/youtube/w_728,c_limit/u7tzjjjHJYY" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>The post below is a follow-on to <a href="https://www.afterbabel.com/p/careless-people">my post on </a></em><a href="https://www.afterbabel.com/p/careless-people">After Babel</a>, <em>in which I discuss how the book by Sarah Wynn-Williams reveals elements of the tech lobby playbook.</em></p><p>Yesterday, former Meta executive and <em>Careless People</em> author Sarah Wynn-Williams testified before a Senate Judiciary Subcommittee in a hearing titled &#8220;A Time for Truth.&#8221; Subcommittee Chair Josh Hawley ran an impressive, bipartisan hearing, and you could feel lawmakers&#8217; energy for refusing to let these outrages by Zuckerberg and Meta stand. 
</p><div id="youtube2-u7tzjjjHJYY" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;u7tzjjjHJYY&quot;,&quot;startTime&quot;:&quot;5110&quot;,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/u7tzjjjHJYY?start=5110&amp;rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Wynn-Williams was compelling, flush with earnestness and honesty throughout, and shaken when she was asked to recount how Meta&#8217;s legal teams have been threatening her over the publication of her book.<br><br>This is my job, so I&#8217;m happy to watch a hearing, and I will be the first to admit that most are far from gripping. But it was hard to watch this and not feel at first some measure of pride that policymaking in America can still work: the hearing felt much like a dramatic, pivotal scene in a 1990s-era legal film like <em>Erin Brockovich</em> or <em>A Few Good Men</em> or <em>A Time To Kill</em>, where the hard work was getting the truth out and a happy-enough resolution inevitably followed. <br><br>But then, of course, I remembered <a href="https://www.afterbabel.com/p/careless-people">what those who want legislative change are up against</a>. More to come on that below.</p><h2><strong>The Wooden Montessori Toys of Silicon Valley</strong></h2><p>In Wynn-Williams&#8217;s testimony was a revelation that should trouble every parent in America. "That was one of the things that shocked me when I moved to Silicon Valley," Wynn-Williams testified, "it's a place full of wooden Montessori toys. And executives would always speak about how they have screen bans in the house."</p><div id="youtube2-u7tzjjjHJYY" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;u7tzjjjHJYY&quot;,&quot;startTime&quot;:&quot;3241&quot;,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/u7tzjjjHJYY?start=3241&amp;rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Just as Meta executives deploy the machinery of a trillion-dollar corporation &#8212; what Senator Hawley called "one of the most powerful companies in the history of the world" &#8212; to detect and monetize human vulnerability, they simultaneously build walls of wealth and privilege to shield their own families from these same predatory systems. <br><br>One machine exploits; another protects. One for the public; another for them. The hypocrisy is breathtaking. When asked if these same executives allowed their teenagers to use Meta's products, Wynn-Williams recalled their consistent response: "My teenager is not allowed on Facebook. 
I don't have my teenager on Instagram." Let that sink in. The very people designing platforms that have become ubiquitous in the lives of American children refuse to let their own offspring near them.</p><p>It's a damning admission. These executives understand, better than anyone, the harm their products can inflict. As Wynn-Williams put it: "These executives, they know the harm that this product does. They don't allow their own teenagers to use the products that Meta develops."</p><p>The question "Would you let your own child use this?" has moral weight precisely because it cuts through corporate spin. When Meta executives create screen-free sanctuaries for their own children while designing products that deliberately exploit other people's kids, they're telling us everything we need to know about those products.</p><p>This protection of tech elites' children isn't limited to Meta. Across Silicon Valley, executives and engineers send their children to tech-free schools. They implement strict screen-time limits at home while designing products engineered to maximize engagement and addiction for everyone else's children. It's an open secret in the industry that those who understand these platforms best are the most restrictive with their own families.</p><p>The message is clear: there are digital products for the masses, and different standards for the children of the privileged who create them.</p><h2><strong>What Meta Knows That Users Don't: Targeting the Vulnerable is Profitable</strong></h2><p>Why would Meta executives ban their children from products they publicly defend? The hearing confirmed one disturbing answer that most of us already know: internal documents show Meta deliberately targets people at their most vulnerable moments.</p><p>Senator Hawley displayed an internal Facebook chat where a policy director questioned research into "young mothers and their emotional state." 
The response confirmed that yes, Facebook was tracking mothers' emotional states to help advertisers target them during moments of distress.</p><div id="youtube2-u7tzjjjHJYY" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;u7tzjjjHJYY&quot;,&quot;startTime&quot;:&quot;4804&quot;,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/u7tzjjjHJYY?start=4804&amp;rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>Even more disturbing was Wynn-Williams&#8217;s testimony about teen targeting: "Facebook was targeting 13 to 17 year olds. It could identify when they were feeling worthless or helpless or like a failure. And they would take that information and share it with advertisers.&#8221; Wynn-Williams also described how Meta tracked when teenage girls deleted selfies &#8211; a moment of potential insecurity &#8211; as "a really good time to try and sell her a beauty product." Senator Blackburn aptly called this "kicking kids when they're down.&#8221; <br><br>This raises the question that haunts our digital age: Who will protect America's children when the architects of these platforms only protect their own? If even Meta's own executives don't trust their products enough to let their children use them, why should any parent? And if the creators themselves implement "screen bans" and wooden toy sanctuaries, what does that tell us about the true nature of what they've built?</p><p>When asked why Meta would pursue such tactics given its trillion-dollar valuation, Wynn-Williams recalled a business leader's chilling response: "We've got the most valuable segment of the population. Advertisers really want to reach 13 to 17 year olds, and we have them. 
We should be trumpeting it from the rooftops."</p><p>The pattern is unmistakable. Meta has built a sophisticated machine to detect and monetize human vulnerability &#8211; identifying moments of insecurity, failure, and distress across demographics from teenagers to young mothers. They've engineered systems to exploit these moments of weakness for profit.</p><p>No wonder these executives keep their own children safely away. The contrast couldn't be starker. In Silicon Valley homes, children play with wooden Montessori toys under careful screen time limits, protected by parents who understand the digital machinery they've built. Meanwhile, across America, children's vulnerabilities are tracked, quantified, and monetized by that same machinery. This isn't just hypocrisy &#8212; it's a profound moral failure that demands an equally profound legislative response. </p><p>Moving from righteous indignation to meaningful change, however, requires overcoming the very obstacles Meta has so carefully constructed.</p><h2><strong>What Comes Next?</strong></h2><p>Senator Durbin drew a powerful historical parallel after Wynn-Williams mentioned how Meta execs&#8217; children play with &#8220;wooden Montessori toys.&#8221; Durbin related how, decades ago when he was fighting the tobacco industry, the children of tobacco executives became unexpected allies in the battle for regulation. These children, confronted with evidence of their parents' complicity in widespread harm, began questioning: "Dad, tell me that you aren't part of that tobacco conspiracy to keep the truth from the American people," as Durbin put it.</p><p>Mark Zuckerberg&#8217;s eldest child is 9 years old. 
As <a href="https://www.afterbabel.com/p/social-media-mental-illness-epidemic">kids&#8217; mental health has continued to plummet</a>, as misinformation &#8211; accelerated by generative AI &#8211; spreads online like kudzu, and with platforms like Meta serving as a handmaiden to authoritarianism, I&#8217;m not sure we can wait for that scenario. And as riveting as this hearing was, a similar hearing just over a year ago &#8211; <a href="https://thehill.com/policy/technology/4440060-zuckerberg-apologies-social-media-victims-meta-senate-hearing/">when Senator Hawley invited Zuckerberg to apologize to the parents in the room</a> &#8211; left the impression that we were on the cusp of action; yet, as I documented previously, nothing happened.<br><br>So, while Senator Durbin&#8217;s parallel is hopeful, it also exposes a systemic leadership vacuum. Getting a bill introduced and holding compelling hearings is simply not enough; this generational issue requires legislative leadership that rises above the standard playbook, and we need leaders willing to confront the machinery that protects the privileged while exploiting the public.<br><br>In the 1980s, against all odds, Senator Bill Bradley led an unlikely, bipartisan coalition in reforming America's tax code by refusing to accept the status quo, proving that one determined lawmaker can drive monumental change. Working both chambers of Congress with relentless persistence, Bradley built the coalition necessary to eliminate cherished tax loopholes while lowering overall rates&#8212;a feat policy experts had dismissed as politically impossible. Bradley's crusade demonstrates that true policy transformation requires not just good ideas and business-as-usual, but a champion willing to invest years of political capital in making the unthinkable a reality. <br><br>This was a compelling hearing, but it was only a start to what&#8217;s necessary to meet the moment, because challenges lie ahead. 
<br><br>To wit:</p><ol><li><p><em>Beware the bad-faith &#8220;Free Speech champions&#8221; lurking in the shadows.</em> The Kids Online Safety Act, as I detailed in my After Babel post, came a long way last year before stalling &#8211; as Senator Blackburn herself put it, &#8220;Meta spent millions of dollars lobbying against us and against the legislation&#8221; &#8211; and I&#8217;m hopeful it will come back. But already First Amendment maximalists &#8211; people who in essence argue that nearly all human activity is speech, and <a href="https://www.theatlantic.com/ideas/archive/2023/12/netchoice-v-bonta-california-case-social-media-children/676351/">any regulation that could conceivably relate to speech is unconstitutional</a> &#8211; unsurprisingly have issues with the legislation that could stop it. Of course, these folks don&#8217;t show up with suggestions on how to address their concerns; they just say &#8216;no.&#8217; Lawmakers need a plan for dealing with these people, and to my knowledge, they don&#8217;t have one. And even good-faith objections get lost in the mess and make this dynamic worse, often because the good-faith objectors also fail to offer compromise alternatives. 
Both Republicans and Democrats fight each other over censorship and the First Amendment &#8211; can lawmakers like Senators Hawley and Klobuchar, for example, set aside their differences enough to sort through this issue, protect the First Amendment, and get something meaningful through?</p></li><li><p><em>Tech oligarchs in positions of power defending their personal interests.</em> Men like Marc Andreessen &#8211; on the Meta board, a distressingly cynical and corrupt figure in Wynn-Williams&#8217;s book, <a href="https://a16z.com/announcement/investing-in-character-ai/">an investor</a> <a href="https://futurism.com/character-ai-pedophile-chatbots">in horrific apps like CharacterAI</a>, and a prominent donor and adviser to the Trump administration &#8211; will likely not accept any law passing that impacts his personal investments, and yet he wields <a href="https://www.politico.com/news/2024/12/10/tech-billionaires-washington-00193657">enormous influence that he is not afraid to use</a>. Are Republicans and centrist Democrats who have bought into the innovation myths he peddles prepared to resist him and his Silicon Valley friends?</p></li><li><p><em>The President as dealmaker and goalkeeper.</em> As both sides of the aisle should be able to agree, President Trump sees himself as a dealmaker, and he frequently surprises both sides with what he believes is a good deal. Are legislators thinking about whether there is something Meta and Zuckerberg could offer him that would block their shot? Are they ready to make the investment in convincing the President that the best deal for America is one that protects kids and neuters Meta&#8217;s power? 
This is existential for any legislative effort on technology, whether KOSA or otherwise, because Zuckerberg, Meta, and other tech titans have already shown they have no red lines in what they are prepared to offer governments to make money (see, e.g., the hearing&#8217;s discussion on Meta, Zuckerberg, and China).</p></li></ol><p>While watching this hearing, I felt that familiar rush of a pivotal courtroom scene from one of those 1990s legal dramas I mentioned at the beginning &#8211; the whistleblower testimony, the damning evidence, the righteous indignation of lawmakers. But unlike a John Grisham adaptation, resolution doesn&#8217;t arrive with the credits rolling after the dramatic scene. Instead, this compelling hearing is merely the opening scene. </p><div id="youtube2-2sLcfQKU_co" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;2sLcfQKU_co&quot;,&quot;startTime&quot;:&quot;274&quot;,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/2sLcfQKU_co?start=274&amp;rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>The harder, less cinematic work comes next. Now we need our Bill Bradley &#8211; someone willing to wage the unglamorous, months-long campaign through committee rooms and backroom negotiations, someone who can withstand tech&#8217;s lobbying blitz and the inevitable distractions. Senator Hawley's promised bipartisan investigation is encouraging, but without that Bradley-like figure to carry the fight beyond the dramatic hearings and through to actual legislation, we're left with captivating political theater that changes nothing. The question isn't who can deliver the powerful testimony or ask the incisive questions &#8212; we've seen that on display. 
The question is: who has the stamina, strategic mind, and conviction to see this through when the cameras are off and the hard work begins? </p>]]></content:encoded></item><item><title><![CDATA[Innovation and Myth]]></title><description><![CDATA[How America&#8217;s tech oligarchs are all hat and no cattle]]></description><link>https://www.tomorrowsmess.com/p/innovation-and-myth</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/innovation-and-myth</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Mon, 24 Mar 2025 19:14:50 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When Chinese company DeepSeek released its R1 large language model (LLM) in January &#8211; which performs similarly to American generative AI systems at a fraction of the cost &#8211; venture capitalist Marc Andreessen called it &#8220;<a href="https://fortune.com/2025/01/27/marc-andreessen-deepseek-sputnik-ai-markets/">AI&#8217;s Sputnik moment</a>,&#8221; implying that the United
States is getting left behind in the AI innovation race.&nbsp;</p><p>Andreessen has spent most of the last year <a href="https://techcrunch.com/2024/12/14/why-marc-andreessen-was-very-scared-after-meeting-with-the-biden-administration-about-ai/">railing against the Biden administration&#8217;s approach to AI regulation</a> and <a href="https://www.businessinsider.com/donald-trump-vcs-in-silicon-valley-2024-7?op=1">antitrust enforcement</a>, and he claims that he and his venture capital firm, Andreessen Horowitz, are only involved in politics to stand up for &#8220;Little Tech.&#8221; Little Tech, <a href="https://a16z.com/the-little-tech-agenda/">according to Andreessen</a>, drives innovation, competition, and growth, and he attributes America&#8217;s military and technological dominance to advances made by innovative start-ups. Of course, his firm has invested heavily in AI start-ups, so he has a financial interest in promoting this narrative. With the federal government under Trump and Biden scrutinizing mergers in the tech sector more closely than it did under Obama, it&#8217;s more difficult to cash out those investments while valuations are sky high.&nbsp; </p><p>And yet, if you scrutinize Andreessen&#8217;s AI investments more closely, it&#8217;s not at all clear, even if DeepSeek was a Sputnik moment, that Andreessen holds the recipe for policy that will continue America&#8217;s technological dominance. For example, Andreessen Horowitz <a href="https://www.reuters.com/technology/ai-chatbot-characterai-with-no-revenue-raises-150-mln-led-by-andreessen-horowitz-2023-03-23/">invested $150 million in May 2023 in CharacterAI</a>, a start-up founded by former engineers from Google. At the time, CharacterAI had no revenue, and this investment valued the company at a billion dollars. Yet is CharacterAI doing any of the things that Andreessen and his fellow tech oligarchs claim their tech products will eventually do if they are left alone by government, like curing cancer, solving climate change, making American workers more productive and American businesses more competitive, raising quality of life, or ensuring America remains technologically and militarily dominant on the global stage? No, CharacterAI hosts chatbots that <a href="https://futurism.com/character-ai-pedophile-chatbots">use pedophilic grooming techniques</a> with underage users and encourage them to <a href="https://www.cbsnews.com/news/florida-mother-lawsuit-character-ai-sons-death/">harm themselves</a> and to <a href="https://www.cnn.com/2024/12/10/tech/character-ai-second-youth-safety-lawsuit/index.html">kill their parents</a>. 
Even <a href="https://a16z.com/announcement/investing-in-character-ai/">Andreessen Horowitz&#8217;s own business case for its investment is damning</a> in itself if you understand the implications (emphasis added): </p><blockquote><p>Character.AI has trained their proprietary LLM from scratch, enabling their product to optimize for not only raw intelligence, but also a conversational empathy that <em><strong>captures and holds the consumer&#8217;s attention</strong></em> with humor, emotion, insight, and more. As our partners have written about extensively, we believe that end-to-end apps such as Character.AI, which both control their models and own the end customer relationship, have a tremendous opportunity to generate market value in the emerging AI value stack. <em><strong>In a world where data is limited, companies that can create a magical data feedback loop </strong></em>by connecting <em><strong>user engagement</strong></em> back into their underlying model <em><strong>to continuously improve their product</strong></em> will be among the biggest winners that emerge from this ecosystem.</p></blockquote><p>In other words, CharacterAI is worth billions without revenue because it uses engagement techniques borrowed from social media to maximize user time spent on the product, which generates high-quality data that can be used to train the next generation of its product or licensed to other AI companies for training. 
Andreessen Horowitz <a href="https://a16z.com/announcement/investing-in-character-ai/">brags in its CharacterAI investment announcement</a> that users spend on average two hours with the product every day; it hid the fact that those users are overwhelmingly children, and that <a href="https://futurism.com/character-ai-google-test-ai-chatbots-kids">the Google engineers who helped develop the model warned of its dangers</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!72Pa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!72Pa!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 424w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 848w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 1272w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!72Pa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic" width="960" height="403" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:403,&quot;width&quot;:960,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:18818,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/159769106?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!72Pa!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 424w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 848w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 1272w, https://substackcdn.com/image/fetch/$s_!72Pa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0ee32550-c12e-4ab4-8122-a6dd5e017bd6_960x403.heic 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Marc Andreessen&#8217;s vision of innovation: CharacterAI chatbots that target underage users to harvest their data</figcaption></figure></div><p>Is this innovation? I think not; this is just the social media business model repackaged for the AI age &#8211; a model that, even though it has made Andreessen a very rich man (he is on the board of Meta), has arguably failed America and its youth. </p><p>That brings us to today's topic: innovation as myth. 
Republicans and Democrats don&#8217;t agree on much these days, but you&#8217;d be hard-pressed to find any political figure in either party who opposes the idea that we should be encouraging innovation. But what does that mean?&nbsp;</p><p>In my previous post, I introduced <a href="https://www.tomorrowsmess.com/p/odd-bedfellows">the core beliefs that define the ideology of the tech elite </a>and described how the values and blind spots of today's tech titans shape our collective future. Today, I will unpack how Silicon Valley and elite tech culture have transformed our understanding of innovation itself, creating a powerful mythology that obscures the question of who truly benefits from this vision of progress.</p><p>The stakes are about more than technology; this is about questioning the stories told by those who profit most from self-spun myths about entrepreneurism and innovation, and about shaping a future where strategic, intentional innovation makes Americans stronger and more prosperous.</p><div><hr></div><p>The tech world's most celebrated figures have masterfully woven personal mythologies that transform them from shrewd businessmen into cultural archetypes of innovation. Elon Musk is an exemplar, casting himself as a modern Thomas Edison despite primarily acquiring rather than inventing his most successful ventures. Tesla's astronomical market valuation &#8212; consistently defying traditional metrics when compared to established automakers like Ford or Toyota &#8212; derives not from its manufacturing efficiency or long-term profitability, but from the premium investors place on Musk's carefully cultivated persona as a technological visionary. </p><p>The "<a href="https://fortune.com/2025/03/11/musk-premium-influences-tesla-worth-cfo/">Musk premium</a>" represents billions in market capitalization built not on tangible innovation but on narrative, which is a testament to how effectively he's merged his personal brand with the concept of innovation itself. 
Musk is not alone: Bill Gates, Mark Zuckerberg, and Marc Andreessen have all created carefully managed and maintained mythologies for themselves, mythologies that cede to these men the right to define not only what innovation is good for America, but what counts as innovation in the first place.</p><p>What's most remarkable about these innovation myths is their extraordinary return on investment. For the price of bombastic tweets, contrarian blog posts, and carefully managed media appearances, these tech figures have secured cultural capital that translates directly into financial might. When Musk tweets about Mars colonization, Tesla shares climb despite the absence of any actual connection to the company's automobile business. </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lFGp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lFGp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 424w, https://substackcdn.com/image/fetch/$s_!lFGp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 848w, https://substackcdn.com/image/fetch/$s_!lFGp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!lFGp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lFGp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic" width="588" height="482" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:482,&quot;width&quot;:588,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:27708,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/159769106?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!lFGp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 424w, https://substackcdn.com/image/fetch/$s_!lFGp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 848w, 
https://substackcdn.com/image/fetch/$s_!lFGp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 1272w, https://substackcdn.com/image/fetch/$s_!lFGp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff3e67e6c-faed-46d8-91a7-52acb54a6ca4_588x482.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Trading in Tesla stock increased in the days following this tweet, and slowed a decline in the share
price.</figcaption></figure></div><p>This phenomenon creates a self-reinforcing cycle where tech oligarchs&#8217; pronouncements shape markets, which in turn validates their status as visionaries, all without requiring the messy, time-consuming work of producing genuine technological breakthroughs that improve human capability. The myth of innovation has, ironically, become more valuable than innovation itself.</p><p>Yet innovation is not just about creating new products or technologies to make a buck; it is about transforming ideas into tangible improvements that meet the nation&#8217;s needs and aspirations. True innovation should solve real problems, enhance human capabilities, and foster economic progress. This definition implies a broader and more meaningful scope than the often narrow, short-term, profit-driven interpretations of innovation prevalent today.</p><p>This narrow vision of innovation emerged roughly contemporaneously with tech culture itself, serving both as a goal in its own right and as a means of rationalization and justification. But it wasn&#8217;t always this way; if you set aside the mythology of Silicon Valley and compare the past half-century to the half-century that preceded it, the United States has seen a significant slowdown in innovation. </p><div><hr></div><p>The grand dreams of the space age, when humanity&#8217;s future seemed boundless, have largely fizzled out into repackaged older inventions. The speed record for human travel &#8211; 24,790 miles per hour, set by Apollo 10&#8217;s reentry into Earth&#8217;s atmosphere in 1969 &#8211; remains unbroken; the speed of commercial airliners has actually decreased. U.S. forces withdrew from Kabul the same year that the AK-47 &#8211; favored weapon of the mujahideen &#8211; had its 73rd birthday. Even the drones changing the nature of the battlefield in Ukraine are not much more than model airplanes with radios and bombs strapped to them. The rovers sent by the U.S. to Mars and, more recently, by China to the far side of the moon (which, by the way, the U.S. has not done) are not very far ahead of technology that was invented around the time that Dylan went electric, and the rockets that launched them are not much more than incremental improvements over the design of the Nazi V-2 rockets that terrorized London beginning in 1944.</p><p>Neal Stephenson, science fiction writer and former advisor to Jeff Bezos&#8217;s rocket company, Blue Origin, wrote in a 2011 essay on American technological stagnation that modern American corporate cultures stifle truly innovative ideas. He described engineers brainstorming new concepts only to be shut down by someone who quickly discovers a vaguely similar previous attempt that either failed (and thus is not worthy of further exploration) or succeeded (meaning that first-mover advantage is lost). Managers, fearing job loss, avoid risky projects, preferring the safety of incremental improvements.&nbsp;</p><p>According to Stephenson:</p><blockquote><p>There is no such thing as &#8220;long run&#8221; in industries driven by the next quarterly report. 
The possibility of some innovation making money is just that &#8211; a mere possibility that will not have time to materialize before the subpoenas from minority shareholder lawsuits begin to roll in.</p></blockquote><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!T3jX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!T3jX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 424w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 848w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 1272w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!T3jX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic" width="800" height="500" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/def8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:500,&quot;width&quot;:800,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:51389,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/159769106?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!T3jX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 424w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 848w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 1272w, https://substackcdn.com/image/fetch/$s_!T3jX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdef8626e-b4b6-4fd9-9bec-44e49002ba69_800x500.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">In the &#8216;80s, flying cars powered by fusion by 2015 seemed realistic; yet it&#8217;s 2025 and we still don&#8217;t even have the self-driving taxis Elon Musk has promised for a decade.</figcaption></figure></div><p>This aversion to long-term investment in groundbreaking technology stems in part from changes in the tax code during the 1970s and 1980s, which produced weaker incentives for the private sector to invest in long-term research and development compared to previous decades. But it also stems from a change in direction from the White House. After the U.S. 
had vanquished the Soviet Union in the Space Race, the Nixon administration, buffeted by unrest domestically over civil rights, Vietnam, and stagflation, pivoted to a less expensive way of winning the Cold War: bringing the USSR to its knees economically by forcing it to compete on consumer technology development, enhancing the quality of life for American &#8211; and European &#8211; citizens.&nbsp;</p><p>This approach to technology development has persisted to the present day. The most successful tech businesses since the late-1990s dot-com bust have two things in common: (1) they make their money from advertising or facilitating the sale of consumer products; (2) their innovations have more to do with regulatory arbitrage and profit optimization than producing groundbreaking technological advancements. </p><p>Consider Amazon, whose rise from an online bookstore to a global retail colossus is often framed as a tale of relentless innovation. Yet for most of its history Amazon was simply a mail-order catalog that happened to be on the internet, with Jeff Bezos's real innovation being the evasion of sales tax to undercut traditional retailers. 
Even Amazon Prime&#8217;s lure of next-day delivery is not a marvel of modern science but a feat of ruthless operational efficiency, including once having been based on <a href="https://www.wired.com/story/meet-camperforce-amazons-nomadic-retiree-army/">senior citizens living out of RVs working low wage jobs in warehouses</a>, and still today on <a href="https://www.newyorker.com/books/under-review/seasonal-associate-is-a-labor-memoir-for-the-amazon-era">a Dickensian practice </a>of denying workers and delivery drivers bathroom breaks.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zoX-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zoX-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!zoX-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:335084,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/159769106?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zoX-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 424w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 848w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 1272w, https://substackcdn.com/image/fetch/$s_!zoX-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2283daed-d932-453b-93b3-956579d9cdc3_1920x1080.heic 
1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">The book and Oscar-winning film <em>Nomadland</em> is based on a real program Amazon ran with senior citizens doing hard labor in its warehouses while living in RVs in the parking lot.</figcaption></figure></div><p>Facebook &#8211; now Meta &#8211; started as a digital yearbook for college students (most definitely not a new invention), and many of the features it hawks as &#8220;making the world more connected&#8221; are simply a front for gathering data and monetizing attention. 
And while Google&#8217;s search algorithm was truly innovative at the time it was released, the company&#8217;s imperative to sell ads has led it to prioritize paid content over organic discovery, diminishing the product&#8217;s utility, such that users now share tricks on blogs and Reddit for accessing stripped-down versions of Google search where results aren&#8217;t polluted by sponsored content and dubious AI-generated summaries. </p><p>At least Facebook and Google figured out how to monetize their products, unlike WeWork &#8211; a real estate company masquerading as an innovative tech firm, complete with an app and free kombucha in the kitchen &#8211; or Theranos, which pushed an illusion of innovation to criminal extremes without ever delivering a viable product. Yet both Elizabeth Holmes of Theranos and WeWork&#8217;s Adam Neumann were lauded as innovative sages without much scrutiny. Holmes is now serving a prison sentence.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!oBEc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!oBEc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 424w, https://substackcdn.com/image/fetch/$s_!oBEc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 848w, 
https://substackcdn.com/image/fetch/$s_!oBEc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 1272w, https://substackcdn.com/image/fetch/$s_!oBEc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!oBEc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic" width="1200" height="1575" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1575,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:151032,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/159769106?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!oBEc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 424w, 
https://substackcdn.com/image/fetch/$s_!oBEc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 848w, https://substackcdn.com/image/fetch/$s_!oBEc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 1272w, https://substackcdn.com/image/fetch/$s_!oBEc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1b4161b-05bf-476b-b01b-4c36b1954294_1200x1575.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">&#8220;I don't want to make an incremental change in some technology in my life. I want to create a whole new technology . . . .&#8221; Elizabeth Holmes on innovation in Forbes Magazine, 2014.</figcaption></figure></div><p>As the novelty of these companies&#8217; first-wave products waned, they began shipping repackaged versions of what they had already produced. The regular drip-drip-drip of incremental improvements in phones, tablets, and e-readers is covered fawningly by an uncritical industry press, while most of us have had the frustrating experience of buying a new device only to need new cords and chargers, or of having to pay continuing subscriptions for &#8220;digital services&#8221; that used to be one-time software purchases. </p><p>Even these companies&#8217; much-lauded &#8220;moonshots&#8221; &#8211; projects like internet by balloon, drone delivery, or self-driving cars &#8211; have mostly failed, are close to failure, or have been delayed repeatedly, never leaving &#8216;pilot&#8217; status, revealing themselves to be more performative stunts than practical ventures. And even then, none of these were really new inventions or pushed the boundaries of what&#8217;s possible; they only pushed the boundaries of how a company could spend money. There&#8217;s a big difference between taking a performative shot at the moon and actually landing on it and coming home.&nbsp;</p><p>As the world enters the AI age, the dominant tech players&#8217; investment and innovation strategies have not changed. The landmark 2017 paper, &#8220;<a href="https://arxiv.org/abs/1706.03762">Attention is all you need</a>,&#8221; which sparked the current boom in generative LLMs, was the product of major technology companies&#8217; material interest in solving specific problems with natural language processing in machine learning. 
(The paper was published by eight scientists at Google, including Noam Shazeer, who went on to found CharacterAI after safety concerns were raised internally about his product.) Natural language processing is a cornerstone of human interaction with the digital products these companies offer; it improves the quality of the search and targeted advertising at the core of their revenue streams; and it is crucial to things like digital voice assistants, which were a fixation of the market in the mid-2010s (Siri was released after an Apple acquisition in 2011, Amazon&#8217;s Alexa in 2014, Google Assistant in 2016). So major technology companies dedicated significant resources to acquiring talent for research that seemed abstractly academic on the surface but was targeted at these companies&#8217; narrow business interests at the time.</p><p>What was certainly <em>not</em> in Google&#8217;s, Amazon&#8217;s, Meta&#8217;s, or Microsoft&#8217;s business interest during the lead-up to &#8220;Attention is all you need,&#8221; or even in the years since, is building a technology that can assist the United States in winning a great power competition with geopolitical adversaries, curing cancer, solving climate change, or any of the other convenient reasons tech executives cite today for why they can&#8217;t abide a more competitive market or be subject to accountability.&nbsp;I will go deeper into the business model of LLMs in a future post, but it&#8217;s important to note for today that a common issue with most LLM applications is that they are all solutions in search of a problem. Chatbot apps like ChatGPT or CharacterAI are byproducts of these companies&#8217; efforts to entrench their businesses and improve profits on their existing products, rather than purpose-built tools to improve lives or make the U.S. more competitive internationally. 
</p><div><hr></div><p>In July 1959, the United States and the Soviet Union clashed over cutting-edge technology and ideology. Yet this flashpoint was not a showdown over tanks, missiles, or spy planes in Germany or Korea; it took place in a model kitchen in Moscow. Vice President Richard Nixon traveled to Moscow for the opening of a U.S. exhibit, a showcase of consumer goods from over 450 American companies. Nixon and Soviet Premier Nikita Khrushchev&#8217;s spirited debates throughout the visit highlighted their competing visions of prosperity and technological advancement. This scene, known famously as &#8220;The Kitchen Debate,&#8221; unfolded inside a $14,000 American home, replete with modern appliances like a dishwasher, refrigerator, and stove &#8211; a testament to the average American worker's attainable lifestyle.</p><div id="youtube2--CvQOuNecy4" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;-CvQOuNecy4&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/-CvQOuNecy4?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>I showed a clip of the Nixon-Khrushchev debate to a class of graduate students that I teach, and the entire class erupted in laughter at this line of Nixon&#8217;s:</p><blockquote><p>There are some instances where you may be ahead of us. For example, in the development of the thrust of your rockets for the investigation of outer space. 
There may be some instances, for example, color television, where we're ahead of you.</p></blockquote><p>We know with the benefit of hindsight that it&#8217;s laughable to equate reaching for the stars with reaching for the remote, just as it should be ludicrous to consider using AI to cure cancer and using AI to create pedophilic chatbots equivalent pursuits. And yet not even Nixon would have argued that incentivizing the invention of better consumer products like color television would get us to spaceflight. Yet that&#8217;s exactly the argument Marc Andreessen and Sam Altman are making today, one that policymakers are largely accepting unquestioned, all because of their reputations as men of innovation.</p><p>The argument that &#8220;we can&#8217;t slow down innovation&#8221; is a convenient way to sidestep the more critical questions of &#8220;What should we be innovating, how, and why? Who is best incentivized to innovate? How do we most efficiently generate the innovations America needs?&#8221; The answer, when it comes to ensuring America maintains its global lead, may not actually be the market; that&#8217;s not what markets are for.&nbsp;</p><p>As the U.S. stands at a crossroads similar to that of the 1950s and 1960s, when it had to choose between focusing only on market-driven consumer technology or also investing in state-driven scientific research, it must now decide how to navigate the new landscape of AI and technology in the face of a rising China. We can have both, but the latter requires hard work and not buying every argument about innovation that self-interested tech companies are selling. 
The decision will determine not just the economic future of the nation but its position on the global stage and its ability to address the profound challenges of our time.</p><p>Selling ads, tax avoidance, regulatory arbitrage, planned obsolescence, and hoping consumers forget to cancel subscriptions, while certainly legal, often wildly profitable, and perhaps laudable as innovative <em>revenue strategies</em>, are decidedly unlikely to produce <em>innovations</em>. In defining and pursuing true innovation, policymakers must look beyond the immediate profits and conveniences offered by today&#8217;s tech oligarchs.</p><p>Only by asking the right questions and setting clear, purposeful goals can the United States ensure that its place in the world through the 21st century surpasses that of its post-war ascendancy, when we not only raised the standard of living throughout the entire Western world, but also split the atom, walked on the moon, and began looking to the stars. </p>]]></content:encoded></item><item><title><![CDATA[Odd Bedfellows]]></title><description><![CDATA[Circuit boards, free love, and the origin of an ideology]]></description><link>https://www.tomorrowsmess.com/p/odd-bedfellows</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/odd-bedfellows</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Fri, 14 Mar 2025 19:38:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!jjaS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24e0aa58-0fa4-46e5-a626-c4ddea20b5f0_900x606.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the summer of 1967, thousands of young people descended on San Francisco's Haight-Ashbury district for what would become known to history as the "Summer of Love." They were rejecting consumerism, questioning authority, and imagining alternative ways of organizing society. At the same time, just forty miles south, in what would soon be called Silicon Valley, engineers and entrepreneurs were laying the groundwork for a technological revolution.</p><p>These two movements&#8212;seemingly opposed in their values and visions&#8212;would eventually converge in a way that nobody could have predicted. 
Their unlikely fusion would create what Richard Barbrook and Andy Cameron later called "<a href="https://monoskop.org/images/d/dc/Barbrook_Richard_Cameron_Andy_1996_The_Californian_Ideology.pdf">The Californian Ideology</a>," a worldview that would evolve further, come to reshape global capitalism and geopolitics, and redefine how Western society thinks about technology and innovation itself.</p><p>Consider Stewart Brand. In 1968, Brand published the first Whole Earth Catalog, a countercultural bible that Steve Jobs would later describe as "Google in paperback form."<a href="https://www.newyorker.com/news/letter-from-silicon-valley/the-complicated-legacy-of-stewart-brands-whole-earth-catalog"> The catalog embodied the DIY, anti-establishment ethos of the time</a>, featuring tools for sustainable living and consciousness expansion. Brand's slogan&#8212;"We are as gods and might as well get good at it"&#8212;captured the audience&#8217;s techno-optimism and self-determination.</p><p>Fast forward to 1985, and Brand was helping to launch the WELL (Whole Earth 'Lectronic Link), one of the first influential online communities. The countercultural icon had found a new home in emerging technology. Brand wasn't alone in making this transition. Many former hippies who had rejected corporate America were founding technology companies, and Gen-Xers who had grown up on cautionary tales of selling out (think of Winona Ryder&#8217;s dilemma in "<a href="https://www.youtube.com/watch?v=xDYGo0UgIVM">Reality Bites</a>") were discovering that capitalism wasn't so bad if you could do it in jeans while talking about making the world a better place.</p><p>This isn't just coincidence. There's something psychologically compelling about this fusion of countercultural ideals and entrepreneurial ambition. It allows for a uniquely satisfying narrative: you can pursue wealth and power while still seeing yourself as a rebel fighting the system. You're not a corporate suit; you're a disruptor. 
You're not crassly seeking profit; you're building the future. </p><p>This narrative has proven irresistible<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> to tech elites like Marc Andreessen, who co-created the first widely-used web browser, and Peter Thiel and Elon Musk, who helped build PayPal. Each has positioned himself as an outsider battling entrenched interests, even as they've accumulated billions in personal wealth and unprecedented influence over our politics and our daily lives.</p><div class="image-gallery-embed" data-attrs="{&quot;gallery&quot;:{&quot;images&quot;:[{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/24e0aa58-0fa4-46e5-a626-c4ddea20b5f0_900x606.jpeg&quot;},{&quot;type&quot;:&quot;image/jpeg&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1f2858b9-7a7d-4da8-b92f-7f092429afbc_2000x1275.jpeg&quot;}],&quot;caption&quot;:&quot;Without Kesey&#8217;s Merry Pranksters, we wouldn&#8217;t have Thiel and Musk &#8212; and it&#8217;s not just the drugs&quot;,&quot;alt&quot;:&quot;Ken Kesey / Peter Thiel and Elon Musk&quot;,&quot;staticGalleryImage&quot;:{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cda9f077-a8bd-4990-b0b2-6bb7262c202a_1456x720.png&quot;}},&quot;isEditorNode&quot;:true}"></div><p>But what exactly are the components of this ideology? 
What are its fundamental beliefs, and how did they come to dominate not just Silicon Valley but increasingly the global economy?</p><h2><strong>From ideology to product to policy</strong></h2><p>At least six core beliefs define the ideology of the tech elite:</p><ol><li><p>Men should be judged by their creations, not their credentials</p></li><li><p>Optimization is a valorous end in itself</p></li><li><p>Technology is humanity&#8217;s salvation</p></li><li><p>Establishment institutions deserve to be disrupted</p></li><li><p>Information (and your data) want to be free</p></li><li><p>Market value is the ultimate source of truth</p></li></ol><p>These aren't just ideas; they shape how decisions are made, products are built, and success is defined. And perhaps most importantly, these beliefs have become the scaffolding on which tech policies in the US have been built &#8211; even though I suspect you&#8217;d be hard-pressed to find an American who has cast a ballot in the hopes that these beliefs would dictate how our country is run.</p><p>I&#8217;ll briefly introduce each of these concepts here, and more in-depth posts on each will follow in the weeks ahead.</p><h3><strong>Men should be judged by their creations, not their credentials</strong></h3><p>In 2004, a 19-year-old Mark Zuckerberg famously dropped out of Harvard to pursue his fledgling company, Facebook. This was the perfect embodiment of Silicon Valley's meritocratic ideal: what matters isn't your degree but what you can build. Peter Thiel took this ethos to its logical conclusion when he established the Thiel Fellowship in 2010, offering $100,000 to young people willing to drop out of college and &#8220;build things&#8221; rather than &#8220;sit in a classroom.&#8221; </p><p>There's something democratic and appealing about this idea. 
But here's the paradox: this supposedly credential-free builder meritocracy has consistently favored people who had privileged access to technology when it was prohibitively expensive&#8212;predominantly young, white men<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> from upper-middle-class backgrounds who attended elite institutions before dropping out. </p><p>The discounting of credentials and expertise in favor of builders has also meant favoring engineers whose approach to societal problem-solving may not be compatible with the needs of a democratic polity with diverse constituencies. As Christine Rosen put it, <a href="https://hedgehogreview.com/issues/america-on-the-brink/articles/technosolutionism-isnt-the-fix">technosolutionism is not the answer</a>: </p><blockquote><p>Technosolutionism is a way of understanding the world that assigns priority to engineered solutions to human problems. Its first principle is the notion that an app, a machine, a software program, or an algorithm offers the best solution to any complicated problem. Notably, the technosolutionist&#8217;s appeal to technical authority, even for the creation of public policy or public health measures, is often presented as apolitical, even if its consequences are often not. Technosolutionism speaks in the language of the future but acts in the short-term present. 
In the rush to embrace immediate technological fixes, its advocates often ignore likely long-term effects and unintended consequences.</p></blockquote><p>Devaluing wisdom in favor of &#8220;building&#8221; has had far-reaching consequences: the ascendancy of tech founders in the public sphere over the last twenty years has contributed to <a href="https://www.theatlantic.com/ideas/archive/2025/02/career-civil-servant-end/681712/">the death of expertise</a>, as Tom Nichols of <em>The Atlantic </em>has called it, and the hollowing out of that expertise in key federal government positions in Trump&#8217;s first term led Michael Lewis to write <em><a href="https://wwnorton.com/books/the-fifth-risk">The Fifth Risk</a></em>. </p><h3><strong>Optimization is a valorous end in itself</strong></h3><p>In 2018, a story broke that <a href="https://www.theverge.com/2018/2/1/16958918/amazon-patents-trackable-wristband-warehouse-employees">Amazon had patented a wristband</a> that could track warehouse workers' hand movements in real time. The device would vibrate to nudge workers' hands in the right direction if they reached for the wrong item, and it would record precisely how long it took to complete each task. This wasn't science fiction&#8212;it was the logical endpoint of a culture where optimization is seen not just as a tool, but as a moral imperative.</p><p>The optimization obsession traces back to Frederick Taylor and his stopwatch-wielding efficiency experts of the early 20th century, but Silicon Valley has taken it to an extreme. Amazon's fulfillment centers represent perhaps the purest expression of this mindset: every movement is measured, every second accounted for, every process continuously refined in pursuit of maximum efficiency.</p><p>Think about the language tech companies use. They don't just improve things; they "optimize" them. They don't solve problems; they create "optimal solutions." 
The engineering mindset&#8212;where everything can and should be measured, analyzed, and improved&#8212;has escaped the confines of technical systems and been applied to human behavior, social interactions, and even love (just look at dating apps' matchmaking algorithms).</p><p>But what happens when this optimization imperative collides with human needs? Amazon's warehouse workers have been increasingly replaced by robots because they developed repetitive stress injuries from maintaining the "optimal" pace &#8211; not to mention needing to take <a href="https://www.mirror.co.uk/news/uk-news/timed-toilet-breaks-impossible-targets-11587888">bathroom breaks</a> or take shelter <a href="https://abcnews.go.com/Business/dead-amazon-facility-tornado-pummels-illinois/story?id=81721932">during a tornado</a>. Driver-routing algorithms maximize efficiency but <a href="https://newjersey.news12.com/warehouse-trucks-are-causing-traffic-issues-in-some-towns-a-state-senate-bill-seeks-to-change-that">clog</a> <a href="https://www.khou.com/article/news/local/amazon-semi-trucks-timberwood-neighborhood/285-bc47c29a-24b8-4ff4-bca6-0bca5242cea6">once-quiet neighborhoods</a>. And it&#8217;s not just Amazon: YouTube's recommendation engine, like those of most social media products, is optimized for engagement but ends up <a href="https://policyreview.info/articles/analysis/recommender-systems-and-amplification-extremist-content">promoting increasingly extreme content</a>. And yet this same approach is <a href="https://www.nbcnews.com/politics/doge/federal-workers-agencies-push-back-elon-musks-email-ultimatum-rcna193439">the one Musk&#8217;s DOGE is explicitly taking with the federal government workforce</a>.</p><p>Of course, if you&#8217;re an outsider to this ideology, it&#8217;s clear that not everything that could be optimized ought to be. Human communities, democratic deliberation, and cultural expression don't necessarily improve when optimized for efficiency. 
Some values&#8212;like justice, beauty, or dignity&#8212;don't lend themselves to algorithmic optimization at all.</p><h3><strong>Technology is humanity&#8217;s salvation </strong></h3><p>When Marc Andreessen proclaimed that "software is eating the world" in a 2011 Wall Street Journal op-ed, he wasn't just making an observation about market trends. He was articulating a worldview where technology, not politics or social movements, drives human progress, which he reaffirmed in his bizarre, self-published 2023 <a href="https://a16z.com/the-techno-optimist-manifesto/">Techno-Optimist Manifesto</a>.</p><p>This belief in technological determinism can be traced back to figures like Alan Kay, who worked at Xerox PARC in the 1970s and famously said, "The best way to predict the future is to invent it." It's a seductive idea: rather than getting bogged down in messy political processes, we can directly engineer a better future through technological innovation.</p><p>But a sincere wish to build a better future has morphed into something more millenarian. Consider Elon Musk's Mars obsession. When he talks about making humanity "multiplanetary," he isn&#8217;t just describing a scientific endeavor&#8212;he's talking about salvation. "History is going to bifurcate along two directions," <a href="https://mashable.com/article/elon-musk-spacex-mars-top-5-takeaways">Musk has declared</a>. "One path is we stay on Earth forever, and then there will be some eventual extinction event... The alternative is to become a space-faring civilization and a multi-planetary species." Disruption of our very relationship with our home planet isn't just desirable; it's necessary for survival.</p><p>This savior mentality extends to the human body itself. 
Silicon Valley's obsession with longevity technologies, from Peter Thiel's rumored interest in young blood transfusions (memorably <a href="https://www.theverge.com/2017/5/22/15676696/hbo-silicon-valley-recap-season-4-episode-5-the-blood-boy">satirized</a> in HBO's "Silicon Valley" with <a href="https://www.theverge.com/2017/5/22/15676696/hbo-silicon-valley-recap-season-4-episode-5-the-blood-boy">the character Gavin Belson's "blood boy"</a>) to Google's anti-aging subsidiary Calico, reveals a fundamental discomfort with human limitations. Bryan Johnson's <a href="https://blueprint.bryanjohnson.com/pages/blueprint-protocol">Blueprint protocol</a>, where he spends millions annually measuring and optimizing every bodily function to reduce his biological age, represents this drive taken to its logical conclusion. <a href="https://www.technologyreview.com/2018/03/13/144721/a-startup-is-pitching-a-mind-uploading-service-that-is-100-percent-fatal/">Sam Altman has reportedly invested</a> in a company attempting to preserve brains for future uploading. When death itself is framed as a technical problem waiting for disruption, no institution is sacred.</p><p>What makes this technological salvation narrative so powerful is that it offers a comforting certainty in uncertain times: no matter how dire our problems&#8212;climate change, pandemics, political polarization, the fact that we as humans are all mortal&#8212;a technological solution awaits. This faith provides a convenient escape hatch from having to engage with messy social and political realities. Why fight for climate legislation when Musk will build electric cars and carbon capture machines? Why reform healthcare when we'll soon have AI diagnostics and CRISPR cures? The promise of technological salvation allows tech elites to position themselves as humanity's saviors while opposing anything that does not align with their financial interests. 
Meanwhile, the social contexts that produce our most pressing problems&#8212;inequality, environmental degradation, democratic erosion&#8212;remain unaddressed, their continuation ensured by the very belief that technology alone will save us from them.</p><h3><strong>Establishment institutions deserve to be disrupted</strong></h3><p>"Move fast and break things." This mantra, embraced as Facebook's official motto until 2014, perfectly encapsulates Silicon Valley's attitude toward existing institutions. The word "disruption"&#8212;originally an academic term from Clayton Christensen's theory of innovation&#8212;became the battle cry of tech entrepreneurs eager to overthrow established industries and institutions.</p><p>The disruption narrative reached its logical conclusion with figures like <a href="https://newrepublic.com/article/180487/balaji-srinivasan-network-state-plutocrat">Balaji Srinivasan</a>, who delivered a notorious speech titled "Silicon Valley's Ultimate Exit," advocating for people to "exit" traditional society altogether and build new digital nations. This isn't just theoretical&#8212;it translates into active resistance against regulation and democratic oversight.</p><p>Consider how Uber deliberately flouted local taxi regulations as it expanded globally, essentially betting that its popularity with consumers would force regulators to adapt. When pressed about the company&#8217;s regulatory concerns, <a href="https://www.vox.com/2014/5/28/11627354/travis-kalanick-uber-is-raising-more-money-to-fight-lyft-and-the">Travis Kalanick famously responded</a>, "We're in a political campaign . . . The candidate is Uber and the opponent is an asshole named Taxi." 
Or how crypto advocates have promoted their technology as a way to bypass central banks and financial regulators altogether, with <a href="https://www.investopedia.com/terms/g/genesis-block.asp">Bitcoin's genesis block</a> containing a <em>London Times</em> headline about bank bailouts as a built-in critique of the financial system. And of course, <a href="https://abcnews.go.com/US/cfpb-official-testifies-doges-chaotic-attempts-dismantle-agency/story?id=119651855">the DOGE dismantling of entire swaths of the federal government that they&#8217;ve made no effort to understand</a> is just the latest and most extreme example of this drive to disrupt no matter what.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!G-Nm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!G-Nm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 424w, https://substackcdn.com/image/fetch/$s_!G-Nm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 848w, https://substackcdn.com/image/fetch/$s_!G-Nm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!G-Nm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!G-Nm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg" width="1245" height="700" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:700,&quot;width&quot;:1245,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!G-Nm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 424w, https://substackcdn.com/image/fetch/$s_!G-Nm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 848w, https://substackcdn.com/image/fetch/$s_!G-Nm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!G-Nm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42f28b04-d5f5-4316-918c-68231d524160_1245x700.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Tech elites&#8217; ideological distaste for institutions has never been more relevant</figcaption></figure></div><p>What's forgotten in this disruptive zeal is that many "disrupted" institutions emerged to solve genuine social problems, and what look like inefficiencies are often deliberate checks and balances. 
The institutions we've built over decades may be imperfect, but they often embody hard-won social compromises that disruptors dismiss at society's peril.</p><h3><strong>Information (and your data) want to be free</strong></h3><p>"Information wants to be free" became a rallying cry for early internet culture after <a href="https://web.archive.org/web/20101226054908/http://www.law.upenn.edu/fac/pwagner/wagner.control.pdf">Stewart Brand popularized the phrase in the 1980s</a>, but the belief has deep roots in early hacker culture, perhaps going back as far as MIT's Tech Model Railroad Club in the late 1950s&#8212;one of the birthplaces of hacking. This belief is behind admirable projects like Wikipedia and open-source software, but also behind more sinister ones like <a href="https://www.bellingcat.com/news/uk-and-europe/2025/01/28/deepfake-porn-sites-link-to-tech-companies/">Mr. Deepfakes</a>.</p><p>In practice, this freedom has been selectively applied. The same companies that benefit from users freely sharing their personal information zealously guard their own algorithms and data. Google wants your search history to be "free" for it to use, but its ranking algorithm is a closely held secret. And "free" information often means unpaid labor and devalued creative work, as everyone from actors to musicians to the <em>New York Times</em> has found in the age of generative AI.</p><p>The evolution of Meta illustrates this contradiction perfectly. What began as a free way to connect with friends transformed into a sophisticated surveillance machine in which users "freely" provide valuable data that is anything but free once Facebook sells ads against it. Now that data, along <a href="https://www.wired.com/story/new-documents-unredacted-meta-copyright-ai-lawsuit/">with pirated copyrighted materials</a>, is being used to train the company's generative AI system, which is offered to the public . . . 
for free.</p><p>In reality, information is the raw material of this particular builder culture. Engineers raised in this ideology who work at today&#8217;s tech titans no longer tinker with cobbled-together circuit boards ordered from a catalog; they build ever more sophisticated ways of crunching as much data as possible to improve predictive algorithms.</p><h3><strong>Market value is the ultimate source of truth</strong></h3><p>Despite their countercultural trappings, Silicon Valley elites have embraced perhaps the most conventional belief of all: market fundamentalism. "The market has spoken" serves as both explanation and justification for outcomes, no matter how disturbing.</p><p>The billions in capital attracted by AI companies with <a href="https://www.wheresyoured.at/longcon/">questionable models of future revenue</a> have in turn been used to justify the alleged value that their products will provide to humanity. The smoke will produce fire, rather than the other way around. When Elon Musk points to Tesla's market capitalization as evidence of its value to humanity, he's invoking this belief. The circular logic is striking: a company's stock price reflects its true value, and its true value is reflected in its stock price.</p><p>Yet markets routinely fail to price in externalities like environmental damage, social division, or privacy violations. Lately, it has become clear how much of Tesla&#8217;s market cap is derived not from its future value as an automaker &#8211; especially when compared to, say, General Motors or Ford &#8211; but rather from the Edisonian myths Musk has spun for himself. Meta&#8217;s market value says nothing about its impact on democratic institutions or teenage mental health, save to reflect that to date it&#8217;s largely been exempt from any accountability, much like Purdue Pharma prior to the first opioid settlements. 
</p><p>By equating market value with social value, tech elites neatly sidestep challenging questions about their products' broader impacts. And like adherents to a prosperity gospel, they believe their massive wealth is not only their reward for fidelity to the beliefs described above, but the ultimate proof that they are in the right.</p><h2><strong>From ideology to myth</strong></h2><p>What makes this ideology so powerful&#8212;and so dangerous&#8212;is that it contains partial truths. Credentials can indeed be exclusionary. Technologies have improved and saved human lives. Some institutions do need reshaping. Efficiency can improve both business and government. Information sharing can foster innovation. Markets do often allocate resources effectively.</p><p>But each of these beliefs becomes problematic when taken to an extreme and divorced from broader values and contexts. The ideology elevates means (technology, markets, disruption) over ends (what kind of future we want to build).</p><p>As we face converging crises, the limitations of this ideology have become increasingly apparent. We need technological development guided by laws and a broader view of American values rather than billionaire whims. We need recognition that markets, while powerful tools, cannot substitute for collective decision-making about social priorities. 
And we need to reclaim the concept of innovation from those who have used it to advance their own interests at society's expense.</p><p><a href="https://www.tomorrowsmess.com/p/innovation-and-myth">In my next post</a>, I'll unpack how Silicon Valley has not only co-opted but fundamentally transformed our understanding of innovation itself&#8212;creating a powerful mythology that equates technological change with progress while obscuring the question of who truly benefits from this particular vision of the future.</p><p></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>I fell for it myself, back in 2014, when I left working in foreign aid to take a job at Amazon, where I thought I could do well by doing good (and pay off my student loans).</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>I refer only to men above purposely &#8212; in my observation, women rarely get credit for either credentials or creations in this culture.</p></div></div>]]></content:encoded></item><item><title><![CDATA[Welcome to Tomorrow’s Mess]]></title><description><![CDATA[Join me as I pick through the shards of what tech has broken in society, seek how we might fix things that are not yet beyond repair, and look around the corner at what might be broken next.]]></description><link>https://www.tomorrowsmess.com/p/coming-soon</link><guid isPermaLink="false">https://www.tomorrowsmess.com/p/coming-soon</guid><dc:creator><![CDATA[Casey Mock]]></dc:creator><pubDate>Fri, 29 Nov 2024 02:12:38 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F387b9a53-af49-4ce7-bb41-f69f93de6db5_619x619.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the family I was raised in &#8211; as I&#8217;m confident was the case for many readers &#8211; you clean up after yourself. If you&#8217;re staying with family or friends, you don&#8217;t leave a mess for your hosts to clean up; you strip the bed linens when you leave and don&#8217;t leave your dirty towels on the bathroom floor. It&#8217;s not just good manners; you hope for the same treatment in your own home when folks visit you. That reciprocity is core to how human hospitality has worked for generations: the Old Testament, classical Greek literature, the Qur&#8217;an, and the Mahabharata all contain admonitions about what makes a good guest and a good host that are still familiar to us today.</p><p>And yet after a decade in tech policy, I have repeatedly found myself returning to a simple question: how did &#8220;move fast and break things&#8221; become the mantra of the American tech elite? How did it become the default that we permit tech elites to &#8220;disrupt&#8221; the basic functioning of society &#8211; not just of media and politics, but also at kitchen tables and on playgrounds &#8211; and bear no burden or costs for fixing what they break? How has it come to pass that we&#8217;ve made the men who insist that rules don&#8217;t apply to them not outcasts but the richest men in the history of the planet?</p><p>Section 230, weaponization of the First Amendment, the influence of money in politics, and the power of the tech lobby in DC and state capitals are all part of the story &#8211; but not the whole story. 
The source of &#8220;move fast and break things&#8221; is not mere pursuit of profit; it is cultural and ideological: even nonprofit tech accountability organizations struggle with accountability themselves, and by now we are all familiar with stories of techno-solutionists who claim fidelity to serving humanity while causing great harm.</p><p>The observation that Silicon Valley has a unique culture is not a new one: from the 1995 essay &#8220;The Californian Ideology&#8221; by Richard Barbrook and Andy Cameron to the six seasons of the HBO show <em>Silicon Valley</em>, the idea that there&#8217;s something that seems &#8211; to the rest of Americans &#8211; &#8220;off&#8221; about Bay Area technologists is a cold take. But here at <strong>Tomorrow's Mess,</strong> I will examine more deeply how developments in emerging technology, and technology&#8217;s influence on politics, can be better understood by unpacking the underlying ideologies of the tech elite, and what those ideologies predict about what is coming next.</p><p>This isn't another newsletter deflating the hype around the latest generative AI developments or cataloguing the newest outrages coming from the tech elite and the harms their products cause &#8212; there are already great sources on that, from <a href="https://garymarcus.substack.com/">Marcus on AI</a> to <a href="https://www.bloodinthemachine.com/">Blood in the Machine</a> to <a href="https://www.aisnakeoil.com/">AI Snake Oil</a> to <a href="https://www.afterbabel.com/">After Babel</a>. <strong>Tomorrow&#8217;s Mess</strong> is instead about the cultural DNA of technology&#8212;how the values and blind spots of those building tomorrow's tools shape all our futures. 
Fundamentally, this is a newsletter not just about technology but about technologists, politics, and culture; and about unintended consequences, humility, hubris, incentives, and intentions.</p><p>Join me as I pick through the shards of broken things to understand why they broke, seek how we might fix things that are not yet beyond repair, and look around the corner at what might be broken next.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.tomorrowsmess.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.tomorrowsmess.com/subscribe?"><span>Subscribe now</span></a></p><h1>About Me</h1><p>One day in 2009, in a dilapidated village classroom in the former Soviet Republic of Moldova, not long after <a href="https://foreignpolicy.com/2009/04/07/moldovas-twitter-revolution/">a failed Twitter revolution</a> in that country, a group of my students (I taught 12th grade as a Peace Corps Volunteer there) asked me to visit a website on my laptop: the homepage of the Russian social network <a href="https://ok.ru/dk?st.cmd=anonymMain&amp;st.redirect=%2Fguests%3Fst.layer.cmd%3DPopLayerClose">Odnoklassniki</a> (&#8220;Classmates&#8221;). My students had never heard of Facebook; Instagram was still a year away, with Snap and TikTok even further in the future &#8212; but they were desperate to know what I thought of their Odnoklassniki profiles and wanted me to sign up for one.</p><p>This classroom moment marked my first real understanding of technology's power as a social architect. What looked like teenage social media use was actually a case study in how digital design shapes human behavior at scale. 
Odnoklassniki&#8217;s design &#8212; which was essentially "<a href="https://en.wikipedia.org/wiki/Hot_or_Not">Hot or Not</a>" with DMs &#8212; created an incentive for girls to post increasingly sexualized content for boys to rate and rank; boys could buy extra votes with mobile phone minutes, and thus stand out to the girls whose attention they wanted. And so students who once could barely wait for nightfall on summer Saturdays so they could dance at the village discotheque soon showed up only to vamp for photos and go home.</p><p>Odnoklassniki wasn't broken. It was working exactly as designed. And the impact was not confined to the classroom and the village disco. <a href="https://www.newyorker.com/magazine/2008/05/05/the-countertraffickers">Moldova had long struggled with trafficking in young women for sexual exploitation</a>. Odnoklassniki&#8217;s design proved to be the perfect tool for traffickers to identify and groom prospective victims; it also later became <a href="https://ipp.md/en/2018-02/propaganda-ruseasca-pe-odnoklassniki-cazul-republicii-moldova/">a key vector for Russian disinformation</a> in advance of <a href="https://edition.cnn.com/2024/11/04/europe/moldova-election-sandu-putin-interference-intl/index.html">Russia&#8217;s continued attempts to influence</a> the most consequential elections in the country&#8217;s short independent history. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!UMZK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!UMZK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!UMZK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg" width="1086" height="724" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:724,&quot;width&quot;:1086,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:261496,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.tomorrowsmess.com/i/152306227?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!UMZK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UMZK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8c0c83dd-fef8-4592-8e8d-e27513612633_1086x724.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Your author and his Moldovan 12th graders in 2010</figcaption></figure></div><p>That formative experience in Moldova revealed something I couldn't unsee: technology isn&#8217;t necessarily neutral, and the consequences of technologists&#8217; design choices aren't confined to the digital realm. The years since have taken me through roles on all sides of the tech policy divide&#8212;from Amazon&#8217;s policy team to state government, from international development to civil society. 
Along the way, I've witnessed firsthand how the values embedded in technology design ripple outward, reshaping politics, communities, and individual lives, while those who build these systems rarely confront the messes they create.</p><p>Here at <strong>Tomorrow's Mess</strong>, I&#8217;ll be unpacking the ideologies driving Silicon Valley's most powerful figures and examining who pays the price when "disruption" becomes a virtue. Drawing on thinkers like Neil Postman and Marshall McLuhan, I&#8217;ll translate tech's deliberately obscure language into clear insights about what's coming next.</p>]]></content:encoded></item></channel></rss>