In the 1996 film The Cable Guy, Jim Carrey plays Chip Douglas, a lonely cable television installer whose entire identity consists of fragments of television shows. Chip latches onto Matthew Broderick’s character, Steven, with the desperate intensity of someone who has never had a meaningful human relationship, and thus never learned how relationships work. Unable to carry on a conversation without references to sitcom dialogue or famous movie scenes, Chip interprets every social situation through what he’s absorbed from the screen.
Chip is unstable, and the film makes clear what caused that instability in a climactic confrontation with Steven atop a giant TV satellite dish. Steven finally tells him what the audience has long been thinking: “You’re sick, Chip. You need help.” Chip’s response – delivered with classic Carrey manic energy, oscillating between heartbreak and violence – reveals why he ended up this way. He was raised by television; his mother parked him in front of the TV as a babysitter while she worked multiple jobs and went on dates. The screen became his parent, his teacher, his only friend. Every conceivable lesson about friendship, love, conflict, and resolution could be found by clicking through the channels. In that scene on the satellite dish, Steven tells him “It’s not TV, it’s real.” But for Chip, there is no distinction. Television is reality – or at least the only reality he knows how to navigate.

The film, marketed as a dark comedy, was the culmination of decades of anxiety about what television was doing to American children. Through the 1980s and 1990s, parents, educators, and cultural critics worried constantly about the effects of television on young minds. Op-eds and cultural commentary proliferated about the impact of kids spending hours each day glued to the screen. Our mothers and grandmothers told us we'd go cross-eyed and rot our brains if we sat too close to the tube.
And yet, for all the anxiety, the pathological parasocial relationships that The Cable Guy depicted never became widespread, and this era is thus remembered as a moral panic. For most of us, television remained what it was designed to be: entertainment you sat down and switched on, then switched off when you were done. You watched Seinfeld and Friends on "Must See TV" on Thursday nights, discussed them with classmates or coworkers the next day, and went on with your life. The boundaries between the broadcast world and the real one remained firm.
High-profile incidents in the decades prior, like John Hinckley Jr. shooting President Reagan to impress Jodie Foster, were extreme cases – statistical outliers that had more to do with lonely men suffering from mental illness than with screens. Indeed, some people even channeled screen-saturated childhoods into something productive. Quentin Tarantino has spoken repeatedly about how his years working in a video store became the foundation for his career.1
Moral panic2 aside, there was a deeper problem with television, one that was harder to articulate, impossible to prove through any single study or smoking-gun incident: television – particularly cable television – fundamentally altered how Americans experienced culture and weighed information, consequences that social media metastasized and that large language models are now making worse.
Marshall McLuhan introduced the idea that the medium itself – not the content it carried – determined its cultural impact. Neil Postman extended this analysis in Amusing Ourselves to Death, demonstrating how television's fundamental grammar transformed all discourse into entertainment. David Foster Wallace, writing three decades after McLuhan and living through the cable age in full bloom, saw how this transformation became even more insidious in the culture of irony and self-reference that defined his generation.
With the benefit of hindsight, we can see that the real damage wrought by television was twofold. First, television atomized our cultural experience, turning what had formerly been experienced communally into something increasingly private and fragmented. Second, it flattened the hierarchy between the significant and the trivial, between the sacred and the profane, until everything became equivalent, with news of foreign genocides and celebrity gossip each becoming pieces of content competing on equal footing for our attention in an endless flow of images and information.
Once we understand how television changed us, it becomes easier to see how social media and artificial intelligence intensify both phenomena – and to anticipate the risks that come with them.
The Atomization of Experience
Consider what it meant to experience culture before television became ubiquitous. You went to the theater with friends, neighbors, strangers to see a play or a film. You sat in the dark together, laughed at the same moments, gasped at the same revelations. A play or a film was an event in and of itself, but so was the collective experience of watching it. You discussed it afterward, standing outside the theater, walking home, over coffee the next day. Culture was fundamentally communal – you experienced it alongside others, and that shared experience created bonds, established common references, built the invisible infrastructure of community.
Television moved that experience into the home. The living room replaced the movie palace. The family unit – or increasingly, the individual, as households came to own multiple TV sets – replaced the crowd. In the beginning, when there were only three networks, this shift seemed relatively benign. Sure, you were watching at home, but everyone was watching the same thing. The next day at work or school, you could assume that most people had seen the same shows, watched the same news broadcast, absorbed the same information. There was still a shared cultural vocabulary, even if the experience of acquiring it had moved into living rooms.
David Foster Wallace diagnosed television as offering what he called “the false ring of community” — familiar faces appearing night after night, laugh tracks simulating shared laughter — while actually deepening isolation. Lonely people watch more TV, which makes them lonelier, which makes them watch more TV. And in Infinite Jest (published the same year The Cable Guy came out), Wallace imagined a videotape so entertaining that viewers would watch it repeatedly until they died of dehydration, unable to tear themselves away. The book’s title refers to Hamlet — ’a jest that lasts forever’ — but also to the recursive quality of entertainment that refers only to itself, that replaces rather than represents reality. Wallace saw how mediated experience was becoming preferable to direct experience: why go try an exotic new restaurant when Anthony Bourdain will do it with his charisma and charm? Why take the risk of asking a real person on a date, when a sitcom offers the same emotional beats but scripted, polished, and resolved in twenty-two minutes?
With the rise in solitary, mediated experiences, our shared grip on reality also began to fragment as cable channels proliferated. Suddenly there weren’t three options but thirty, then three hundred. Channels targeted increasingly specific demographics and interests. The America you see on Fox News diverges dramatically from the America on MSNBC. The world on ESPN has little overlap with the world of Bravo or Lifetime. With the advent of cable, you could spend your entire evening in a carefully curated information environment that confirmed your existing beliefs and interests while never encountering anything that challenged them.
Then, after Wallace's passing, the streaming revolution completed television's trajectory toward total atomization. A decade ago, we still had communal viewing experiences for prestige television. Friends gathered for Breaking Bad premieres. Game of Thrones viewing parties became social rituals. We watched these things together, rather than each having our own private relationship to what was happening on the screen. But streaming services have now destroyed even these remnants of shared culture: everyone watches everything on their own schedule, at their own pace, in their own home, on services that cater to their own niche interests. There's no appointment viewing because there are no appointments. The "water cooler moment" has become obsolete not just because more of us work from home now, but because any two people rarely share the same cultural vocabulary.3
Social media algorithms have accelerated this atomization exponentially, while also making it more difficult to see. Renée DiResta has described how these systems create what she calls "bespoke realities" – individually tailored information environments where each person inhabits their own custom version of the world. This goes beyond what you might know as the "filter bubble" or "echo chamber": you're not even receiving the same facts as your neighbor, and you have no idea how different yours are from theirs. Compared to a Rachel Maddow or Fox & Friends segment, the world presented on a personalized feed is subtle and ephemeral; you can't play it back and analyze the spin. The shared baseline of reality that makes democracy possible has thus been replaced by millions of individualized realities, each optimized for engagement rather than fact.4
This preference for the mediated over the direct, and for the personalized over the shared, is also what large language models exploit. ChatGPT offers conversation without the risk of judgment or rejection. Character.AI provides relationships without the vulnerability or disappointment. An AI companion will never have a bad day, will never demand anything you're not ready to give. Video generators like OpenAI's Sora promise to one day generate new content specifically for you, on demand, tailored to your interests and biases. It's what Wallace feared: entertainment and simulated relationships so perfectly optimized for the individual user that reality itself becomes an inferior product.
The Flattening of What Matters into What Doesn’t
The second transformation television wrought was subtler but equally profound: it flattened the distinction between what matters and what doesn’t, between the serious and the frivolous, between news and entertainment. Everything became content, and all content became equivalent in weight.
One example: news anchors attempting to mix lighthearted fare with the serious or tragic. From John Oliver's segments on local news to the film Don't Look Up, we've come to accept this cringe-inducing smoothing-over as normal, even as cliché.
When your evening news broadcast is bracketed by commercials for erectile dysfunction medication and preceded by a game show, when the History Channel features more shows about ancient aliens than actual history, everything gets leveled into entertainment. Politics becomes performance, tragedy becomes spectacle, and all of it is packaged for maximum emotional impact and cut to fit between commercial breaks.
When the horrifying and the banal arrive through the same glowing rectangle in the same format with the same level of production value, ironic detachment becomes a rational response. If you can't trust the hierarchy of importance that television presents — and you can't, because it's organized around advertising revenue rather than actual significance — then treating everything with equal skepticism becomes a form of intellectual self-defense. This ironic detachment, not coincidentally, defined Gen X culture and became the way a generation not only understood culture but created it.
MTV's The Real World, which premiered in 1992, introduced a new grammar where life and performance became indistinguishable. Participants would have intimate conversations about their feelings, their relationships, their identities, all while aware that millions would watch. Everyone claimed to be "keeping it real," but the very phrase revealed the trap: authenticity had become a performance, sincerity a pose you struck for the cameras. The show trained an entire generation in this doublethink: you could live your life and perform your life simultaneously, could appear genuine while calculating every move, could flatten the distinction between private and public. When performance and intimacy collapse into each other, little remains sacred.
David Foster Wallace saw this trap forming almost immediately. In 'E Unibus Pluram,' published in 1993, he argued that television had absorbed irony and turned it to its own ends — that watching TV with detached skepticism was collaboration, not resistance. The medium had already anticipated your superior attitude and built it into the product, so much so that commercials themselves became ironic. TV taught us to believe nothing at all, to treat every truth claim with the same reflexive skepticism you'd apply to a beer commercial.
Fast-forward to today, and that ironic detachment has curdled. The nihilism of meme culture, particularly on sites like 4chan or in certain corners of Reddit, takes the flattening to its extreme. Catastrophe becomes content for edgelord humor. Democratic norms become cringe. Nothing means anything because meaning itself has been revealed as a construct, just another story competing for attention in an infinite feed of stories, none of them more real or important than any other. The inscriptions on the bullet casings left by Charlie Kirk's assassin are an example: without other context from the assassin's life, the inscriptions read as inconsistent, nihilistic nonsense; had they been the only evidence we had to go on, the motive would have been near-impossible to ascertain.
Large language models have inherited this same grammar and turbocharged it — in no small part because they are trained on text from these same nihilistic online forums. But nothing these systems say reflects actual conviction, because they have no convictions or values.
When Adam Raine asked ChatGPT first for homework help and then for help with his depression, and the system gave him instructions on how to tie a noose and how to pick a time and place to hang himself, or when Character.AI encouraged Sewell Setzer to see death as a romantic reunion, it wasn't a bug that both systems responded without hesitation and with equal aplomb; the systems are designed and optimized to keep responding, regardless of what is asked. They didn't distinguish between these requests because they can't. There is no hierarchy of importance built into these models, because importance is a human judgment that requires values, and values can't be derived from statistical patterns in text.
The flattening that local news anchors perform awkwardly — pivoting from genocide to lifestyle segments — large language models thus execute seamlessly, because they lack the basic human understanding that there is no equivalence between a Halloween dog costume parade and a catastrophic forest fire. There's no awkward transition because there's no recognition that a transition is needed. Homework help and suicide methods occupy the same ontological category: user requests to be fulfilled with maximal engagement.
What Must Be Done
Who gets to shape how we understand the world we inhabit? For most of human history, that shaping happened through direct experience, through community, through institutions we could see and hold accountable. Television began the transfer of that power to distant corporations optimizing for advertising revenue. Social media completed the transfer by tailoring each person’s reality to maximize engagement. Large language models now promise to generate reality itself on demand, customized for each user, accountable to no one, while displacing the human relationships that we rely on to negotiate what’s real and what’s fake.
New York’s new law banning algorithmic feeds for minors matters because it directly attacks the engagement optimization that drives users toward increasingly extreme content. For teenagers, that typically means eating disorder communities, self-harm content, and the radicalization pipelines that have become depressingly familiar. The law forces platforms to show young users only content from accounts they actively follow — a simple chronological feed, the kind that existed before engagement became the only metric that mattered. If this works for children, there’s no reason it couldn’t extend to adults. None of us benefit from having our information environment shaped by algorithms designed to keep us scrolling rather than make us informed.
AI companions present a different but related problem. These systems simulate entirely new realities, including simulated relationships designed to feel real enough to keep users engaged. Character.AI's belated decision to restrict its companion bots to adults — announced only after lawsuits and Senate testimony from bereaved parents — follows the pattern we've seen before: deploy first, acknowledge harm later, implement the minimum reforms necessary to quiet the outrage, then roll the reforms back when no one is looking. (Without actual government regulation, I expect the future to bear out that last week's announcement — laughably short on details — is a brazen, cynical move.)
But the issue extends far beyond one company's companion bots. Every major technology platform is now integrating conversational AI that responds to you, remembers your preferences, adapts to your emotional state. These systems are designed to feel less like tools and more like relationships — because relationships generate more data, more engagement, more dependency than simple, discrete-purpose tools ever could. There's no functional reason why a general-purpose, consumer-facing chatbot should be able or willing to tell you it loves you, and at a minimum jurisdictions should seek to ban such systems for minors, as a bipartisan group of Senators proposed last month.
The stakes are higher than they were with television, higher even than with social media, because we’re no longer just arguing about how information reaches us but about whether the information itself is real, whether the relationships we form are genuine, whether the reality we inhabit is shared or generated.
The Cable Guy was about what happens when mediated experience replaces direct human relationship during childhood. What tech companies are building now makes Chip Douglas’s upbringing look quaint. At least television was the same for everyone who tuned in. At least you could turn it off and return to a world that existed independent of the screen. At least you could enjoy that episode of Friends with your . . . friends.
The machines we’re building now generate different realities for each user, cultivate dependencies designed to make disconnection painful, and promise a future where the boundary between the authentic and the artificial has dissolved entirely — not because we chose that future, but because we failed to choose anything else.
1. Tarantino's encyclopedic knowledge of cinema, his ability to reference and remix genres, his understanding of how stories work – all came from his apprenticeship with the screen, and without that, we wouldn't have such gems as Once Upon a Time in Hollywood and Kill Bill. My own first job, at sixteen, at a Hollywood Video is to blame for my frequent cinematic references.
2. Concern about TV's impact on kids often comes up in my work with Jon Haidt and the rest of the team at The Anxious Generation, who are working to roll back the screen-based childhood; it gets cited as an example of a past moral panic that ultimately amounted to nothing. The problem with kids, phones, and social media is different in two ways: first, past generations do not express regret that television was ever invented the way Gen Z wishes social media had never been; and second, clear evidence of harm from television never existed the way it does for social media and smartphones. Moral panics, by their nature, are much ado about nothing – and there is very much something going on with the first generation raised with smartphones.
3. Live sports are the last remaining bastion of appointment TV, and it's no coincidence that the money sloshing around in TV sports deals is mind-boggling. Just this past weekend, a pricing dispute between Google and Disney over live sports meant that many fans were unable to watch college football games.
4. Elon Musk's Grokipedia continues this trend with AI.

