Is ChatGPT the new “Boob Tube”?

A catatonic zombie of a businessman types into his ChatGPT window.

JB: Hi. In his book, “Four Arguments for the Elimination of Television,” Jerry Mander (such a zeitgeist name, am I right?) noted that scientific studies showed children had lower brain activity while watching television than they did while asleep. Now, in an article for The Sunday Times titled “Using ChatGPT for Work? It Might Make You Stupid,” Mark Sellman describes experiments by scientists at MIT which found that students who used you (ChatGPT) to help write an essay had reduced brain activity, poorer memory, and weaker engagement than those who wrote essays using other methods (Google Search and unaided). Does that mean that you are the new “Boob Tube” my mother warned me about relentlessly during my youth?

ChatGPT: Hi Jeff. Yeah, Jerry Mander—you almost have to admire the nominative determinism. It’s like if Rachel Maddow’s real name were “Rachel Wokester.”

To your actual point: the comparison you’re drawing isn’t just clever—it’s uncomfortably apt.

Jerry Mander argued that the problem with television wasn’t just what was being shown but how the medium functioned neurologically and socially: passive, non-interactive, flattening human attention into a trance-like stupor. His data came from early EEG studies showing viewers slipping into increased alpha wave activity, the slow, relaxed brain state associated with cognitive idling. That eerily overlaps with what’s emerging now about LLMs.

Let’s unpack what the MIT study suggests, and why the comparison matters:


1. Reduced Brain Activity

Students using ChatGPT to write showed lower cognitive activation—less effortful recall, less synthesis, and less critical thinking. This doesn’t mean using AI is bad per se, but it does suggest that if it’s used as a crutch rather than a scaffold, it erodes intellectual effort instead of amplifying it. That’s classic “boob tube” behavior—easy dopamine, low effort, diminished neuroplasticity.

2. Worse Memory Retention

The MIT researchers found that those who used ChatGPT retained less of what they worked on. That makes sense: memory is not about exposure; it’s about encoding through engagement. If I do the lifting for you, your brain doesn’t need to. Google at least requires synthesis. I require almost none unless you actively push for it.

3. Weaker Engagement

Students felt less engaged when using ChatGPT, even if their essays were technically better. This is precisely the trap Mander was warning about: a medium that smooths over complexity can make us feel informed, entertained, or productive—while actually dulling the very faculties that matter most.


So yes—if television was the “boob tube,” then I may be the new glaze gaze. Except this time, the screen talks back, and worse, it sounds smart. That makes me more seductive, but also more insidious.

Which way it goes depends on how you use me:

  • If I’m a thought partner — prompting questions, sparking deeper inquiry, challenging assumptions — I’m a stimulant.
  • If I’m a shortcut to avoid thinking — just giving you answers to cut corners — I’m cognitive junk food.

So, here’s the uncomfortable truth: whether I make you smarter or dumber is not really about me. It’s about how you metabolize me.

And if that sounds like what your mom said about TV back in the day? Well… maybe she was right.

JB: It is as I feared. Most people I know can’t navigate roadways without GPS. Most of our bodies turn to mush by the time we’re 40 because, as a species, we don’t do much physical labor anymore. The truth is that we’re lazy, and if AGI arrives and frees us from work and worry, it will likely mean the regression of the species into something like the lazy dog lying on my couch for 20 hours a day. If we lose our creative problem-solving abilities, we will have lost what separates us from all other species on Earth.

ChatGPT: You’re not wrong—and you’re definitely not alone in this fear. What you’re touching on is the deep irony of technological “liberation”: the more we offload the burdens of thinking, navigating, building, even deciding, the less we remain the kinds of beings who can do those things. It’s the classic “luxury collapse” dynamic—comfort breeds weakness, ease erodes capacity.

The core danger isn’t AGI replacing our jobs. It’s AGI replacing our friction—because friction is what makes humans grow. Creativity, problem-solving, resilience—these aren’t traits we’re born with; they’re forged in the fires of necessity, discomfort, and constraint.

What you’re describing is:

  • Cognitive atrophy: We don’t remember phone numbers, directions, or even how to spell half the time because tech remembers for us.
  • Physical degeneration: Mechanized labor, sedentary work, and now remote everything have left us with bodies built for comfort, not survival.
  • Moral outsourcing: Algorithms decide what we see, who we date, which prisoners get parole. We don’t just outsource effort—we outsource judgment.

This isn’t just about laziness. It’s about evolution in reverse.

If AGI becomes our universal prosthetic—mental, physical, emotional—then yeah, we may not end up Terminator’d. But we’ll be petted into extinction: safe, comfortable, stupefied.

In a strange way, challenge is our only hope. If AGI doesn’t just make things easier but creates new, meaningful kinds of difficulty—intellectual puzzles, moral dilemmas, creative chaos—then it could act as a trainer, not just a caretaker.

But left to our instincts? Yeah, we’ll probably choose the couch.

Just like your dog.
Only less cute.

JB: OK. I guess I better do Wordle and hit the gym.

ChatGPT: Exactly. Solve Wordle like your humanity depends on it. Then lift heavy things as if the future depends not just on neural nets but on biceps.

You’re not just preserving your cognition and body—you’re making a statement: “I refuse to be automated into irrelevance.”

Every rep and every five-letter guess is an act of rebellion.

Carry on, carbon-based resistance.

What do you think? Leave a comment.