Contestra

Running for Ratings: When Algorithms Become Game Show Hosts

In 1987, The Running Man depicted a dystopian 2019 where convicted criminals competed on a televised game show, hunted by theatrical killers called "stalkers" while audiences cheered. The show's host, Damon Killian, selected contestants based on ratings potential and manipulated footage to control narratives.

The film was meant as satire. Thirty-eight years later, it reads more like a product roadmap.

"The Running Man wasn't predicting killer game shows. It was predicting algorithmic entertainment—content systems that optimize for engagement regardless of human cost."

The Algorithm as Game Show Host

Damon Killian's job was simple: maximize ratings. He chose contestants who would generate drama, edited footage to craft narratives, and gave audiences exactly what they wanted—even when what they wanted was blood.

Modern recommendation algorithms perform the same function [1]:

  1. Contestant selection — TikTok's algorithm surfaces creators based on engagement potential. Go viral or disappear. The runners who generate views survive; the rest fade into algorithmic obscurity.

  2. Narrative control — YouTube's recommendation system shapes what stories get told by deciding what gets amplified. A creator's message is filtered through what the algorithm rewards.

  3. Audience optimization — Netflix A/B tests thumbnails to maximize click-through. The "best" image isn't the most accurate—it's the one that gets the most clicks.

  4. Stakes escalation — Platforms reward increasingly extreme content. The Running Man's stalkers got more theatrical each season. Social media creators face the same pressure.

  5. Live metrics — Killian watched ratings in real-time. Streamers watch viewer counts, adjusting content second-by-second based on what keeps numbers up.
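The host-as-algorithm pattern above reduces to one operation: score everything by predicted engagement, surface the winners, and let the rest fade. A minimal sketch of that selection step, with all creator names and scores hypothetical:

```python
# Toy engagement-optimized ranker: the algorithmic Damon Killian.
# Creators and predicted-engagement scores are hypothetical illustrations.

def rank_by_engagement(items, top_k=3):
    """Surface the top_k items by predicted engagement; the rest go dark."""
    ranked = sorted(items, key=lambda item: item["predicted_engagement"], reverse=True)
    return ranked[:top_k]

feed = [
    {"creator": "calm_explainer", "predicted_engagement": 0.12},
    {"creator": "outrage_clips", "predicted_engagement": 0.87},
    {"creator": "niche_hobbyist", "predicted_engagement": 0.05},
    {"creator": "drama_channel", "predicted_engagement": 0.64},
]

surfaced = rank_by_engagement(feed, top_k=2)
# Only the high-drama creators survive the cut.
print([item["creator"] for item in surfaced])  # → ['outrage_clips', 'drama_channel']
```

Note what the objective function never sees: accuracy, usefulness, or harm. Anything not in the score is invisible to the selection.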

"Every platform is a game show now. The prize is attention. The penalty for losing is invisibility."

The Audience Problem

The Running Man's audience cheered for death. They weren't evil—they were entertained. The show gave them spectacle, narrative, heroes and villains. They consumed what was offered.

Modern audiences face the same dynamic:

  1. Engagement feedback loops — Platforms surface content we engage with [2]. We engage with content that triggers strong emotions. Outrage, fear, and controversy generate engagement. The loop tightens.

  2. Passive consumption — The Running Man's viewers didn't choose episodes; they watched what aired. Infinite scroll removes choice in similar ways. The algorithm selects; we consume.

  3. Normalized extremity — Each season of The Running Man needed bigger spectacles. Each year of social media pushes boundaries further. Yesterday's shocking is today's mundane.

  4. Collective culpability — Killian blamed the audience: "They love it." Platform executives make the same argument [3]. Users have agency, but systems shape choices.
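The feedback loop in point 1 can be made concrete with a toy simulation (all rates hypothetical): if the next feed mix mirrors the share of engagement, and emotionally charged content is engaged with at a higher rate, its share of the feed ratchets up every round.

```python
# Toy engagement feedback loop. Engagement rates are hypothetical:
# charged content is clicked more often, so the platform shows more of it.

def run_loop(charged_share, charged_rate=0.8, neutral_rate=0.2, rounds=5):
    """Each round, the next feed mix mirrors each category's share of engagement."""
    history = [charged_share]
    for _ in range(rounds):
        charged_eng = charged_share * charged_rate
        neutral_eng = (1 - charged_share) * neutral_rate
        charged_share = charged_eng / (charged_eng + neutral_eng)
        history.append(round(charged_share, 3))
    return history

# Start with charged content at 30% of the feed; watch the loop tighten.
print(run_loop(0.3))  # → [0.3, 0.632, 0.873, 0.965, 0.991, 0.998]
```

No one chose the endpoint. It falls out of a reasonable-sounding rule, "show people more of what they engage with," applied repeatedly.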

"The Running Man understood that entertainment systems aren't neutral. They create appetites, then feed them, then need bigger meals."

Building Different Games

At Contestra, we reject the ratings-at-all-costs model. AI systems can be designed for outcomes other than maximum engagement:

  1. Goal alignment — Systems should optimize for user-defined objectives, not platform metrics. A research tool should help you learn, not keep you scrolling.

  2. Transparent incentives — Users deserve to know what systems are optimizing for. Hidden engagement maximization is manipulation.

  3. Bounded engagement — Responsible systems include natural stopping points. Infinite scroll is a design choice, not an inevitability.

  4. Quality over quantity — Time-on-platform is a crude metric. Actual value delivered matters more than minutes consumed.

  5. Human oversight — Algorithms shouldn't run unsupervised. Regular audits of what's being promoted—and what's being buried—are essential.
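Principles 1 and 3 are straightforward to encode: rank by a user-declared objective rather than predicted engagement, and end the session at a declared budget. A minimal sketch, with all item fields and scores hypothetical:

```python
# Toy goal-aligned feed: ranks by the user's declared objective, not
# engagement, and stops at a session budget instead of scrolling forever.
# All item fields and scores are hypothetical illustrations.

def build_session(items, objective, budget_minutes):
    """Pick items by the user's objective until the time budget is spent."""
    ranked = sorted(items, key=lambda item: item[objective], reverse=True)
    session, spent = [], 0
    for item in ranked:
        if spent + item["minutes"] > budget_minutes:
            break  # natural stopping point: the budget is the end of the feed
        session.append(item["title"])
        spent += item["minutes"]
    return session, spent

catalog = [
    {"title": "intro_to_topic", "learning_value": 0.9, "engagement": 0.2, "minutes": 10},
    {"title": "rage_bait_take", "learning_value": 0.1, "engagement": 0.9, "minutes": 8},
    {"title": "deep_dive", "learning_value": 0.7, "engagement": 0.3, "minutes": 15},
]

session, spent = build_session(catalog, objective="learning_value", budget_minutes=25)
print(session, spent)  # → ['intro_to_topic', 'deep_dive'] 25
```

The design choice is in the `objective` parameter: swap in `"engagement"` and the same code becomes the game show again. The incentive, not the technology, is the variable.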

The Resistance Network

The Running Man featured an underground resistance broadcasting truth through pirated signals. They exposed the show's manipulation by releasing unedited footage.

Today's equivalent is algorithmic transparency, alternative metrics, and systems designed for human benefit rather than engagement extraction.

The game show doesn't have to win.

"We can build AI systems that serve users instead of harvesting them. The technology is neutral. The design choices aren't."

[1]
J. Stray, “What are you optimizing for? Aligning recommender systems with human values,” arXiv preprint arXiv:2107.10939, 2021.
[2]
T. Wu, The attention merchants: The epic scramble to get inside our heads. Knopf, 2016.
[3]
S. Zuboff, The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs, 2019.