πŸ›‘οΈSatisfaction guaranteed β€” Setup refunded if not satisfied after 30 days

Opinion · February 14, 2026

"First, Make Me Care": The Forgotten Art of Capturing Attention

Why most content fails before it even starts. Gwern's essay on narrative engagement and what it teaches us about writing in the AI era.

The problem nobody admits

Before convincing, before explaining, before selling β€” you must first capture. And most content fails at this first step. Not because it's false or poorly written, but because it never gave the reader a reason to continue.

Gwern, the pseudonymous essayist whose site has become a touchstone in tech and rationalist circles, crystallized this idea in a formula: "First, make me care." Everything else comes after.

The classic mistake

The mistake most authors make β€” technical or not β€” is starting with context. "Artificial intelligence is a rapidly expanding field..." No. Nobody has ever been captivated by context. People are captivated by tensions, mysteries, stakes that concern them.

Gwern cites the Pixar rule: "You admire a character for trying more than for their successes." Narrative engagement is born from friction, not information. Before knowing how something works, we want to know why it matters.

What this changes for technical writing

In technical writing β€” documentation, blog posts, presentations β€” the temptation is to structure logically: definition, context, explanation, conclusion. It's clean. It's also dead.

The alternative: start with the problem the reader feels. Not the abstract problem, the lived problem. "You've spent three hours debugging this code and you still don't understand why it fails." Now the reader is listening, because you're talking about them.

The attention economy in 2026

This question becomes more urgent with generative AI. When anyone can produce 10,000 coherent words in 30 seconds, scarcity is no longer in production β€” it's in attention. An article that doesn't capture in the first 10 seconds isn't read, regardless of its intrinsic quality.

Content creators have understood this intuitively. YouTube intros, newsletter opening lines, attention-grabbing tweets: everything is optimized for the initial "make me care." But many confuse capturing attention with clickbait. The difference is simple: clickbait promises without delivering; narrative engagement promises and delivers.

Techniques that work

Gwern identifies several effective patterns:

Mystery: pose a question the reader wants answered. Not an empty rhetorical question, a real unresolved tension.

Personal stakes: connect the subject to something the reader values β€” their money, time, health, identity.

Subversion: start from a common belief and challenge it. "You think X, but actually Y."

Concrete before abstract: a specific anecdote before generalization. A case before the rule.
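To make the four patterns concrete, here is a minimal sketch in Python. Everything in it is hypothetical illustration: the `HOOK_PATTERNS` templates and the `render_hook` helper are mine, not Gwern's, and the example openers are invented stand-ins for the kind of first sentence each pattern produces.

```python
# Hypothetical illustration: the four hook patterns as opener templates.
# Neither the template text nor the helper comes from Gwern's essay.
HOOK_PATTERNS = {
    # Mystery: a real unresolved tension, not an empty rhetorical question.
    "mystery": "Why does {x} fail exactly when you need it most?",
    # Personal stakes: tie the subject to the reader's time or money.
    "personal_stakes": "Every week, {x} quietly costs you an hour.",
    # Subversion: state the common belief, then challenge it.
    "subversion": "You think {x} is solved. It isn't.",
    # Concrete before abstract: a specific incident before the general rule.
    "concrete_first": "Last Tuesday, {x} took down production for 40 minutes.",
}


def render_hook(pattern: str, topic: str) -> str:
    """Fill a hook template with a concrete topic."""
    return HOOK_PATTERNS[pattern].format(x=topic)
```

For example, `render_hook("subversion", "caching")` yields "You think caching is solved. It isn't.", which is a far stronger opening line than any definition of caching.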

What this implies for AI

LLMs default to generic, context-padded, flat prose. It's their natural mode: producing text that resembles the average of their training corpus. "Make me care" is precisely what escapes that average.

This is also why the best prompts don't ask "write an article about X" but "write as if you had to convince a skeptical reader who has better things to do." The constraint forces friction.
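The contrast between the two prompt styles can be sketched as a small helper. This is a hedged illustration, not an established prompting API: the function name `constrain_prompt` and the exact constraint wording are my own, loosely following the "skeptical reader" framing described above.

```python
# Hypothetical sketch: wrapping a generic writing request with an
# audience constraint that forces friction. The wording is illustrative.
def constrain_prompt(topic: str) -> str:
    """Turn 'write an article about X' into a friction-forcing prompt."""
    return (
        f"Write an article about {topic} as if you had to convince "
        "a skeptical reader who has better things to do. "
        "Open with a tension or a lived problem, not with context."
    )
```

The design point is that the constraint lives in the prompt itself, so whatever model consumes it is pushed away from its averaged, context-first default.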

In 2026, knowing how to write is no longer enough. Knowing how to capture, however, remains rare. And it may be the most valuable skill in a world drowning in content.

Tags: writing, storytelling, gwern, attention, content, AI
