March 5, 2026 · 5 min read

How I Added “TL;DR by Goose” AI Summaries to Every Post

Using Claude Haiku to generate static summaries at publish time — and why the architecture decision matters more than the feature itself

Yield: AI-generated TL;DR summaries on all 8 Writing posts, with a clickable Goose badge UI
Difficulty: 2 / 5 — the script is simple; the architectural thinking is the real work
Total Cook Time: ~3 hours in one session


What I Was Trying to Solve

Every post on this site is a long-form build log. Useful if you’re going to read it, but there’s no way to quickly gauge whether it’s worth your time. I wanted a one-paragraph summary on each post — written by AI, not me — that a reader could expand before committing.

The obvious implementation: call the Claude API on every page load and generate a summary in real time. I didn’t do that. Here’s why.

The Architecture Decision

Summaries don’t need to be fresh. A post written in February doesn’t need a new summary generated every time someone visits in March. Calling the API on every page load would mean: latency on every read, API costs that scale with traffic, and a broken experience if the API is down.

Instead: generate once at publish time, save to a static JSON file, bundle it with the build. The reader gets an instant static string. Zero runtime API calls. Zero cost at scale.

This is a build-vs-runtime decision — and it’s the kind of product thinking that’s worth making explicit, because the “obvious” implementation and the right implementation aren’t always the same.

The Build

Evening, March 5 — ~3 hours

The system has three parts: a generation script, a static data file, and a badge component.

The Generation Script

scripts/generate-tldr.ts reads a post’s TSX file, strips the JSX tags to extract readable prose, sends it to Claude Haiku with a tight prompt, and writes the result to app/lib/tldr.json. Running npx tsx scripts/generate-tldr.ts --all batched all 8 posts in about 30 seconds.

Terminal
user@MacBook-Air joseandgoose-site-main % npx tsx scripts/generate-tldr.ts --all
Generating TL;DRs for 8 posts...

how-i-built-search
how-i-automated-garmin-recaps
how-i-replaced-google-forms
how-i-built-greetings
how-i-built-footer
how-i-built-numerator
gemini-grades
how-i-built-this

Done.

8 summaries generated in ~30 seconds. Results saved to app/lib/tldr.json.

🔧 Developer section: generation script
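A minimal sketch of what scripts/generate-tldr.ts does, under stated assumptions: the file paths, prompt wording, and model name are guesses, not the actual script, and the JSX stripping is deliberately naive (single-level regex passes rather than a real parser).

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Strip JSX tags and inline expressions so only readable prose goes to the model.
// Naive: only removes single-level {expressions}; nested JSX would need a parser.
export function stripJsx(source: string): string {
  return source
    .replace(/\{[^{}]*\}/g, " ") // drop inline JS expressions like {date}
    .replace(/<[^>]+>/g, " ")    // drop JSX tags like <p> and <Component />
    .replace(/\s+/g, " ")
    .trim();
}

// Send the extracted prose to Claude Haiku via the Messages API.
async function summarize(prose: string): Promise<string> {
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "x-api-key": process.env.ANTHROPIC_API_KEY ?? "",
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model: "claude-3-5-haiku-latest", // assumed model id
      max_tokens: 300,
      messages: [
        { role: "user", content: `Write a TL;DR of this post in 2-3 sentences:\n\n${prose}` },
      ],
    }),
  });
  const data = await res.json();
  return data.content[0].text;
}

// Read the post, summarize it, and merge the result into the static JSON file.
// The post path is a guess at the site's layout.
async function generateForSlug(slug: string): Promise<void> {
  const prose = stripJsx(readFileSync(`app/writing/${slug}/page.tsx`, "utf8"));
  const tldrs = JSON.parse(readFileSync("app/lib/tldr.json", "utf8"));
  tldrs[slug] = await summarize(prose);
  writeFileSync("app/lib/tldr.json", JSON.stringify(tldrs, null, 2));
}
```

The --all mode would just loop generateForSlug over every slug; since each summary is an independent API call, the whole batch stays under a minute.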

The Badge Component

TLDRBadge.tsx reads from the JSON file and renders a collapsed pill — Goose’s photo, the label “TL;DR by Goose”, a chevron — that expands on click to show the summary. No API calls. No loading state. Just a static string pulled from a bundled JSON file.

Going forward, the workflow is: write post → run the script for that slug → deploy. One extra command per publish.

🔧 Developer section: TLDRBadge component
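The component itself is ordinary React state (collapsed/expanded), so the part worth sketching is how little data plumbing it needs. A hypothetical version of the lookup behind TLDRBadge.tsx — the map contents and helper name are placeholders, not the real tldr.json:

```typescript
// In the app this would be a bundled import: import tldrs from "app/lib/tldr.json";
type TldrMap = Record<string, string>;

// Placeholder data standing in for the generated file.
const tldrs: TldrMap = {
  "how-i-built-search": "A build log about adding client-side search.",
};

// Returns the bundled summary for a slug, or null when none was generated,
// in which case the badge simply doesn't render. No fetch, no loading state:
// the summary is a synchronous lookup into data shipped with the build.
export function getTldr(slug: string, data: TldrMap = tldrs): string | null {
  return data[slug] ?? null;
}
```

Because the lookup is synchronous, the component never needs a loading spinner or an error branch — the two states are "badge with summary" and "no badge."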

Failure & Fallback

The env var that wasn’t there

The first run produced nothing. The script ran, printed no errors, and tldr.json stayed empty.

The issue: tsx doesn’t auto-load .env.local. The ANTHROPIC_API_KEY was set locally but the script couldn’t see it, so the Anthropic client initialized without a key and failed silently. The fix was adding a few lines to the script to manually read and parse .env.local before initializing the client.

This is the kind of failure that’s easy to miss: no crash, no error message, just a quiet no-op. If I hadn’t checked tldr.json after the run, I might have deployed empty summaries and not noticed until a reader clicked the badge.

🔧 Developer section: env var loading
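A dependency-free sketch of the fix, assuming the script parses .env.local by hand (the real script could equally use the dotenv package):

```typescript
import { readFileSync, existsSync } from "node:fs";

// tsx doesn't auto-load .env.local the way Next.js does, so read it manually
// before constructing the API client. Existing env vars are never overwritten.
export function loadEnvLocal(path = ".env.local"): void {
  if (!existsSync(path)) return;
  for (const line of readFileSync(path, "utf8").split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks and comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue;
    const key = trimmed.slice(0, eq).trim();
    const value = trimmed.slice(eq + 1).trim().replace(/^["']|["']$/g, "");
    if (!(key in process.env)) process.env[key] = value;
  }
}
```

Calling loadEnvLocal() as the script's first statement is enough; a guard like `if (!process.env.ANTHROPIC_API_KEY) throw new Error("missing key")` right after it would have turned the silent no-op into a loud failure.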

Haiku doesn’t count sentences

The prompt said “write a TL;DR in 2–3 sentences.” Several summaries came back as 4–6 sentences, reading more like dense paragraphs than quick previews. The content quality was good, so I kept them rather than re-running with a tighter prompt.

That’s a tradeoff worth flagging: Haiku treats length instructions as guidance, not hard constraints. If you need strict length control, you either post-process the output or add more explicit constraints and re-run. For now, the summaries work. But that’s a known rough edge, not a solved problem.
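If strict length control ever becomes worth it, the post-processing route could look like this hypothetical helper (not part of the actual script) that hard-caps a summary at N sentences:

```typescript
// Trim a summary to at most `max` sentences when the model overshoots the
// "2-3 sentences" instruction. Naive split on sentence-ending punctuation
// followed by whitespace; fine for prose summaries, wrong for "e.g." etc.
export function capSentences(text: string, max = 3): string {
  const sentences = text.match(/[^.!?]+[.!?]+(\s|$)/g) ?? [text];
  return sentences.slice(0, max).map((s) => s.trim()).join(" ");
}
```

The tradeoff cuts both ways: truncation guarantees length but can drop the model's concluding point, which is often the sentence a reader most wants.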

Final Output

A clickable “TL;DR by Goose” badge on all 8 Writing posts, powered by Claude Haiku, with zero runtime API calls and instant load time. The Anthropic API key stays local. The summaries ship as static JSON. Readers get a one-click preview before committing to a full read.

The bigger win: the generation script is now part of the publishing workflow. Every future post gets a Haiku-generated summary in 30 seconds, as part of the same session that writes and deploys the post.
