How I Added “TL;DR by Goose” AI Summaries to Every Post
Using Claude Haiku to generate static summaries at publish time — and why the architecture decision matters more than the feature itself
Ingredients
- Claude Code — terminal-based AI for building the badge component and scripts ($200/yr)
- Claude API — Haiku model — for generating the summaries (own API key, pay-per-use)
- @anthropic-ai/sdk — the official Anthropic npm package (free)
- Next.js — the framework running the site (free)
- Vercel — hosting and deployment (free)
What I Was Trying to Solve
Every post on this site is a long-form build log. Useful if you’re going to read it, but there’s no way to quickly gauge whether it’s worth your time. I wanted a one-paragraph summary on each post — written by AI, not me — that a reader could expand before committing.
The obvious implementation: call the Claude API on every page load and generate a summary in real time. I didn’t do that. Here’s why.
The Architecture Decision
Summaries don’t need to be fresh. A post written in February doesn’t need a new summary generated every time someone visits in March. Calling the API on every page load would mean: latency on every read, API costs that scale with traffic, and a broken experience if the API is down.
Instead: generate once at publish time, save to a static JSON file, bundle it with the build. The reader gets an instant static string. Zero runtime API calls. Zero cost at scale.
This is a build-vs-runtime decision — and it’s the kind of product thinking that’s worth making explicit, because the “obvious” implementation and the right implementation aren’t always the same.
The Build
The system has three parts: a generation script, a static data file, and a badge component.
The Generation Script
`scripts/generate-tldr.ts` reads a post’s TSX file, strips the JSX tags to extract readable prose, sends it to Claude Haiku with a tight prompt, and writes the result to `app/lib/tldr.json`. Running `npx tsx scripts/generate-tldr.ts --all` batched all 8 posts in about 30 seconds.
8 summaries generated in ~30 seconds. Results saved to app/lib/tldr.json.
🔧 Developer section: generation script
- Script lives at `scripts/generate-tldr.ts` and runs with `npx tsx` — no global install needed
- JSX stripping uses a regex to remove all HTML/JSX tags and collapse whitespace, leaving only readable prose for the Haiku prompt
- The `--all` flag reads slugs directly from `app/lib/posts.ts` so the script stays in sync with the post list — no separate config to maintain
- Each result is written to `app/lib/tldr.json` keyed by slug; existing entries are preserved unless explicitly overwritten
- Haiku is called with `max_tokens: 200` and a single `user` message — no system prompt, no conversation history
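The merge-and-preserve behavior can be pinned down as a small pure function plus a write step. This is a sketch under my own names (`mergeSummary`, `saveTldr` are hypothetical, not from the post):

```typescript
import fs from "node:fs";

type TldrMap = Record<string, string>;

// Merge one generated summary into the map, preserving any existing
// entry for that slug unless overwrite is explicitly requested.
function mergeSummary(
  existing: TldrMap,
  slug: string,
  summary: string,
  overwrite = false
): TldrMap {
  if (!overwrite && slug in existing) return existing;
  return { ...existing, [slug]: summary };
}

// Persist the map as pretty-printed JSON (e.g. to app/lib/tldr.json).
function saveTldr(path: string, map: TldrMap): void {
  fs.writeFileSync(path, JSON.stringify(map, null, 2) + "\n");
}
```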
The Badge Component
`TLDRBadge.tsx` reads from the JSON file and renders a collapsed pill — Goose’s photo, the label “TL;DR by Goose”, a chevron — that expands on click to show the summary. No API calls. No loading state. Just a static string pulled from a bundled JSON file.
Going forward, the workflow is: write post → run the script for that slug → deploy. One extra command per publish.
🔧 Developer section: TLDRBadge component
- `TLDRBadge.tsx` is a `"use client"` component — required because it uses `useState` for the expand/collapse toggle
- The JSON file is imported statically: `import tldr from "@/app/lib/tldr.json"` — Next.js bundles it at build time, so no runtime fetch
- The Goose avatar is rendered with `next/image` at 39×39px with `border-radius: 50%` applied via CSS
- If no summary exists for the slug, the component returns `null` — silent no-op, no broken UI
- The chevron rotates via a conditional CSS class (`tldr-chevron-open`) rather than inline styles, keeping animation in CSS
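Stripped of the JSX, the null-fallback lookup is the load-bearing part. A sketch of that logic (the function name is mine, not from the component):

```typescript
// Mirrors the bundled tldr.json shape: slug -> summary string.
type TldrMap = Record<string, string>;

// Return the summary for a slug, or null so the component can render
// nothing — no loading state, no error UI, just a silent no-op.
function summaryFor(tldr: TldrMap, slug: string): string | null {
  return tldr[slug] ?? null;
}
```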
Failure & Fallback
The env var that wasn’t there
The first run produced nothing. The script ran, printed no errors, and `tldr.json` stayed empty.
The issue: `tsx` doesn’t auto-load `.env.local`. The `ANTHROPIC_API_KEY` was set locally but the script couldn’t see it, so the Anthropic client initialized without a key and failed silently. The fix was adding a few lines to the script to manually read and parse `.env.local` before initializing the client.
This is the kind of failure that’s easy to miss: no crash, no error message, just a quiet no-op. If I hadn’t checked tldr.json after the run, I might have deployed empty summaries and not noticed until a reader clicked the badge.
🔧 Developer section: env var loading
- `tsx` is a TypeScript script runner — it does not replicate Next.js’s environment loading behavior, so `.env.local` is invisible to it by default
- Fix: read the file with `fs.readFileSync`, split on newlines, parse each `KEY=value` pair, and assign to `process.env` before the Anthropic client is initialized
- The Anthropic SDK fails silently when no API key is present — it initializes but returns empty responses rather than throwing, which is why the script appeared to succeed
- Alternative fix: prefix the command with `ANTHROPIC_API_KEY=... npx tsx ...` inline — but the manual parse approach keeps the script self-contained
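A minimal version of that loader, assuming simple `KEY=value` lines. The quote-stripping and don’t-clobber-existing-vars choices here are mine, not necessarily the author’s:

```typescript
import fs from "node:fs";

// Parse KEY=value pairs from .env.local contents, skipping blank lines
// and comments, and stripping optional surrounding quotes.
function parseEnv(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue;
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue;
    const key = trimmed.slice(0, eq).trim();
    const value = trimmed.slice(eq + 1).trim().replace(/^["']|["']$/g, "");
    vars[key] = value;
  }
  return vars;
}

// Assign parsed values into process.env — must run BEFORE the Anthropic
// client is constructed. Real env vars take precedence over the file.
function loadEnvLocal(path = ".env.local"): void {
  if (!fs.existsSync(path)) return;
  const parsed = parseEnv(fs.readFileSync(path, "utf8"));
  for (const [key, value] of Object.entries(parsed)) {
    if (!(key in process.env)) process.env[key] = value;
  }
}
```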
Haiku doesn’t count sentences
The prompt said “write a TL;DR in 2–3 sentences.” Several summaries came back as 4–6 sentences, reading more like dense paragraphs than quick previews. The content quality was good, so I kept them rather than re-running with a tighter prompt.
That’s a tradeoff worth flagging: Haiku treats length instructions as guidance, not hard constraints. If you need strict length control, you either post-process the output or add more explicit constraints and re-run. For now, the summaries work. But that’s a known rough edge, not a solved problem.
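The post-processing option can be as small as a naive sentence splitter that keeps the first N sentence-like chunks. This is my own sketch, not something the post implements, and it has the usual caveats (abbreviations like “e.g.” will fool it):

```typescript
// Keep at most `max` sentence-like chunks, splitting on ./!/? followed
// by whitespace or end of string. Naive by design — a rough length cap,
// not a real sentence tokenizer.
function trimToSentences(text: string, max = 3): string {
  const sentences = text.match(/[^.!?]+[.!?]+(\s|$)/g) ?? [text];
  return sentences.slice(0, max).join("").trim();
}
```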
Final Output
A clickable “TL;DR by Goose” badge on all 8 Writing posts, powered by Claude Haiku, with zero runtime API calls and instant load time. The Anthropic API key stays local. The summaries ship as static JSON. Readers get a one-click preview before committing to a full read.
The bigger win: the generation script is now part of the publishing workflow. Every future post gets a Haiku-generated summary in 30 seconds, as part of the same session that writes and deploys the post.