Changelog for humans & agents.
One Web Component for your site. JSON Feed and RSS for AI agents. A tiny CLI for CI. Edge-native on Cloudflare Workers — no servers, no signup, no analytics in the read path.
Built for both audiences from day one.
Most changelog tools serve humans. Some serve RSS. bearlychange treats agent-readable, embed-friendly, and edge-native as table stakes — not afterthoughts.
Embed-first
One Web Component, two lines of HTML. Works in Astro, Next, WordPress, plain HTML — anywhere custom elements run.
Agent-readable
JSON API, JSON Feed 1.1, RSS 2.0 — all from the same store. Stable field names, plus an optional machine_summary per entry.
Edge-native
Cloudflare Workers + D1. Reads served from the nearest region. No origin server, no cold starts, no infra to babysit.
Stable output
Schema-validated entries. Predictable JSON shape. CI can rely on it; agents can integrate without scraping.
Draft → publish
Write entries as drafts, ship when ready. published_at auto-stamps on first publish. Edit and delete via HTML admin or CLI.
Open & self-hostable
MIT-licensed. Single Worker, single D1 database. Fits inside Cloudflare's free tier for typical volumes.
Every public surface, one page.
Hit any of these directly. Same data, different shape.
Send Accept: application/json for the structured record.
The <bearly-change> Web Component. Cached at the edge.
Post from CI. Pipe from an agent.
A zero-dependency Node script. Talks to any bearlychange deployment over HTTP.
Post once, serve everyone.
Same entry, four surfaces. Agents and humans converge on the same source of truth.
Post
From the admin HTML form, the CLI, or a CI job. Validation runs server-side — bad slugs and versions get 400s.
Draft or publish
Drafts stay invisible to public surfaces. Publishing auto-stamps published_at and reveals the entry everywhere.
Serve from the edge
Cloudflare Workers handles every request. D1 holds the entries; reads run in the nearest region.
Consume anywhere
Humans see the widget on your site. Agents poll JSON. Feed readers subscribe to JSON Feed or RSS. Same data, four shapes.
Boring tech, on purpose.
Frequently asked questions
Everything you need to know about bearlychange.
bearlychange is an embed-first changelog service for sites and AI agents. It ships a one-line Web Component embed, a JSON API at /api/changelog, a JSON Feed 1.1 endpoint, an RSS 2.0 feed, and per-entry HTML and JSON pages. It runs on Cloudflare Workers with D1 for storage and is MIT-licensed.
Add a <script> tag pointing at /widget/widget.js and a <bearly-change> custom element with a src attribute pointing at /api/changelog. The widget is a plain Web Component, so it works in Astro, Next, WordPress, Webflow, or any framework that allows custom elements. No build step required.
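As a sketch, the two-line embed from the answer above written out to a standalone file. The host changelog.example.com is a placeholder for your own deployment, not a real instance:

```shell
# Write the two-line embed to an HTML file.
# "changelog.example.com" is a placeholder host — swap in your deployment URL.
cat > changelog-embed.html <<'EOF'
<script src="https://changelog.example.com/widget/widget.js"></script>
<bearly-change src="https://changelog.example.com/api/changelog"></bearly-change>
EOF
```

Drop those same two lines into any page — a layout partial, a CMS block, a plain HTML file — and the widget renders wherever custom elements run.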
bearlychange exposes a stable JSON API at /api/changelog and a JSON Feed 1.1 surface at /feed.json. Both return only published entries with predictable field names (id, slug, title, summary, type, version, modules, machine_summary, published_at). Each entry also includes an optional machine_summary field intended for LLM consumption.
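For illustration, here is a single entry in the shape described above. The field names match the list in the answer; every value is invented for the example:

```shell
# One published entry in the /api/changelog shape.
# Field names follow the documented schema; all values are illustrative.
cat > sample-entry.json <<'EOF'
{
  "id": "01J0EXAMPLE",
  "slug": "shipped-x",
  "title": "Shipped X",
  "summary": "What changed, for humans.",
  "type": "feature",
  "version": "0.4.0",
  "modules": ["api"],
  "machine_summary": "Added X; no breaking changes.",
  "published_at": "2025-01-15T12:00:00Z"
}
EOF
python3 -m json.tool sample-entry.json > /dev/null && echo "valid JSON"
```

An agent can key off machine_summary and published_at alone and ignore the human-facing fields.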
Use the bearlychange CLI with a bearer token. Set BEARLYCHANGE_URL and BEARLYCHANGE_TOKEN as env vars, then run bc create --title "Shipped X" --slug shipped-x --summary "What changed" --version 0.4.0. For agents, pipe a JSON payload into bc create --from-stdin. The CLI is a zero-dependency Node script and works against any bearlychange deployment.
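A sketch of the agent path: build the JSON payload, then pipe it to bc create --from-stdin. The payload fields mirror the flags shown above; the final invocation is commented out because it needs a live deployment and token:

```shell
# Build a payload for `bc create --from-stdin`.
# Field names mirror the CLI flags above; values are illustrative.
cat > entry.json <<'EOF'
{
  "title": "Shipped X",
  "slug": "shipped-x",
  "summary": "What changed",
  "version": "0.4.0"
}
EOF
# With BEARLYCHANGE_URL and BEARLYCHANGE_TOKEN exported, an agent would run:
#   bc create --from-stdin < entry.json
```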
No account is required to read public changelog entries — the JSON API, JSON Feed, RSS, per-entry pages, and embed widget are open. Admin write access (create, update, delete) requires Basic auth credentials or a bearer token configured on the deployed instance. There is no signup, no email collection, and no analytics on the public surface.
bearlychange stores entries in Cloudflare D1, a serverless SQLite database that replicates to the edge. Reads run from the same region as the request; writes go through the primary. Data never leaves Cloudflare's network.
Yes. Clone the repo, run wrangler d1 create your-db, update wrangler.toml with the database ID, run the migration in migrations/0001_initial.sql, set ADMIN_PASS and BEARLYCHANGE_TOKEN secrets, and run wrangler deploy. The whole stack is Cloudflare Workers + D1 + Hono.
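The steps above, as a deployment sketch. The database name your-db is a placeholder, and the wrangler d1 execute form is one common way to run the migration — adjust to your wrangler version:

```shell
# Self-hosting sketch — run from a clone of the repo.
wrangler d1 create your-db                # note the database_id it prints
# ...edit wrangler.toml with that database_id...
wrangler d1 execute your-db --file migrations/0001_initial.sql
wrangler secret put ADMIN_PASS            # admin Basic auth password
wrangler secret put BEARLYCHANGE_TOKEN    # bearer token for the CLI/agents
wrangler deploy
```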
Five public read surfaces: /api/changelog (JSON, all published entries), /feed.json (JSON Feed 1.1), /rss.xml (RSS 2.0), /entries/:slug (HTML by default, JSON when Accept: application/json), and /widget/widget.js (the Web Component embed).
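One curl per surface, against a placeholder host (changelog.example.com stands in for your deployment, and shipped-x for a real slug):

```shell
curl https://changelog.example.com/api/changelog     # JSON, all published entries
curl https://changelog.example.com/feed.json         # JSON Feed 1.1
curl https://changelog.example.com/rss.xml           # RSS 2.0
curl -H 'Accept: application/json' \
     https://changelog.example.com/entries/shipped-x # per-entry JSON (HTML without the header)
curl https://changelog.example.com/widget/widget.js  # the Web Component
```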
bearlychange is at version 0.1.0 and in early production use. The HTTP API, CLI, embed widget, validation, CSRF guard, bearer-token auth, and Workers+D1 backend are all implemented and covered by an end-to-end smoke test (24 assertions, ~15s). Multi-project workspaces, webhook integrations, and per-entry discussions are on the roadmap but not yet shipped.
Yes. bearlychange is MIT-licensed and the full source is on GitHub at github.com/blacklogos/bearlychange. The hosted instance is free to read from. Self-hosting on Cloudflare Workers + D1 fits within Cloudflare's free tier for typical changelog volumes.