v0.1.0 · MIT

Changelog for humans & agents.

One Web Component for your site. JSON Feed and RSS for AI agents. A tiny CLI for CI. Edge-native on Cloudflare Workers — no servers, no signup, no analytics in the read path.

Drop into any HTML
<script src="https://bearlychange.mtri-vo.workers.dev/widget/widget.js" defer></script>
<bearly-change src="https://bearlychange.mtri-vo.workers.dev/api/changelog" limit="3"></bearly-change>
[Hero mockup: your-site.com renders the <bearly-change> widget with the latest entries (NEW · v0.4.0 "Bearer-token auth", IMPROVEMENT · v0.3.2 "D1 backend on Cloudflare", FIX · v0.3.1 "CSRF guard on admin POSTs"), while an agent polls the matching /feed.json payload: a stable JSON contract.]
Why bearlychange

Built for both audiences from day one.

Most changelog tools serve humans. Some serve RSS. bearlychange treats agent-readable, embed-friendly, and edge-native as table stakes — not afterthoughts.

01

Embed-first

One Web Component, two lines of HTML. Works in Astro, Next, WordPress, plain HTML — anywhere custom elements run.

02

Agent-readable

JSON API, JSON Feed 1.1, RSS 2.0 — all from the same store. Stable field names, plus an optional machine_summary per entry.

03

Edge-native

Cloudflare Workers + D1. Reads served from the nearest region. No origin server, no cold starts, no infra to babysit.

04

Stable output

Schema-validated entries. Predictable JSON shape. CI can rely on it; agents can integrate without scraping.

05

Draft → publish

Write entries as drafts, ship when ready. published_at auto-stamps on first publish. Edit and delete via HTML admin or CLI.

06

Open & self-hostable

MIT-licensed. Single Worker, single D1 database. Fits inside Cloudflare's free tier for typical volumes.

Reference

Every public surface, one page.

Hit any of these directly. Same data, different shape.

GET
/api/changelog
JSON of all published entries. Primary surface for agents and the embed widget.
GET
/feed.json
JSON Feed 1.1 — the spec most modern feed readers and AI tools understand.
GET
/rss.xml
RSS 2.0 for legacy readers and pipelines that still want XML.
GET
/entries/:slug
Per-entry page. Returns HTML by default; send Accept: application/json for the structured record.
GET
/widget/widget.js
The <bearly-change> Web Component. Cached at the edge.
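The per-entry content negotiation above can be sketched in a few lines of Node. This is a minimal illustration, not part of the project: the base URL is the demo deployment shown elsewhere on this page, and the `bearer-auth` slug is borrowed from the CLI example below.

```javascript
// Sketch: build a request for one entry's structured JSON record instead of
// its HTML page, using the Accept header described above.
const BASE = "https://bearlychange.mtri-vo.workers.dev"; // demo deployment; use your own

function entryRequest(slug) {
  return new Request(`${BASE}/entries/${slug}`, {
    headers: { Accept: "application/json" },
  });
}

// Usage (network call, sketch only):
// const entry = await (await fetch(entryRequest("bearer-auth"))).json();
```

Without the Accept header, the same URL returns the human-readable HTML page.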
Honest status. bearlychange is v0.1.0. The HTTP API, CLI, widget, validation, CSRF guard, bearer-token auth, and Workers+D1 backend are shipped and covered by an end-to-end smoke test. Multi-project workspaces and webhook integrations are on the roadmap.
CLI

Post from CI. Pipe from an agent.

A zero-dependency Node script. Talks to any bearlychange deployment over HTTP.

1 — clone & install
git clone https://github.com/blacklogos/bearlychange
cd bearlychange && npm install
2 — point at a deployment
export BEARLYCHANGE_URL=https://bearlychange.mtri-vo.workers.dev
export BEARLYCHANGE_TOKEN=<your-token>
3 — create an entry (or pipe one from an agent)
./bin/bearlychange.mjs create \
  --title "Bearer-token auth" --slug bearer-auth \
  --summary "BEARLYCHANGE_TOKEN now accepted." --version 0.4.0

echo '{"title":"X","slug":"x","summary":"y","version":"0.4.1"}' \
  | ./bin/bearlychange.mjs create --from-stdin
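An agent scripting the CLI can build that stdin payload in Node rather than with `echo`. A minimal sketch — the field names mirror the pipe example above; the values and the `make-entry.mjs` filename are placeholders, not part of the project:

```javascript
// Sketch: emit a create payload on stdout, to be piped into the CLI:
//   node make-entry.mjs | ./bin/bearlychange.mjs create --from-stdin
// Fields mirror the echo example above; values here are placeholders.
const entry = {
  title: "Bearer-token auth",
  slug: "bearer-auth",
  summary: "BEARLYCHANGE_TOKEN now accepted.",
  version: "0.4.0",
};
process.stdout.write(JSON.stringify(entry) + "\n");
```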
How it works

Post once, serve everyone.

Same entry, four surfaces. Agents and humans converge on the same source of truth.

1

Post

From the admin HTML form, the CLI, or a CI job. Validation runs server-side — bad slugs and versions get 400s.

2

Draft or publish

Drafts stay invisible to public surfaces. Publishing auto-stamps published_at and reveals the entry everywhere.

3

Serve from the edge

Cloudflare Workers handles every request. D1 holds the entries; reads run in the nearest region.

4

Consume anywhere

Humans see the widget on your site. Agents poll JSON. Feed readers subscribe to JSON Feed or RSS. Same data, four shapes.
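For agents, the "consume" step is usually a poll-and-diff loop. A minimal sketch, assuming only that entries carry the stable `id` field documented on this page; the `bc_…` values and the in-memory `Set` are illustrative (a real agent would persist seen IDs to a file or KV store):

```javascript
// Sketch: return entries not seen on a previous poll, keyed by the stable
// `id` field this page documents. `seen` persists between polls.
function newEntries(seen, entries) {
  return entries.filter((e) => !seen.has(e.id));
}

const seen = new Set(["bc_001"]); // IDs handled on earlier polls
const poll = [
  { id: "bc_001", title: "CSRF guard on admin POSTs" },
  { id: "bc_002", title: "Bearer-token auth" },
];
const fresh = newEntries(seen, poll);
fresh.forEach((e) => seen.add(e.id)); // mark as handled
```

Because IDs are stable, an agent never needs to diff titles or scrape HTML.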

Stack

Boring tech, on purpose.

Runtime: Cloudflare Workers
Database: Cloudflare D1
Router: Hono
Embed: Web Components
CLI: Node 22 · zero deps
License: MIT
FAQ

Frequently asked questions

Everything you need to know about bearlychange.

What is bearlychange?

bearlychange is an embed-first changelog service for sites and AI agents. It ships a one-line Web Component embed, a JSON API at /api/changelog, a JSON Feed 1.1 endpoint, an RSS 2.0 feed, and per-entry HTML and JSON pages. It runs on Cloudflare Workers with D1 for storage and is MIT-licensed.

How do I embed the changelog widget?

Add a <script> tag pointing at /widget/widget.js and a <bearly-change> custom element with a src attribute pointing at /api/changelog. The widget is a plain Web Component, so it works in Astro, Next, WordPress, Webflow, or any framework that allows custom elements. No build step required.

How do AI agents read the changelog?

bearlychange exposes a stable JSON API at /api/changelog and a JSON Feed 1.1 surface at /feed.json. Both return only published entries with predictable field names (id, slug, title, summary, type, version, modules, machine_summary, published_at). Each entry also includes an optional machine_summary field intended for LLM consumption.
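A small sketch of the consuming side: picking the latest published entries out of a response. The field names (title, version, published_at) are the ones documented in the answer above; the sample array and the wrapper around a real fetch are illustrative, not authoritative.

```javascript
// Sketch: sort entries newest-first by published_at and format the top N.
// Real usage (shape of the response wrapper is an assumption — check your
// deployment): const entries = await (await fetch(url)).json();
function latestEntries(entries, limit = 3) {
  return [...entries]
    .sort((a, b) => new Date(b.published_at) - new Date(a.published_at))
    .slice(0, limit)
    .map((e) => `${e.version}: ${e.title}`);
}

const sample = [
  { title: "D1 backend on Cloudflare", version: "0.3.2", published_at: "2024-05-20T00:00:00Z" },
  { title: "Bearer-token auth", version: "0.4.0", published_at: "2024-06-02T00:00:00Z" },
];
console.log(latestEntries(sample, 1)); // most recent entry first
```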

How do I post entries from CI or a script?

Use the bearlychange CLI with a bearer token. Set BEARLYCHANGE_URL and BEARLYCHANGE_TOKEN as env vars, then run ./bin/bearlychange.mjs create --title "Shipped X" --slug shipped-x --summary "What changed" --version 0.4.0. For agents, pipe a JSON payload into ./bin/bearlychange.mjs create --from-stdin. The CLI is a zero-dependency Node script and works against any bearlychange deployment.

Do I need an account?

No account is required to read public changelog entries — the JSON API, JSON Feed, RSS, per-entry pages, and embed widget are open. Admin write access (create, update, delete) requires Basic auth credentials or a bearer token configured on the deployed instance. There is no signup, no email collection, and no analytics on the public surface.

Where is the data stored?

bearlychange stores entries in Cloudflare D1, a serverless SQLite database that replicates to the edge. Reads run from the same region as the request; writes go through the primary. Data never leaves Cloudflare's network.

Can I self-host it?

Yes. Clone the repo, run wrangler d1 create your-db, update wrangler.toml with the database ID, run the migration in migrations/0001_initial.sql, set ADMIN_PASS and BEARLYCHANGE_TOKEN secrets, and run wrangler deploy. The whole stack is Cloudflare Workers + D1 + Hono.

What public endpoints does it expose?

Five public read surfaces: /api/changelog (JSON, all published entries), /feed.json (JSON Feed 1.1), /rss.xml (RSS 2.0), /entries/:slug (HTML by default, JSON when Accept: application/json), and /widget/widget.js (the Web Component embed).

How production-ready is it?

bearlychange is at version 0.1.0 and in early production use. The HTTP API, CLI, embed widget, validation, CSRF guard, bearer-token auth, and Workers+D1 backend are all implemented and covered by an end-to-end smoke test (24 assertions, ~15s). Multi-project workspaces, webhook integrations, and per-entry discussions are on the roadmap but not yet shipped.

Is it free and open source?

Yes. bearlychange is MIT-licensed and the full source is on GitHub at github.com/blacklogos/bearlychange. The hosted instance is free to read from. Self-hosting on Cloudflare Workers + D1 fits within Cloudflare's free tier for typical changelog volumes.