AHD · Setup
Install AHD.
Three install paths. The Nix flake is the reproducible one and is the path we test against. The npm path is the practical one for people not on Nix. The vision critic needs a Chromium binary; the Nix path ships one, the npm path asks you to point at the system install.
Path 1 · Nix flake
The flake ships Node, TypeScript and a Chromium build pinned to a
known-good version. On Linux it uses pkgs.chromium;
on macOS it uses the bundled Chromium from
pkgs.playwright-driver.browsers because
pkgs.chromium is not supported on darwin. Either way,
the shell exports AHD_CHROMIUM_PATH automatically so
ahd critique and ahd critique-url work
without further configuration.
```shell
git clone https://github.com/Ad-Astra-Computing/ahd
cd ahd
nix develop
npm install
npm run build
npm test
```
Path 2 · npm without Nix
If Nix is not an option, install Node 20 or newer and go through
npm directly. The source linter and brief compiler work without
any browser. The vision critic and screenshot tools need a
Chromium binary; export AHD_CHROMIUM_PATH pointing at
it.
```shell
git clone https://github.com/Ad-Astra-Computing/ahd
cd ahd
npm install
npm run build

# macOS
export AHD_CHROMIUM_PATH="/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"
# Linux
export AHD_CHROMIUM_PATH="$(command -v chromium)"
```
AHD also auto-discovers a few canonical install paths before
falling back to the environment variable. If your binary lives at
one of these, AHD_CHROMIUM_PATH is optional.
- Linux: /run/current-system/sw/bin/chromium, /usr/bin/chromium, /usr/bin/chromium-browser, /opt/homebrew/bin/chromium
- macOS: /Applications/Chromium.app/Contents/MacOS/Chromium, /Applications/Google Chrome.app/Contents/MacOS/Google Chrome, plus the per-user ~/Applications/ variants
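The lookup order amounts to something like the following sketch. The function and constant names are hypothetical, not AHD's actual internals; the paths are the ones listed above.

```typescript
import { existsSync } from "node:fs";

// Canonical install paths are probed first; the AHD_CHROMIUM_PATH
// environment variable is the fallback. Names here are illustrative.
const CANONICAL_PATHS: Record<string, string[]> = {
  linux: [
    "/run/current-system/sw/bin/chromium",
    "/usr/bin/chromium",
    "/usr/bin/chromium-browser",
    "/opt/homebrew/bin/chromium",
  ],
  darwin: [
    "/Applications/Chromium.app/Contents/MacOS/Chromium",
    "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome",
  ],
};

export function resolveChromium(
  platform: string,
  env: Record<string, string | undefined>,
): string | undefined {
  for (const candidate of CANONICAL_PATHS[platform] ?? []) {
    if (existsSync(candidate)) return candidate; // first existing path wins
  }
  // No canonical hit: fall back to the explicit override, which may be
  // undefined; callers decide how to report that.
  return env["AHD_CHROMIUM_PATH"];
}
```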
Path 3 · As a library
If you only want the linter in CI, install AHD as a dev dependency. No browser, no API key.
```shell
npm install --save-dev ahd
npx ahd lint dist/**/*.html dist/**/*.css
```
Environment variables
- AHD_CHROMIUM_PATH: Path to a Chromium or Chrome binary. Required for ahd critique, ahd critique-url and any flow that renders a screenshot. Set automatically by the Nix devShell.
- ANTHROPIC_API_KEY: Required for --critic anthropic, which is the default. Pass --critic mock to skip the live call for local development.
- CF_API_TOKEN + CF_ACCOUNT_ID: Required for any cf: or cfimg: model spec (Workers AI text and image models).
- OPENAI_API_KEY, GEMINI_API_KEY, GOOGLE_API_KEY: Only needed for the corresponding gpt-*, o*, gemini-* model specs in ahd eval-live.
- CF_AI_GATEWAY: Optional. Set to <account>/<gateway> to route frontier providers through Cloudflare's AI Gateway for caching, spend tracking and rate-limit observability.
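Which variables a given run needs can be checked before the run starts. A minimal preflight sketch, mirroring the requirements listed above; the feature keys and helper name are assumptions, not AHD's API:

```typescript
// Map each feature to the environment variables the list above says it
// needs. Keys and helper name are illustrative only.
const REQUIREMENTS: Record<string, string[]> = {
  "lint": [], // source linter: no browser, no API key
  "critique": ["AHD_CHROMIUM_PATH", "ANTHROPIC_API_KEY"],
  "workers-ai": ["CF_API_TOKEN", "CF_ACCOUNT_ID"],
};

export function missingEnv(
  feature: string,
  env: Record<string, string | undefined>,
): string[] {
  // Report every required variable that is unset or empty.
  return (REQUIREMENTS[feature] ?? []).filter((name) => !env[name]);
}
```

Failing fast on the returned list is cheaper than discovering a missing key halfway through a CI job.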
Per-project .ahd.json
Projects that consume AHD can ship a .ahd.json at
their repo root. The file declares rule severity overrides, and
every override is required to carry a reason string.
Silent disables are rejected at load time: AHD's position is that
a project should be able to say "no, that rule does not apply to
us" and explain why, the same way a linter's
// eslint-disable comment carries a pragma.
.ahd.json at the project root:

```json
{
  "project": "dispatch",
  "overrides": [
    {
      "ruleId": "ahd/require-type-pairing",
      "severity": "warn",
      "reason": "Dispatch uses a deliberate single-family aesthetic across subpages."
    }
  ]
}
```

ahd lint auto-discovers .ahd.json or
ahd.config.json in the current working directory.
Pass --config <path> to point at a different
file. Every applied override is printed in the lint footer and
recorded in the JSON report, so a reviewer can see which rules
the project chose not to enforce and the declared reason.
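The reason requirement can be pictured as a load-time check along these lines. This is a sketch under assumed names and types, not AHD's actual validation code:

```typescript
// Shape of one entry in the "overrides" array of .ahd.json (assumed).
interface Override {
  ruleId: string;
  severity: "off" | "warn" | "error";
  reason?: string;
}

// Reject silent disables: every override must carry a non-empty reason,
// mirroring the load-time rule described above.
export function validateOverrides(overrides: Override[]): string[] {
  const errors: string[] = [];
  for (const o of overrides) {
    if (!o.reason || o.reason.trim() === "") {
      errors.push(
        `override for ${o.ruleId} has no reason; silent disables are rejected`,
      );
    }
  }
  return errors;
}
```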
GitHub Actions
AHD ships four example workflows in
.github/workflows/. Copy the one that matches the
integration you want.
- ahd-lint.example.yml: Lint as a CI gate. No deploy, no provenance file. The cheapest integration; runs ahd lint on every push and fails the build on any error-severity rule.
- ahd-nix.example.yml: Same as above but inside the flake devShell, so it picks up the pinned Node, TypeScript and Chromium. Use when the project already uses Nix.
- ahd-deploy.example.yml: The canonical deploy workflow for an Astro / Vite / Next site that wants AHD provenance on every push. Plain Node. npm run build is expected to end in node scripts/ahd-provenance.mjs, which writes dist/ahd.json. Wrangler deploys to Cloudflare Pages. Source lint only; the vision critic does not run here.
- ahd-deploy-with-vision.example.yml: The full pipeline. Uses Determinate Systems' nix-installer-action + magic-nix-cache-action to load the flake devShell (Node, TypeScript, Chromium) in seconds on each run. Serves the built dist on localhost, runs ahd critique-url against the preview, fails the job on any vision-rule fire, then deploys. Recommended only when the project's aesthetic contract justifies gating on rendered-pixel slop on every push.
Both deploy workflows need CLOUDFLARE_API_TOKEN in
secrets. Recommended: an Account API
Token (prefix cfat_) scoped narrowly to
Cloudflare Pages: Edit on the one account that
hosts your Pages project. This narrows the blast radius if the
token leaks, and because the token carries account context,
accountId is optional. A User API Token (prefix
cfut_) also works, but because it spans multiple
accounts it needs accountId supplied alongside.
The examples populate accountId either way. The
account ID itself is not sensitive (it appears in dashboard URLs)
so you can store it in either secrets or repository variables.
The vision workflow also needs ANTHROPIC_API_KEY.
Dependabot
The AHD repo ships a .github/dependabot.yml that
bumps nix flake inputs, npm
dependencies and github-actions versions weekly.
Dependabot's native Nix flake support lets flake inputs move in
the same PR cadence as npm. Copy the file into your own repo if
you want the same weekly cadence.
Verify your install
One command exercises the source linter, the taxonomy, the token library and the MCP surface in one pass. No API keys, no browser, no network.
npx ahd try briefs/landing.yml
Writes out/landing.html using the offline
mock-swiss runner, lints it in place, prints the
report. If that works, the install is clean.
Adjacent reading: usage, positioning, verify a site's provenance.