CodeCanary
Catch bugs, security issues, and quality problems before they land in main. Runs on every push or locally from the terminal.
Model              Input   Output   Cost
claude-haiku-4-5   842     156      $0.0003
Duration: 1.8s · PR size: +24/-8 lines, 3 files
Built for real workflows
Not another AI gimmick. A review tool that fits into how you already work.
GitHub Actions native
Runs as a composite action on every push. Posts inline comments on exact diff lines. Zero config after setup.
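The on-push wiring can be sketched as a minimal workflow. This is an illustrative fragment under stated assumptions: the action reference, input name, and secret name are hypothetical placeholders, not CodeCanary's published action.

```yaml
# .github/workflows/codecanary.yml — illustrative sketch, not the shipped workflow
name: CodeCanary
on: [push]
permissions:
  pull-requests: write   # required to post inline review comments
jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: your-org/codecanary-action@v1   # hypothetical action reference
        with:
          api-key: ${{ secrets.ANTHROPIC_API_KEY }}   # hypothetical input/secret names
```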
Incremental reviews
Go-driven triage classifies existing threads at zero LLM cost. Only changed code gets re-evaluated.
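The zero-LLM-cost triage step can be sketched in Go. The types and field names below are illustrative assumptions, not CodeCanary's actual internals: the idea is simply that a thread only needs re-evaluation when it overlaps a changed hunk in the new diff.

```go
package main

import "fmt"

// Hunk marks a changed line range in the new diff (hypothetical type).
type Hunk struct {
	File       string
	Start, End int
}

// Thread is an existing review thread anchored to a file and line (hypothetical type).
type Thread struct {
	File string
	Line int
}

// needsReeval reports whether a thread touches any changed hunk.
// Threads outside the new diff are kept as-is, costing no LLM calls.
func needsReeval(t Thread, hunks []Hunk) bool {
	for _, h := range hunks {
		if h.File == t.File && t.Line >= h.Start && t.Line <= h.End {
			return true
		}
	}
	return false
}

func main() {
	hunks := []Hunk{{File: "api/handler.go", Start: 40, End: 55}}
	fmt.Println(needsReeval(Thread{File: "api/handler.go", Line: 42}, hunks)) // true: inside a hunk
	fmt.Println(needsReeval(Thread{File: "api/router.go", Line: 10}, hunks))  // false: untouched file
}
```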
Conversational
When authors reply to a finding, CodeCanary re-evaluates in context. It understands fixes, dismissals, and rebuttals.
Anti-hallucination
Explicit file allowlists, line validation against the diff, and distance thresholds prevent fabricated findings.
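A minimal sketch of that validation logic in Go, assuming a hypothetical shape for the diff data (CodeCanary's real implementation is not shown here): findings in files outside the allowlist are rejected, and reported lines are snapped to the nearest changed line only if it lies within a distance threshold.

```go
package main

import "fmt"

// validateFinding checks a model-reported (file, line) against the diff.
// changed maps allowlisted files to their changed line numbers; findings in
// other files are rejected outright. A reported line is snapped to the
// nearest changed line, or rejected when none is within maxDist.
// Hypothetical sketch, not CodeCanary's actual code.
func validateFinding(file string, line int, changed map[string][]int, maxDist int) (int, bool) {
	lines, ok := changed[file]
	if !ok {
		return 0, false // file not in the diff allowlist
	}
	best, bestDist := 0, maxDist+1
	for _, l := range lines {
		d := l - line
		if d < 0 {
			d = -d
		}
		if d < bestDist {
			best, bestDist = l, d
		}
	}
	if bestDist > maxDist {
		return 0, false // no changed line close enough: likely fabricated
	}
	return best, true
}

func main() {
	changed := map[string][]int{"db/query.go": {12, 13, 14, 30}}
	fmt.Println(validateFinding("db/query.go", 15, changed, 2)) // snaps to line 14
	fmt.Println(validateFinding("made/up.go", 5, changed, 2))   // rejected: file not in diff
}
```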
Cost-efficient
Fast triage model for thread re-evaluation, full model for review. Tracks per-invocation usage so you see what you spend.
Multi-provider
Bring your own LLM: Anthropic, OpenAI, OpenRouter, or Claude CLI. No vendor lock-in. New providers are easy to add.
Auto-resolution
When code is fixed, threads are auto-resolved. No stale reviews cluttering your PRs.
Local reviews
Review your changes from the terminal before pushing. Same engine, same findings, instant feedback.
Configuration-as-code
Project-specific rules, severity levels, ignore patterns, and context. Checked into your repo alongside the code.
How it works
Three commands. Automated reviews from there.
Install
One curl command installs the codecanary binary. Supports Linux and macOS, amd64 and arm64.
Setup
The interactive wizard configures your provider, stores your API key in the system keychain, and creates the config file.
Review
Run locally for instant feedback, or merge the GitHub Actions workflow for automated reviews on every push.
Configuration-as-code
Define rules, context, and ignore patterns in .codecanary/config.yml. Checked into your repo.
version: 1
provider: anthropic
review_model: claude-sonnet-4-6
triage_model: claude-haiku-4-5-20251001
context: |
  Go REST API using chi router. Tests use testify.
rules:
  - id: error-handling
    description: "Errors must be wrapped with context"
    severity: warning
    paths: ["**/*.go"]
  - id: sql-injection
    description: "Queries must use parameterized statements"
    severity: critical
ignore:
  - "vendor/**"
  - "*.lock"
Bring your own LLM
No vendor lock-in. Pick the provider that works for you.
provider: openai
review_model: gpt-5.4

Adding a new provider →
Start reviewing in 30 seconds
Install, setup, review. That's it.