AnswerLens

AnswerLens is a CLI-first AI visibility auditor for product websites: CI for AI discoverability.

Language: English / 简体中文

Starter bundle

The starter bundle is the public adoption asset for external repositories: the small set of config and workflow files a repository copies in to run AnswerLens.

Use this page when you want to explain the AnswerLens GitHub Action path before sending someone into raw repo files. It keeps the external layout, artifact order, runtime defaults, and next step visible in one place.

External repo shape

Copy this layout

.github/
  answerlens/
    brand.yaml
    competitors.yaml
    prompts.yaml
    runtime.yaml
  workflows/
    answerlens.yml

This is the same layout used by examples/consumer-repo.
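The workflow file itself stays small. Below is a hedged sketch of what .github/workflows/answerlens.yml could look like; the action reference answerlens/action@v1, the input name config-dir, the secret name ANSWERLENS_API_KEY, and the output path are illustrative assumptions, not the published contract. Copy the real workflow from examples/consumer-repo.

```yaml
# Hypothetical sketch of .github/workflows/answerlens.yml.
# The action reference, input name, secret name, and output path below
# are assumptions; the canonical workflow lives in examples/consumer-repo.
name: answerlens
on:
  pull_request:
  workflow_dispatch:

jobs:
  audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: answerlens/action@v1          # assumed action reference
        with:
          config-dir: .github/answerlens    # assumed input name
        env:
          ANSWERLENS_API_KEY: ${{ secrets.ANSWERLENS_API_KEY }}  # key stays in secrets
      - uses: actions/upload-artifact@v4
        with:
          name: answerlens-artifacts
          path: answerlens-out/             # assumed output directory
```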

File roles

What each file does

  • brand.yaml: product name, domain, proof-page hints, and optional site_display_name.
  • competitors.yaml: the declared comparison set for the category you actually sell into.
  • prompts.yaml: buyer, comparison, and citation questions for your real audience.
  • runtime.yaml: non-secret eval defaults for provider, model, locale, samples, timeout, and optional base URL.
  • answerlens.yml: the GitHub Action workflow that runs the same artifact contract in CI.

Keep API keys in GitHub secrets or local environment variables, not in runtime.yaml.
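As an illustration, a runtime.yaml following the field list above might look like the sketch below. The exact key names and accepted values are assumptions here, so check the bundled example files for the real schema.

```yaml
# Hypothetical runtime.yaml sketch; key names and values are assumptions.
# Only non-secret eval defaults belong here. The API key is read from the
# environment (or GitHub secrets in CI), never from this file.
provider: openai
model: gpt-4o-mini
locale: en-US
samples: 3
timeout_seconds: 60
# base_url: https://example-proxy.internal/v1   # optional override
```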

Review flow

Artifact review order

  1. share-summary.md
  2. scorecard.md
  3. recommendations.md

Then use pr-snippet.md for ready-to-paste GitHub copy and run.json for machine-readable run metadata.
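The review order above can be scripted. The sketch below assumes a finished run wrote its artifacts into a local output directory; the ./answerlens-out path and the placeholder file it seeds are assumptions for illustration only.

```shell
#!/bin/sh
# Walk the artifacts in review order, flagging any a run did not produce.
# The ./answerlens-out directory name is a hypothetical example, and the
# placeholder summary file only exists so the loop has something to find.
out=./answerlens-out
mkdir -p "$out"
printf '# Demo summary\n' > "$out/share-summary.md"   # placeholder for the demo
for f in share-summary.md scorecard.md recommendations.md pr-snippet.md run.json; do
  if [ -f "$out/$f" ]; then
    echo "review: $out/$f"
  else
    echo "missing: $out/$f"
  fi
done
```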

Public proof

Starter example run

Example site: Example Product public site

This public example uses the consumer-repo starter bundle against the stable fixture so external adopters can inspect the resulting artifacts before wiring their own site.

Activation path

What to do next

  1. Run a 5-minute real-site audit if you have not done that yet.
  2. Copy the starter files into the repository you want to audit.
  3. Set non-secret eval defaults in runtime.yaml and keep API keys in secrets.
  4. If you want the lowest-friction first eval before hand-tuning fields, start with profile: fast-first-eval.
  5. If you already have one readable OpenAI baseline and want a search-shaped second opinion, use profile: perplexity-cross-check as a temporary override.
  6. Move into the GitHub Action path when the local run already feels reviewable.
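The two profile options in steps 4 and 5 amount to a one-line change in runtime.yaml. A minimal sketch, assuming profile is a top-level runtime.yaml key:

```yaml
# Hypothetical runtime.yaml fragment; the profile key and its handling
# are assumptions based on the activation steps above.
profile: fast-first-eval            # lowest-friction first eval
# profile: perplexity-cross-check   # temporary override: a search-shaped
                                    # second opinion next to an OpenAI baseline
```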

That keeps the starter bundle positioned as proof of adoption readiness, not as a separate product surface.

What this connects to

Related proof pages

  • Examples: see the live demo artifact set first.
  • Docs: activation references, scoring notes, and canonical Markdown.
  • Integrations: see how the starter bundle fits into the GitHub-native workflow.
  • Releases: use release assets as the second public front door.