Integrations
AnswerLens integrations stay GitHub-native and artifact-backed.
The integration surface is intentionally narrow: keep the core audit contract stable, add eval providers when you need them, and layer validation imports on top without turning the project into a dashboard-first SaaS.
What ships now
Current integration surfaces
| Integration surface | What it does |
| --- | --- |
| GitHub Action | Runs the same artifact contract in pull requests and `workflow_dispatch` runs, and uploads the resulting artifacts. |
| OpenAI and Perplexity eval | Adds eval-mode benchmarking when you want answer-quality checks on top of audit. |
| Search Console import | Validates key-page evidence against imported page-level Search Console exports. |
| Bing / IndexNow helper | Adds helper-mode validation and candidate-URL preparation without live submission. |
| Release assets and Pages | Turns demo outputs and docs into reusable public distribution surfaces. |
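The Action row above can be sketched as a minimal workflow. Everything here is illustrative: the action reference, input names, and output path are hypothetical placeholders, not the published contract; only `actions/checkout` and `actions/upload-artifact` are real upstream actions.

```yaml
# Hypothetical sketch of wiring the audit contract into CI.
# The answerlens action reference, its inputs, and the output path
# are placeholders -- check the starter bundle for the real layout.
name: answerlens-audit
on:
  pull_request:
  workflow_dispatch:

jobs:
  audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder for the AnswerLens audit step (hypothetical reference).
      - uses: answerlens/answerlens-action@v1
        with:
          mode: audit
      # Upload the audit artifacts so reviewers can inspect them in the run.
      - uses: actions/upload-artifact@v4
        with:
          name: answerlens-report
          path: answerlens-out/   # hypothetical output path
```

The point of the sketch is the shape, not the names: one audit step producing artifacts, one upload step making them reviewable, the same contract in pull requests and manual dispatches.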
Suggested path
How teams usually adopt
1. Open the live demo report.
2. Run the 60-second fixture demo.
3. Run one real-site audit locally.
4. Move the same artifact contract into the GitHub Action.
That sequencing keeps integrations understandable and reviewable instead of turning each surface into a separate product.
External adoption
Starter bundle
The external-repo path is public and copyable, not hidden in internal fixtures.
Use the starter bundle overview when you need a citable explanation of the `.github/answerlens/` layout before handing someone the raw example files.
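As a rough sketch of what that layout might look like, the file names below are purely hypothetical; the starter bundle overview documents the real one.

```
.github/
├── answerlens/
│   ├── config.yml      # hypothetical audit configuration
│   └── key-pages.txt   # hypothetical key-page evidence list
└── workflows/
    └── answerlens.yml  # workflow wiring the audit into CI
```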
That keeps the Action path legible for forks, releases, and external setup guides.
What this connects to
Related proof pages
- FAQ: answer first-run workflow questions.
- Compare: explain how the GitHub-native path differs from dashboard-first products.
- Pricing: explain where Action and eval usage create variable cost.
- Security: explain secret handling and review trail expectations.
- Starter bundle: show the external-repo layout and artifact review order.
- Developer advocacy teams: connect integrations to docs and examples.