Every chat response returns a verdict — SAFE (grounded), PARTIAL (some claims unverified), UNVERIFIED (no sources matched), or CONFLICT (contradicts a source). The verdict is a pill you click to open the evidence drawer. No black-box "trust me".
Every answer from the chat, every inline suggestion, carries a verdict pill and a trust score. Click the pill to open a drawer that lists each claim, which source chunk supports it, and where the model drifted. When the shape looks hallucinated — confident absolutes, invented citations, suspiciously precise numbers — the verdict downgrades before the answer ever reaches you.
Free tier 500 requests/month · no credit card · 30-second install
A numeric support_score from 0 to 1 measures the fraction of claims in the answer that are supported by the retrieved sources, independent of the model's stated confidence. The same score the API returns is what the extension surfaces — no dumbing down.
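As a rough sketch of that definition (illustrative only: `Claim` and `supportScore` are hypothetical names, not Wauldo's API):

```typescript
// Illustrative sketch, not the actual Wauldo implementation:
// support_score = supported claims / total claims.
type Claim = { text: string; supported: boolean };

function supportScore(claims: Claim[]): number {
  if (claims.length === 0) return 0; // no claims to verify
  const supported = claims.filter((c) => c.supported).length;
  return supported / claims.length;
}
```

Note the score says nothing about how confidently the model phrased the answer; it only counts claims the retrieved sources actually back.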
An internal classifier reads the shape of the answer — absolute language, fake citations like [Smith 2024], suspiciously precise statistics — and downgrades the verdict if the shape contradicts the grounding. F1 of 0.97 on a 50-example offline benchmark.
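A crude heuristic version of "reading the shape" could look like this (a hypothetical sketch for intuition, not Wauldo's classifier; the regexes and function name are invented):

```typescript
// Hypothetical shape heuristic: flag answers whose surface form
// suggests hallucination, regardless of their content.
const ABSOLUTE = /\b(always|never|guaranteed|definitely)\b/i; // confident absolutes
const FAKE_CITATION = /\[[A-Z][a-z]+ \d{4}\]/;               // e.g. [Smith 2024]
const PRECISE_STAT = /\b\d+\.\d{2,}%/;                       // e.g. 73.42%

function looksHallucinated(answer: string): boolean {
  return (
    ABSOLUTE.test(answer) ||
    FAKE_CITATION.test(answer) ||
    PRECISE_STAT.test(answer)
  );
}
```

The real classifier is a trained model rather than a regex list, but the signal it keys on is the same: form that outruns the grounding.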
Ghost-text suggestions as you type. Configurable per-language. Fast path: 15s timeout, streams through the router, cancelled the moment you keep typing. When the completion cites external facts, you see the verdict on the companion chat message.
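The cancelled-when-you-keep-typing behaviour is the standard AbortController pattern (an illustrative sketch, not Wauldo's implementation; `CompletionGate` is a hypothetical name):

```typescript
// Illustrative cancel-on-keystroke pattern: each new keystroke
// aborts the in-flight completion request before starting a new one.
class CompletionGate {
  private controller: AbortController | null = null;

  /** Start a new request, cancelling any in-flight one. */
  begin(): AbortSignal {
    this.controller?.abort();       // user kept typing: drop the old request
    this.controller = new AbortController();
    return this.controller.signal;  // pass to fetch(url, { signal })
  }
}
```

In practice the returned signal would also be raced against the 15-second timeout before the request is given up on.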
Command palette or sidebar. Every reply renders a verdict pill. Click it: full breakdown of claims (supported / contradicted / unverified) with the source chunk excerpt and a similarity score. The drawer is the same JSON the API returns, rendered natively.
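The drawer payload might look roughly like this (a hypothetical shape: the field names are assumptions based on the description above, not Wauldo's documented schema):

```typescript
// Hypothetical verdict payload — the types and example values below are
// illustrative assumptions, not Wauldo's published API schema.
interface ClaimEvidence {
  claim: string;
  status: "supported" | "contradicted" | "unverified";
  source_chunk?: string; // excerpt from the matched source
  similarity?: number;   // 0..1 similarity to that chunk
}

interface Verdict {
  verdict: "SAFE" | "PARTIAL" | "UNVERIFIED" | "CONFLICT";
  support_score: number; // fraction of supported claims
  claims: ClaimEvidence[];
}

const example: Verdict = {
  verdict: "PARTIAL",
  support_score: 0.5,
  claims: [
    {
      claim: "All endpoint responses are JSON.",
      status: "supported",
      source_chunk: "Every endpoint returns a JSON body.",
      similarity: 0.91,
    },
    { claim: "The rate limit is 1000 rps.", status: "unverified" },
  ],
};
```

Whatever the exact field names, the point stands: the drawer is a native rendering of the raw API response, not a summary of it.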
Attach open files, the active selection, or a workspace folder as context. The extension streams them to the tenant-scoped RAG index on the backend, then queries with your prompt. Nothing persisted beyond your session unless you explicitly upload.
Marketplace install points to the RapidAPI gateway by default (BYO key). Enterprise / self-host users can switch to a direct api.wauldo.com endpoint with a tig_live_* bearer and hit the same backend — same features, same verdicts.
# 1. Install the extension
code --install-extension wauldo.wauldo-code
# 2. Grab a free key — BASIC tier, 500 requests/month
open https://wauldo.com/pricing
# 3. Paste the key in the extension: "Wauldo: Configure API Key"
# → Done. Open a file. Type. The verdict pill does the rest.
Keys are stored in the OS keychain via VS Code SecretStorage — never written to settings.json.
500 requests / month. Full verdict pill, full drawer, full shape-downgrade. No feature gating — just a monthly quota. Good for prototypes, weekend projects, evaluation.
10,000 requests / month. Solo developers shipping real features. Auto-rollover on quota exhaustion with clear in-extension messaging — no silent failures.
100,000 requests / month. Recommended. Small teams, heavy RAG use, CI integrations. Pay-per-request above quota at $0.008 / request (MEGA overage plan).
Full pricing table → · Enterprise / self-host off-platform: contact@wauldo.com
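The MEGA overage math works out like this (a worked sketch using the quota and per-request rate quoted above; `overageCost` is an illustrative helper, not a billing API):

```typescript
// MEGA tier: 100,000 requests included, $0.008 per request beyond that.
const QUOTA = 100_000;
const OVERAGE_RATE = 0.008; // USD per request above quota

function overageCost(requests: number): number {
  return Math.max(0, requests - QUOTA) * OVERAGE_RATE;
}

// 120,000 requests in a month → 20,000 over quota → $160 overage
```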
All three produce answers. None report how grounded the answer is. Wauldo's extension surfaces the same support score and claim-level drawer the API returns, so you can tell a hedged, well-grounded suggestion apart from a confidently hallucinated one — without running the code first. For everything else — autocomplete speed, IDE integration, language coverage — it's competitive, not differentiated.
Yes. The extension is dual-mode: point wauldo.apiUrl to your own api.wauldo.com equivalent and supply a tig_live_* bearer. The backend is a Rust workspace — deployable on Fly.io, Render, or bare metal. Enterprise licensing and deployment support off-platform.
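A self-host switch might look like this in `settings.json` (the URL is a placeholder; `wauldo.apiUrl` is the setting named above):

```jsonc
{
  // Point the extension at your own deployment (placeholder URL).
  "wauldo.apiUrl": "https://api.internal.example.com"
  // The tig_live_* bearer is entered via "Wauldo: Configure API Key"
  // and stored in the OS keychain — never in this file.
}
```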
By default: your prompt, any files you attach or select, and metadata (language, request ID). Nothing else is read from your workspace. API keys live in VS Code SecretStorage (OS keychain), not in settings. Telemetry is off by default — first launch shows a modal asking explicit consent before any anonymous usage event is sent.
Streaming chat: no perceived latency — tokens arrive as usual, the verdict pill fills in once the response is complete. Inline completions: the source-grounded check runs in parallel with the generation, so the suggestion appears as fast as any other assistant. The shape-downgrade classifier is sub-millisecond after warmup.
The extension fails loudly, not silently. Invalid key: clear error with a link to paste a new one. API down: the status pill turns yellow and the chat shows a retry banner. No cached answers served with stale verdicts — a wrong score is worse than no score.
Free tier. 500 verifications/month. No credit card. 30-second install.
$ code --install-extension wauldo.wauldo-code