Integrations
Pretense is a drop-in proxy. Connect it to any AI coding tool by changing one setting.
Claude Code
Recommended. Route every Claude Code session through Pretense by setting a single environment variable. All prompts are mutated before reaching the Anthropic API.
1. Install and start the Pretense proxy: pretense start
2. Set ANTHROPIC_BASE_URL in your shell profile or per-session
3. Run Claude Code as normal -- mutation happens transparently
# Option 1: per-session
ANTHROPIC_BASE_URL=http://localhost:9339 claude

# Option 2: export in ~/.zshrc or ~/.bashrc (recommended)
export ANTHROPIC_BASE_URL=http://localhost:9339
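The difference between the two options comes down to variable scope; a quick way to see it in plain POSIX shell, with no Pretense install required:

```shell
# The per-session form exports the variable to that one command only.
ANTHROPIC_BASE_URL=http://localhost:9339 sh -c 'echo "child sees: $ANTHROPIC_BASE_URL"'

# The surrounding shell is untouched -- prints empty, unless you have
# already exported the variable in your profile.
echo "parent sees: '${ANTHROPIC_BASE_URL}'"
```

Use the export form when you want every tool launched from that shell to route through the proxy.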
MCP Server (Claude Code Tools)
New. Register the Pretense MCP server in Claude Code to unlock 4 native tools: scan, mutate, status, and audit. Claude can invoke these tools directly inside any conversation without leaving your editor.
1. Install the MCP server: npx @pretense/mcp-server or pretense mcp install
2. Add the server to your claude_desktop_config.json (see snippet below)
3. Reload Claude Code -- the pretense_scan, pretense_mutate, pretense_status, and pretense_audit tools appear automatically
4. Ask Claude to scan a file or mutate code before sending it to any LLM
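Under the hood, MCP clients invoke tools with JSON-RPC `tools/call` requests. A sketch of the message Claude Code sends when it calls the scan tool -- the tool name comes from the list above, but the `path` argument is a hypothetical example, not a documented schema:

```python
import json

# A JSON-RPC 2.0 request as sent by an MCP client when a tool is invoked.
# "tools/call" and the jsonrpc/id/params envelope come from the MCP spec;
# the arguments payload here is purely illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "pretense_scan",
        "arguments": {"path": "src/config.ts"},  # hypothetical argument
    },
}
print(json.dumps(request, indent=2))
```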
// ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "pretense": {
      "command": "npx",
      "args": ["@pretense/mcp-server"],
      "env": { "PRETENSE_API_KEY": "your-key-here" }
    }
  }
}

Cursor
Override the model base URL in Cursor settings to route requests through Pretense. Works with GPT-4o, Claude, and any OpenAI-compatible model Cursor supports.
1. Open Cursor and go to Settings (Cmd+,)
2. Navigate to Models, then find "Override Base URL"
3. Set the value to http://localhost:9339
4. Your existing API key works unchanged
# Settings > AI > Override Base URL
http://localhost:9339

# Or set the env var before launching Cursor
OPENAI_BASE_URL=http://localhost:9339 cursor
OpenAI SDK
Set a single environment variable before any Node.js or Python script that uses the OpenAI SDK. No code changes required.
1. Start the Pretense proxy: pretense start
2. Set OPENAI_BASE_URL to http://localhost:9339
3. Run your script as normal -- the SDK routes through Pretense automatically
# Node.js / shell
OPENAI_BASE_URL=http://localhost:9339 node your-script.js
# Python
OPENAI_BASE_URL=http://localhost:9339 python your-script.py
# Or in code (TypeScript)
import OpenAI from 'openai';
const client = new OpenAI({
  baseURL: 'http://localhost:9339',
  apiKey: process.env.OPENAI_API_KEY,
});

LangChain
Pass the Pretense proxy URL as the base_url when initializing any LangChain LLM or ChatModel. Works with LangChain Python and LangChain.js.
1. Start the Pretense proxy: pretense start
2. Pass base_url to your LLM initializer
3. All chain and agent calls are now protected
# Python (LangChain + OpenAI)
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:9339",
    api_key=os.environ["OPENAI_API_KEY"],
)

# Python (LangChain + Anthropic)
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    anthropic_api_url="http://localhost:9339",
    api_key=os.environ["ANTHROPIC_API_KEY"],
)

GitHub Actions
Block PRs that contain secrets or unprotected AI calls before they reach your main branch. The Pretense CLI integrates with any CI system that runs Node.js.
1. Add pretense to devDependencies or use npx
2. Add a scan step to your workflow YAML
3. The step exits non-zero if secrets are detected, failing the build
4. Export scan-report.json as a CI artifact for compliance evidence
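The fail-the-build behavior in step 3 is plain exit-code convention: CI marks a step failed whenever its process exits non-zero. A minimal sketch of that gating logic -- the findings shape is illustrative, not the scanner's real output format:

```python
import json

def gate(findings: list[dict], report_path: str) -> int:
    """Write the JSON report, then return the process exit code:
    non-zero when any secrets were found, so the CI step fails."""
    with open(report_path, "w") as f:
        json.dump({"findings": findings}, f, indent=2)
    return 1 if findings else 0

# Illustrative finding, shaped loosely like a secret-scanner hit.
findings = [{"file": "src/config.ts", "rule": "aws-access-key"}]
print("exit code:", gate(findings, "scan-report.json"))   # non-zero -> step fails
print("exit code:", gate([], "scan-report.json"))          # zero -> step passes
```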
# .github/workflows/pretense-scan.yml
name: Pretense Scan
on: [pull_request]
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Pretense
        run: curl -fsSL https://pretense.ai/install.sh | sh
      - name: Scan for secrets
        run: pretense scan ./src --format json --output scan-report.json
      - name: Upload scan report
        uses: actions/upload-artifact@v4
        with:
          name: pretense-scan-report
          path: scan-report.json

VS Code Extension
Extension. The Pretense VS Code extension adds a status bar indicator, a scan command, and a mutation map viewer. Configure the proxy URL in your workspace settings.
1. Search for "Pretense" in the VS Code Extensions panel and install
2. Add pretense.proxyUrl to your VS Code settings.json
3. Reload VS Code -- the status bar shows live mutation count and proxy status
4. Use Cmd+Shift+P and type "Pretense: Scan" to scan the active file
// .vscode/settings.json
{
  "pretense.proxyUrl": "http://localhost:9339",
  "pretense.autoScanOnSave": true,
  "pretense.showStatusBar": true
}

Need a different integration?
If your tool is not listed, email us and we will add official support. Pretense speaks the OpenAI-compatible API format, so most tools already work.
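"OpenAI-compatible" means the proxy accepts the standard request shape at the standard routes, so any client that can aim a chat-completions request at a different host can sit behind Pretense. A sketch that builds (but does not send) such a request -- the /v1/chat/completions route is assumed from the OpenAI wire format:

```python
import json
import os
import urllib.request

# Standard chat-completions payload, aimed at the local proxy instead of
# api.openai.com. Nothing is sent here; we only construct the request.
body = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    "http://localhost:9339/v1/chat/completions",
    data=json.dumps(body).encode(),
    headers={
        "Authorization": "Bearer " + os.environ.get("OPENAI_API_KEY", "sk-placeholder"),
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```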