# Providers

testpilot-ai supports multiple LLM providers through aiclientjs.
## Supported Providers
| Provider | Models | Env Variable | Quality |
|---|---|---|---|
| OpenAI | gpt-4o, gpt-4o-mini, o1, etc. | OPENAI_API_KEY | Excellent |
| Anthropic | claude-sonnet-4-20250514, claude-haiku, etc. | ANTHROPIC_API_KEY | Excellent |
| Google | gemini-pro, gemini-1.5-pro, etc. | GOOGLE_API_KEY | Very Good |
| Ollama | llama3, codellama, mistral, etc. | None (local) | Good |
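The env variables in the table above lend themselves to a small selection helper. This is a sketch, not part of testpilot: `pick_provider` is a hypothetical shell function that returns the first provider whose API key is set in the environment, falling back to local Ollama.

```bash
# Hypothetical helper (not part of testpilot): pick the first provider
# whose API key env variable is set; Ollama needs no key, so it is the
# fallback.
pick_provider() {
  if   [ -n "${OPENAI_API_KEY:-}" ];    then echo openai
  elif [ -n "${ANTHROPIC_API_KEY:-}" ]; then echo anthropic
  elif [ -n "${GOOGLE_API_KEY:-}" ];    then echo google
  else echo ollama
  fi
}

# Usage:
# npx testpilot src/utils.ts --provider "$(pick_provider)"
```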
## OpenAI

```bash
export OPENAI_API_KEY=sk-...
npx testpilot src/utils.ts --provider openai --model gpt-4o
```

Recommended models:

- `gpt-4o` — Best quality, higher cost
- `gpt-4o-mini` — Good balance of quality and cost
## Anthropic

```bash
export ANTHROPIC_API_KEY=sk-ant-...
npx testpilot src/utils.ts --provider anthropic --model claude-sonnet-4-20250514
```

Recommended models:

- `claude-sonnet-4-20250514` — Excellent for code generation
- `claude-haiku` — Faster, lower cost
## Google

```bash
export GOOGLE_API_KEY=...
npx testpilot src/utils.ts --provider google --model gemini-1.5-pro
```
## Ollama (Local)

No API key needed. Install Ollama and pull a model:

```bash
ollama pull llama3
npx testpilot src/utils.ts --provider ollama --model llama3
```

Recommended local models:

- `llama3` — Good general-purpose code understanding
- `codellama` — Optimized for code tasks
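Model pulls can be slow, so you may want to skip them when the model is already available locally. A minimal sketch, assuming the Ollama CLI is installed and the daemon is running; `ensure_model` is a hypothetical helper name, not an Ollama subcommand.

```bash
# Hypothetical helper: pull a model only if `ollama list` does not
# already show it. Assumes the Ollama CLI and daemon are available.
ensure_model() {
  model="$1"
  # `ollama list` prints a header row, then one model per line (NAME first)
  if ! ollama list 2>/dev/null | awk 'NR > 1 { print $1 }' | grep -q "^$model"; then
    ollama pull "$model"
  fi
}

# Usage:
# ensure_model llama3
# npx testpilot src/utils.ts --provider ollama --model llama3
```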
:::tip
For the best test quality, use `gpt-4o` or `claude-sonnet-4-20250514`. Local models work but may produce occasional syntax errors or incorrect assertions.
:::
## Passing API Keys

Three ways to provide your API key:

1. **Environment variable** (recommended):

   ```bash
   export OPENAI_API_KEY=sk-...
   ```

2. **CLI flag**:

   ```bash
   npx testpilot src/utils.ts --api-key sk-...
   ```

3. **Config file**:

   ```json
   {
     "provider": "openai",
     "apiKey": "sk-..."
   }
   ```
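If more than one source is set, testpilot has to pick one. The resolution order sketched below (flag over environment over config) is an assumption, not documented behavior, and `testpilot.config.json` is a hypothetical config filename; check your installed version before relying on it.

```bash
# Sketch of an ASSUMED precedence: --api-key flag first, then the
# OPENAI_API_KEY env variable, then the "apiKey" field of a hypothetical
# testpilot.config.json file.
resolve_key() {
  flag_key="$1"   # value passed via --api-key, may be empty
  if [ -n "$flag_key" ]; then
    echo "$flag_key"
  elif [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "$OPENAI_API_KEY"
  else
    # naive JSON extraction for illustration; use a real parser in practice
    sed -n 's/.*"apiKey": *"\([^"]*\)".*/\1/p' testpilot.config.json
  fi
}
```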
:::caution
Never commit API keys to version control. Use environment variables or `.env` files (and add `.env` to `.gitignore`).
:::
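One way to follow that advice, sketched for a POSIX shell. Note that this creates or overwrites a `.env` file in the current directory.

```bash
# Keep the key out of git: write it to .env, ignore .env, then load it.
echo 'OPENAI_API_KEY=sk-...' > .env                  # do not commit this file
grep -qxF '.env' .gitignore 2>/dev/null || echo '.env' >> .gitignore
set -a; . ./.env; set +a                             # export everything in .env
```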