
Getting Started

Installation

npm install -D @amit641/testpilot-ai

Or use it directly with npx:

npx testpilot src/utils.ts

Prerequisites

  • Node.js 18+
  • An API key for your preferred LLM provider (or Ollama for local models)

Set Up Your API Key

# OpenAI
export OPENAI_API_KEY=sk-...

# Anthropic
export ANTHROPIC_API_KEY=sk-ant-...

# Google
export GOOGLE_API_KEY=...

# Ollama — no key needed (local)

Generate Your First Test

Given a file src/math.ts:

export function add(a: number, b: number): number {
  return a + b;
}

export function divide(a: number, b: number): number {
  if (b === 0) throw new Error('Cannot divide by zero');
  return a / b;
}
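As a quick sanity check of what the generated tests should assert, the two functions can be exercised directly (a standalone sketch; the functions are inlined here rather than imported from src/math.ts):

```typescript
// Inlined copies of add and divide from src/math.ts.
function add(a: number, b: number): number {
  return a + b;
}

function divide(a: number, b: number): number {
  if (b === 0) throw new Error('Cannot divide by zero');
  return a / b;
}

console.log(add(2, 3));      // 5
console.log(divide(10, 2));  // 5
try {
  divide(1, 0);
} catch (e) {
  console.log((e as Error).message); // Cannot divide by zero
}
```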

Run:

npx testpilot src/math.ts --provider openai

This generates src/math.test.ts:

import { describe, it, expect } from 'vitest';
import { add, divide } from './math';

describe('add', () => {
  it('adds two positive numbers', () => {
    expect(add(2, 3)).toBe(5);
  });

  it('adds negative numbers', () => {
    expect(add(-1, -2)).toBe(-3);
  });

  it('handles zero', () => {
    expect(add(0, 5)).toBe(5);
  });
});

describe('divide', () => {
  it('divides two numbers', () => {
    expect(divide(10, 2)).toBe(5);
  });

  it('throws on division by zero', () => {
    expect(() => divide(10, 0)).toThrow('Cannot divide by zero');
  });
});

Generate Tests for a Directory

npx testpilot src/helpers/

This generates a test file for each .ts, .tsx, .js, and .jsx source file in the directory, skipping files that are already tests.
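For example, given a helpers directory like this (hypothetical file names, for illustration only), one test file is emitted per source file and existing tests are left alone:

```
src/helpers/
  format.ts       → generates src/helpers/format.test.ts
  parse.ts        → generates src/helpers/parse.test.ts
  parse.test.ts   → skipped (already a test file)
```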

Using Local Models

If you have Ollama installed:

npx testpilot src/utils.ts --provider ollama --model llama3

No API key needed — everything runs locally.