# Testing

The Vitek Plugin includes a comprehensive test suite using Vitest.
## Overview

- Framework: Vitest
- Coverage: v8 provider
- Total Tests: 133
- Coverage Report: View Online
## Running Tests

Run once:

```bash
pnpm test
```

Run in watch mode (development):

```bash
pnpm test:watch
```

Run with coverage:

```bash
pnpm test:coverage
```

## Coverage Thresholds
We maintain high code quality standards with the following coverage thresholds:
| Metric | Minimum | Current |
|---|---|---|
| Lines | 55% | — |
| Functions | 60% | — |
| Branches | 50% | — |
| Statements | 55% | — |
**Note:** Pull requests that reduce coverage below these thresholds will fail CI checks.
## Test Structure

### Unit tests (`src/`)

Unit tests for the plugin core live in `src/` alongside the source code:

```
src/
├── core/
│   ├── introspection/
│   │   └── manifest.test.ts        # getManifest, getRoutes, getSockets, writeManifest
│   ├── middleware/
│   │   └── compose.test.ts         # Middleware composition tests
│   ├── normalize/
│   │   └── normalize-path.test.ts  # Path normalization tests
│   └── routing/
│       ├── route-matcher.test.ts   # Route matching tests
│       └── route-parser.test.ts    # Route parsing tests
├── plugin/
│   ├── vitek-config.test.ts        # Config plugin (optimizeDeps, alias)
│   └── ...
└── shared/
    ├── errors.test.ts              # Error class tests
    └── response-helpers.test.ts    # HTTP response tests
```

### Post-build tests (`examples/`)
Each example includes a `post-build.test.ts` (or `.js`) that runs after `vite build` to verify:

- Generated files exist (`api.services.*`, `api.types.*`, `socket.services.*`)
- `dist/vitek-api.mjs` and `dist/vitek-sockets.mjs` bundles load correctly
- Routes and middlewares are exported as expected
Run `pnpm run build && pnpm test` from each example directory. The `build-and-test.sh` script runs this for all examples (including `rate-limit`).
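As an illustration, the file-existence portion of such a post-build check could be factored into a small helper. This is a hedged sketch: `missingBundles` is a hypothetical name, not part of the actual test files.

```typescript
import { existsSync } from 'node:fs';
import { resolve } from 'node:path';

// Hypothetical helper: returns the expected bundle files that are
// absent from a dist directory after `vite build`.
function missingBundles(distDir: string, expected: string[]): string[] {
  return expected.filter((name) => !existsSync(resolve(distDir, name)));
}

// A post-build test could then assert that nothing is missing, e.g.:
// expect(missingBundles('dist', ['vitek-api.mjs', 'vitek-sockets.mjs'])).toEqual([]);
```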
## E2E test

An end-to-end test builds the plugin and the `basic-js` example, starts `vitek-serve`, and sends GET and POST requests to the API:

```bash
pnpm test:e2e
```

This runs `scripts/e2e.mjs`. Ensure the plugin is buildable and that `examples/basic-js` can be built (e.g. after `pnpm build` at the repo root).
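For orientation, the kind of round-trip such a script performs can be sketched with the global `fetch` API. This is illustrative only, not the contents of `scripts/e2e.mjs`; the `/api/echo` endpoint is a made-up example.

```typescript
// Sketch: send one GET and one POST to a running server and fail on
// any non-2xx status. Assumes Node 18+ (global fetch).
async function checkApi(base: string): Promise<void> {
  const get = await fetch(`${base}/api/health`);
  if (!get.ok) throw new Error(`GET /api/health failed: ${get.status}`);

  const post = await fetch(`${base}/api/echo`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ ping: true }),
  });
  if (!post.ok) throw new Error(`POST /api/echo failed: ${post.status}`);
}
```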
## Benchmark

A simple benchmark script sends many requests to a URL and reports latency (p50, p99) and throughput (req/s):

```bash
pnpm bench
# or with a custom URL and request count:
node scripts/bench.mjs http://127.0.0.1:3000/api/health 5000
# or build basic-js, start the server, run the benchmark, run the example tests, then stop:
pnpm run example:bench
node scripts/bench.mjs --with-example 2000
```

The default URL is `http://127.0.0.1:3000/api/health` and the default count is 1000. Start a server (e.g. from an example) before running the benchmark, or use `pnpm run example:bench` to do everything with the `basic-js` example.
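For reference, p50 and p99 are percentiles over the measured request latencies. A minimal nearest-rank sketch of that computation (illustrative only, not the code in `scripts/bench.mjs`):

```typescript
// Nearest-rank percentile over a list of latency samples (milliseconds).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[Math.min(idx, sorted.length - 1)];
}

const latencies = [12, 8, 15, 9, 11, 30, 10, 9, 13, 200];
console.log(percentile(latencies, 50)); // → 11
console.log(percentile(latencies, 99)); // → 200
```

The p99 here is dominated by the single 200 ms outlier, which is exactly why benchmarks report it alongside the median.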
## Writing Tests

Tests follow the naming convention `[filename].test.ts`.

### Example Test

```typescript
import { describe, it, expect } from 'vitest';
import { parseRouteFile } from './route-parser.js';

describe('parseRouteFile', () => {
  it('should parse a simple GET route', () => {
    const result = parseRouteFile('/api/health.get.ts', '/api');
    expect(result).toEqual({
      method: 'get',
      pattern: 'health',
      params: [],
      file: '/api/health.get.ts',
    });
  });
});
```

## Continuous Integration
Tests run automatically on every Pull Request via GitHub Actions:

- Workflow: `.github/workflows/pr_tests_check.yml`
- Triggers: PR opened, reopened, synchronized
- Requirements: all tests must pass and coverage thresholds must be met
## Coverage Report

The coverage report is automatically generated and deployed to GitHub Pages on every push to main. You can also find the link in the top navigation menu of the documentation.
## Configuration

Test configuration is in `vitest.config.ts`:

```typescript
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    name: 'vitek-plugin',
    globals: true,
    environment: 'node',
    coverage: {
      provider: 'v8',
      reporter: ['text', 'json', 'html'],
      thresholds: {
        lines: 55,
        functions: 60,
        branches: 50,
        statements: 55,
      },
    },
  },
});
```

## Best Practices
- **Write tests for new features**: every new feature should include tests
- **Maintain coverage**: don't let coverage drop below the thresholds
- **Test edge cases**: include tests for error conditions and edge cases
- **Use descriptive names**: test descriptions should clearly state what's being tested
- **Keep tests focused**: one logical assertion per test when possible
## Contributing

When contributing:

- Ensure all tests pass: `pnpm test`
- Check coverage: `pnpm test:coverage`
- Add tests for new functionality
- Don't break existing tests without good reason

See the Contributing Guide for more details.
