# Test Command

Run and manage tests for ElizaOS projects and plugins.
## Usage
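Invocation follows the standard CLI pattern (assuming the ElizaOS CLI is installed and exposed as the `elizaos` binary):

```bash
elizaos test [options] [path]
```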
### Arguments

| Argument | Description |
| --- | --- |
| `[path]` | Optional path to the project or plugin to test |
### Options

| Option | Description |
| --- | --- |
| `-t, --type <type>` | Type of test to run (choices: `component`, `e2e`, `all`; default: `all`) |
| `--port <port>` | Server port for e2e tests |
| `--name <n>` | Filter tests by name (matches file names or test suite names) |
| `--skip-build` | Skip building before running tests |
| `--skip-type-check` | Skip TypeScript type checking for faster test runs |
## Examples
### Basic Test Execution
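Typical invocations look like the following (the plugin path is a placeholder):

```bash
# Run all tests (component and e2e) in the current project
elizaos test

# Run only component tests
elizaos test --type component

# Run only end-to-end tests
elizaos test --type e2e

# Test a specific project or plugin
elizaos test ./packages/my-plugin
```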
### Test Filtering
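The `--name` filter matches file names or test suite names, so either can be passed as the value (the names below are placeholders):

```bash
# Run only tests whose file or suite name matches "database"
elizaos test --name database

# Combine a name filter with a test type
elizaos test --type component --name auth
```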
### Advanced Options
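These flags are mainly useful for faster local iteration; the port value below is just an example:

```bash
# Skip the build step before running tests
elizaos test --skip-build

# Skip TypeScript type checking
elizaos test --skip-type-check

# Run e2e tests against a specific server port
elizaos test --type e2e --port 4000
```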
## Test Types
### Component Tests

- **Location:** `__tests__/` directory
- **Framework:** Vitest
- **Purpose:** Unit and integration testing of individual components
### End-to-End Tests

- **Location:** `e2e/` directory
- **Framework:** Custom ElizaOS test runner
- **Purpose:** Runtime behavior testing with full agent context
## Test Structure
ElizaOS follows standard testing conventions with two main categories:
### Component Tests (`__tests__/`)
Component tests focus on testing individual modules, functions, and components in isolation.
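A minimal component test looks like any other Vitest test; the module under test here (`formatMessage`) is purely illustrative:

```typescript
// __tests__/formatMessage.test.ts
import { describe, it, expect } from 'vitest';

// Hypothetical module under test - replace with your own exports.
import { formatMessage } from '../src/formatMessage';

describe('formatMessage', () => {
  it('prefixes the sender name', () => {
    expect(formatMessage('Eliza', 'hello')).toBe('Eliza: hello');
  });

  it('handles empty messages', () => {
    expect(formatMessage('Eliza', '')).toBe('Eliza: ');
  });
});
```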
### End-to-End Tests (`e2e/`)
E2E tests verify the complete flow of your agent with all integrations.
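E2E suites run against a live agent runtime rather than isolated modules. The exact suite shape depends on your `@elizaos/core` version; the following is a rough sketch assuming a suite object with named test functions that receive the runtime:

```typescript
// e2e/agent-smoke.test.ts
// Sketch only: adjust to the test suite interface exported by your @elizaos/core version.
import type { IAgentRuntime } from '@elizaos/core';

export const agentSmokeSuite = {
  name: 'agent-smoke',
  tests: [
    {
      name: 'agent starts with a character loaded',
      fn: async (runtime: IAgentRuntime) => {
        if (!runtime.character?.name) {
          throw new Error('Expected a character to be loaded on the runtime');
        }
      },
    },
  ],
};

export default agentSmokeSuite;
```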
## Test Configuration
### Vitest Configuration
Component tests use Vitest, which is configured in your project's `vitest.config.ts`:
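A minimal configuration might look like this (adjust the include globs and environment to your project; coverage requires a provider such as `@vitest/coverage-v8`):

```typescript
// vitest.config.ts
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    // Pick up component tests from the conventional __tests__/ directory
    include: ['__tests__/**/*.test.ts'],
    environment: 'node',
    coverage: {
      reporter: ['text', 'html', 'lcov'],
    },
  },
});
```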
### E2E Test Configuration
E2E tests can be configured via environment variables:
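The specific variables are project-dependent; a common pattern is to pass them inline or through a test-specific `.env` file (the variable names below are illustrative only):

```bash
# Variable names are examples - use whatever your project actually reads
TEST_DATABASE_URL=postgres://localhost:5432/eliza_test \
SERVER_PORT=4000 \
elizaos test --type e2e
```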
## Coverage Reports
Generate and view test coverage:
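Coverage for component tests can be generated by invoking Vitest directly (assuming a coverage provider such as `@vitest/coverage-v8` is installed):

```bash
# Generate a coverage report for component tests
bunx vitest run --coverage

# View the HTML report; the path depends on your reporter configuration
# (macOS shown - use xdg-open on Linux)
open coverage/index.html
```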
## Continuous Integration
Example GitHub Actions workflow:
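A minimal sketch, assuming a Bun-based project with the ElizaOS CLI available as a dev dependency:

```yaml
# .github/workflows/test.yml
name: Tests

on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest

      - name: Install dependencies
        run: bun install

      - name: Run tests
        run: bunx elizaos test
```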
## Testing Best Practices
### 1. Test Organization
- Keep tests close to the code they test
- Use descriptive test names
- Group related tests with `describe` blocks
- Follow the AAA pattern (Arrange, Act, Assert)
### 2. Test Isolation
- Each test should be independent
- Clean up resources after tests
- Use test fixtures for consistent data
- Mock external dependencies
### 3. Performance
- Use `--skip-build` during development for faster feedback
- Run focused tests with the `--name` filter
- Use `--skip-type-check` for faster test runs when type safety is already verified
- Parallelize tests when possible
### 4. Coverage Goals
- Aim for 80%+ code coverage
- Focus on critical paths
- Don’t sacrifice test quality for coverage
- Test edge cases and error scenarios
## Common Testing Patterns
### Testing Plugins
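Plugin tests usually start by asserting the plugin's overall shape before exercising its pieces individually. The plugin import below is hypothetical:

```typescript
// __tests__/plugin.test.ts
import { describe, it, expect } from 'vitest';

// Hypothetical plugin entry point - replace with your plugin's export.
import { myPlugin } from '../src/index';

describe('myPlugin', () => {
  it('exposes a name and description', () => {
    expect(myPlugin.name).toBeTruthy();
    expect(myPlugin.description).toBeTruthy();
  });

  it('registers at least one action', () => {
    expect(myPlugin.actions?.length).toBeGreaterThan(0);
  });
});
```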
### Testing Actions
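Actions can be exercised in component tests by calling `validate` and `handler` with lightweight stand-ins for the runtime and message. The action import and the exact interface shape are assumptions; check your `@elizaos/core` version:

```typescript
// __tests__/actions/greet.test.ts
import { describe, it, expect, vi } from 'vitest';

// Hypothetical action - replace with one your plugin exports.
import { greetAction } from '../../src/actions/greet';

describe('greetAction', () => {
  // Minimal stand-ins for the runtime and message the action receives.
  const runtime = {} as any;
  const message = { content: { text: 'hello there' } } as any;

  it('validates messages it can handle', async () => {
    await expect(greetAction.validate(runtime, message)).resolves.toBe(true);
  });

  it('invokes the callback with a response', async () => {
    const callback = vi.fn();
    await greetAction.handler(runtime, message, undefined, {}, callback);
    expect(callback).toHaveBeenCalled();
  });
});
```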
### Testing with Mock Data
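Small factory helpers keep fixtures consistent across tests. The field names below are assumptions about a typical message shape, not a definitive schema:

```typescript
// __tests__/helpers/mock-data.ts
import { randomUUID } from 'node:crypto';

// Builds a message-like object for tests; adjust fields to match your types.
export function createMockMessage(text: string) {
  return {
    id: randomUUID(),
    entityId: randomUUID(),
    roomId: randomUUID(),
    content: { text, source: 'test' },
    createdAt: Date.now(),
  };
}
```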
## Debugging Tests
### Verbose Output
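Verbosity is typically controlled through the logger rather than a dedicated flag; assuming your project reads a `LOG_LEVEL` environment variable:

```bash
LOG_LEVEL=debug elizaos test
```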
### Running Specific Tests
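Combining a path argument with the documented filters narrows a run to a single area (paths and names are placeholders):

```bash
# Run a single suite by name
elizaos test --name "database adapter"

# Run only the component tests of one plugin
elizaos test packages/my-plugin --type component --name storage
```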
### Debugging in VS Code
Add the following to `.vscode/launch.json`:
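One common pattern for stepping through Vitest component tests is to launch Vitest directly under the Node debugger:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Debug Vitest Tests",
      "program": "${workspaceFolder}/node_modules/vitest/vitest.mjs",
      "args": ["run", "${relativeFile}"],
      "autoAttachChildProcesses": true,
      "skipFiles": ["<node_internals>/**", "**/node_modules/**"],
      "console": "integratedTerminal",
      "smartStep": true
    }
  ]
}
```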