Understanding Plugin Ordering in ElizaOS
Plugin ordering in ElizaOS isn't just a technical detail: it's a sophisticated system that ensures your AI agents work reliably with the right providers for different tasks. Whether you're using Anthropic for chat, OpenAI for embeddings, or running everything locally with Ollama, understanding how plugins are ordered can save you from headaches and help you build more robust agents.
Why Plugin Order Matters
Here's the thing: different AI providers support different capabilities. Anthropic's Claude is excellent for conversation but doesn't provide embeddings. OpenRouter gives you access to dozens of models but no embedding endpoint. OpenAI handles both chat and embeddings beautifully.
ElizaOS needs to know which provider should handle which type of request, and plugin ordering is how we solve this puzzle.
The Core Problem
When you call runtime.useModel(ModelType.TEXT_LARGE, params) in your agent, ElizaOS needs to decide which provider should handle that request. If you have both Anthropic and OpenAI configured, which one should it choose? And what about embeddings? If you're using Anthropic for chat, where do embeddings come from?
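To make this concrete, here's a small sketch of the two kinds of requests. The parameter shapes and the ModelType.TEXT_EMBEDDING call are illustrative assumptions modeled on the useModel example later in this post; the point is that the two calls may be served by different plugins:
// Text generation: answered by your preferred chat provider (e.g. Anthropic)
const reply = await runtime.useModel(ModelType.TEXT_LARGE, {
  prompt: 'Summarize our last conversation.',
});

// Embeddings: a text-only provider can't answer this, so an embedding-capable plugin must
const vector = await runtime.useModel(ModelType.TEXT_EMBEDDING, {
  text: 'Summarize our last conversation.',
});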
The ElizaOS Solution
Plugin array order determines priority. First plugins get first pick for handling model requests. But here's the clever part: ElizaOS automatically orders plugins to ensure:
- Your preferred text provider handles conversations
- Embedding-capable providers serve as fallbacks for embeddings
- Local providers kick in when cloud services fail
How It Works Under the Hood
The useModel() System
Every AI interaction in ElizaOS goes through the useModel() system:
// Plugin registers a model handler during initialization
runtime.registerModel(
  ModelType.TEXT_LARGE,
  myModelHandler,
  'my-plugin-name',
  10 // priority: higher = more preferred
);
// Agent uses the model
const response = await runtime.useModel(ModelType.TEXT_LARGE, {
  prompt: "What's the weather like?",
  temperature: 0.7
});
ElizaOS sorts available handlers by priority, then by registration order. The highest-priority available handler wins.
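As a rough mental model (a simplified sketch, not the actual runtime code), you can picture the selection like this:
interface RegisteredHandler {
  provider: string;          // plugin that registered the handler
  priority: number;          // higher wins
  registrationOrder: number; // earlier registration wins ties
  handler: (params: unknown) => Promise<unknown>;
}

// Simplified selection: sort by priority (descending), then registration order (ascending)
function selectHandler(handlers: RegisteredHandler[]): RegisteredHandler | undefined {
  return [...handlers].sort(
    (a, b) => b.priority - a.priority || a.registrationOrder - b.registrationOrder
  )[0];
}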
Automatic Plugin Ordering
When you create a new project with elizaos create, your src/character.ts file automatically includes optimized plugin ordering:
export const character: Character = {
  name: 'MyAgent',
  plugins: [
    // Core plugins first
    '@elizaos/plugin-sql',

    // Text-only plugins (no embedding support)
    ...(process.env.ANTHROPIC_API_KEY ? ['@elizaos/plugin-anthropic'] : []),
    ...(process.env.OPENROUTER_API_KEY ? ['@elizaos/plugin-openrouter'] : []),

    // Embedding-capable plugins last (fallback for embeddings)
    ...(process.env.OPENAI_API_KEY ? ['@elizaos/plugin-openai'] : []),
    ...(process.env.OLLAMA_API_ENDPOINT ? ['@elizaos/plugin-ollama'] : []),
    ...(process.env.GOOGLE_GENERATIVE_AI_API_KEY ? ['@elizaos/plugin-google-genai'] : []),

    // Local-AI fallback (only when no embedding providers exist)
    ...(!process.env.GOOGLE_GENERATIVE_AI_API_KEY &&
    !process.env.OLLAMA_API_ENDPOINT &&
    !process.env.OPENAI_API_KEY
      ? ['@elizaos/plugin-local-ai']
      : []),

    // Platform and bootstrap plugins
    ...(process.env.DISCORD_API_TOKEN ? ['@elizaos/plugin-discord'] : []),
    ...(!process.env.IGNORE_BOOTSTRAP ? ['@elizaos/plugin-bootstrap'] : []),
  ],
  // ... rest of character config
};
Plugin Categories & Loading Order
1. Core Infrastructure
- @elizaos/plugin-sql - Always loads first, provides database functionality
2. Text-Only AI Providers
- @elizaos/plugin-anthropic - Claude models (conversation only)
- @elizaos/plugin-openrouter - Multiple AI models via OpenRouter
3. Platform Integrations
- @elizaos/plugin-discord - Discord bot capabilities
- @elizaos/plugin-twitter - Twitter integration
- @elizaos/plugin-telegram - Telegram bot
4. Bootstrap Plugin
- @elizaos/plugin-bootstrap - Default actions, providers, and evaluators
5. Embedding-Capable AI Providers (Always Last)
- @elizaos/plugin-openai - GPT models + embeddings
- @elizaos/plugin-ollama - Local models + embeddings
- @elizaos/plugin-google-genai - Gemini models + embeddings
- @elizaos/plugin-local-ai - Final fallback when no other AI providers exist
Real-World Examples
Scenario 1: Cost-Optimized Setup
Goal: Use Claude for chat (better quality), OpenAI for embeddings (only option)
Environment:
ANTHROPIC_API_KEY=your_claude_key
OPENAI_API_KEY=your_openai_key
Result: Anthropic handles text generation, OpenAI handles embeddings. No Local-AI needed.
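With only those two keys set, the template's conditionals resolve to roughly this plugin list (illustrative; platform plugins are omitted because their variables aren't set):
plugins: [
  '@elizaos/plugin-sql',
  '@elizaos/plugin-anthropic', // handles text generation
  '@elizaos/plugin-openai',    // handles embeddings (and serves as a text fallback)
  '@elizaos/plugin-bootstrap',
],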
Scenario 2: Privacy-First Local Setup
Goal: Everything runs locally, no cloud API calls
Environment:
OLLAMA_API_ENDPOINT=http://localhost:11434
# No other API keys
Result: Ollama handles both text generation and embeddings locally.
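Because Ollama can embed, the Local-AI fallback isn't added; the resolved list is roughly:
plugins: [
  '@elizaos/plugin-sql',
  '@elizaos/plugin-ollama',    // local text generation + embeddings
  '@elizaos/plugin-bootstrap',
],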
Scenario 3: High-Availability Production
Goal: Multiple fallbacks for reliability
Environment:
OPENAI_API_KEY=primary_key
ANTHROPIC_API_KEY=backup_key
OLLAMA_API_ENDPOINT=http://localhost:11434
Result: OpenAI primary for everything, Anthropic backup for text, Ollama local fallback.
The Local-AI Safety Net
Here's a neat feature: Local-AI automatically loads as a fallback when you don't have any embedding-capable cloud providers. This ensures your agent always has access to embeddings (needed for memory and context) even with a text-only provider like Anthropic.
The logic is simple:
// Only load Local-AI if no embedding providers are configured
...(!process.env.GOOGLE_GENERATIVE_AI_API_KEY &&
!process.env.OLLAMA_API_ENDPOINT &&
!process.env.OPENAI_API_KEY
  ? ['@elizaos/plugin-local-ai']
  : [])
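If the triple negation is hard to scan, the same condition can be expressed with a small helper first (a purely illustrative refactor; the starter template uses the inline form above):
// Illustrative helper: true when at least one embedding-capable provider is configured
const hasEmbeddingProvider = Boolean(
  process.env.GOOGLE_GENERATIVE_AI_API_KEY ||
    process.env.OLLAMA_API_ENDPOINT ||
    process.env.OPENAI_API_KEY
);

// Local-AI is only appended when nothing else can produce embeddings
...(hasEmbeddingProvider ? [] : ['@elizaos/plugin-local-ai']),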
Environment Variables That Matter
AI Provider Keys
- ANTHROPIC_API_KEY - Enables Claude (text-only)
- OPENROUTER_API_KEY - Enables OpenRouter (text-only)
- OPENAI_API_KEY - Enables OpenAI (text + embeddings)
- OLLAMA_API_ENDPOINT - Enables Ollama (text + embeddings)
- GOOGLE_GENERATIVE_AI_API_KEY - Enables Google GenAI (text + embeddings)
Platform Integration
- DISCORD_API_TOKEN - Discord bot
- TELEGRAM_BOT_TOKEN - Telegram bot
- TWITTER_API_KEY + TWITTER_API_SECRET_KEY + TWITTER_ACCESS_TOKEN + TWITTER_ACCESS_TOKEN_SECRET - Twitter (requires all 4)
Control Flags
- IGNORE_BOOTSTRAP=true - Disables the bootstrap plugin
A quick way to see how these variables combine is sketched below.
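Here is the kind of standalone check hinted at above: a hypothetical helper script (check-plugins.ts is not part of the starter template) that prints which optional plugins your current environment would enable:
// check-plugins.ts - hypothetical helper; run with: bun run check-plugins.ts
const enabledBy: Record<string, string> = {
  ANTHROPIC_API_KEY: '@elizaos/plugin-anthropic',
  OPENROUTER_API_KEY: '@elizaos/plugin-openrouter',
  OPENAI_API_KEY: '@elizaos/plugin-openai',
  OLLAMA_API_ENDPOINT: '@elizaos/plugin-ollama',
  GOOGLE_GENERATIVE_AI_API_KEY: '@elizaos/plugin-google-genai',
  DISCORD_API_TOKEN: '@elizaos/plugin-discord',
};

for (const [envVar, plugin] of Object.entries(enabledBy)) {
  console.log(`${plugin}: ${process.env[envVar] ? 'enabled' : 'disabled'} (${envVar})`);
}
console.log(
  `@elizaos/plugin-bootstrap: ${process.env.IGNORE_BOOTSTRAP ? 'disabled' : 'enabled'} (IGNORE_BOOTSTRAP)`
);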
Testing Your Configuration
Want to see how your plugins are ordered? Create a test project:
# Create new project
elizaos create test-agent
cd test-agent
# Set your environment variables
cp .env.example .env
# Edit .env with your API keys
# Check plugin order (it's logged during startup)
elizaos start --log-level debug
You can also run the plugin ordering tests:
cd packages/project-starter
bun test src/__tests__/character-plugin-ordering.test.ts
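If you want a similar check in your own project, a test along these lines (a sketch in the spirit of that test file, not a copy of it) asserts that a text-only provider appears before an embedding-capable one:
import { describe, expect, it } from 'bun:test';
import { character } from '../character';

describe('plugin ordering', () => {
  it('loads text-only providers before embedding-capable providers', () => {
    const plugins = character.plugins as string[];
    const anthropic = plugins.indexOf('@elizaos/plugin-anthropic');
    const openai = plugins.indexOf('@elizaos/plugin-openai');

    // Only meaningful when both providers are enabled in the current environment
    if (anthropic !== -1 && openai !== -1) {
      expect(anthropic).toBeLessThan(openai);
    }
  });
});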
Troubleshooting Common Issues
"Wrong provider handling my requests"
Check: Plugin order in your character file. Text-only providers should load before embedding providers.
"Embedding operations failing"β
Verify: At least one embedding-capable provider (OpenAI, Ollama, Google GenAI, or Local-AI) is configured and loading.
"Local-AI loading unexpectedly"β
Solution: Check that your embedding-capable provider environment variables are set correctly.
"Plugin not loading"β
Debug: Verify environment variable names match expected patterns exactly.
Customizing Plugin Order
While the default ordering works great for most cases, you can customize it for specific needs:
export const character: Character = {
  name: 'CustomAgent',
  plugins: [
    '@elizaos/plugin-sql',

    // Add your custom plugin early for high priority
    '@my-org/my-custom-plugin',

    // Standard ordering for AI providers
    ...(process.env.ANTHROPIC_API_KEY ? ['@elizaos/plugin-anthropic'] : []),
    ...(process.env.OPENAI_API_KEY ? ['@elizaos/plugin-openai'] : []),

    // Rest of plugins...
  ],
  // ... rest of config
};
You can also set custom priorities in your plugin:
const myPlugin: Plugin = {
  name: 'my-custom-plugin',
  models: {
    [ModelType.TEXT_LARGE]: myCustomHandler,
  },
  priority: 15, // Higher than default plugins
};
The Bigger Picture
Plugin ordering in ElizaOS reflects a key architectural principle: intelligent defaults with full customization. The system works automatically based on your environment variables, but gives you complete control when you need it.
This approach scales from simple single-provider setups to complex multi-provider, multi-platform deployments. Whether you're building a personal assistant or a production-grade autonomous agent, the plugin ordering system ensures reliable, predictable behavior.
Getting Started
Ready to build your own agent with optimal plugin ordering?
# Create a new project (includes optimized plugin ordering)
elizaos create my-agent
# Navigate to your project
cd my-agent
# Configure your environment
cp .env.example .env
# Edit .env with your API keys
# Start your agent
elizaos start
The template handles the complexity for you, so you can focus on building your agent's unique capabilities. But now you understand what's happening under the hood, and you can customize it when needed.
Plugin ordering might seem like a small detail, but it's this kind of thoughtful design that makes ElizaOS agents reliable and predictable in production. Happy building!