ElizaOS uses a plugin-based architecture for integrating different Language Model providers. This guide explains how to configure and use LLM plugins, including fallback mechanisms for embeddings and model registration.

Key Concepts

Model Types

ElizaOS supports three types of model operations, illustrated in the usage sketch after this list:

  1. TEXT_GENERATION - Generating conversational responses
  2. EMBEDDING - Creating vector embeddings for memory and similarity search
  3. OBJECT_GENERATION - Structured output generation (JSON/XML)
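
As a rough sketch of how these operations are used, the runtime exposes a single useModel call. The constant names below follow this guide; depending on your @elizaos/core version the concrete ModelType members and parameter shapes may be more granular, so treat this as illustrative rather than canonical:

// Illustrative only – constant names follow this guide and may differ
// from the exact ModelType members in your @elizaos/core version
import { ModelType, type IAgentRuntime } from '@elizaos/core';

async function demoModelTypes(runtime: IAgentRuntime) {
  // 1. Conversational text
  const reply = await runtime.useModel(ModelType.TEXT_GENERATION, {
    prompt: 'Summarize our last conversation in one sentence.',
  });

  // 2. Vector embedding for memory and similarity search
  const vector = await runtime.useModel(ModelType.EMBEDDING, {
    text: 'ElizaOS plugin architecture',
  });

  // 3. Structured output (JSON)
  const event = await runtime.useModel(ModelType.OBJECT_GENERATION, {
    prompt: 'Extract the city and date from: "Meet me in Lisbon on May 3."',
  });

  return { reply, vector, event };
}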

Plugin Capabilities

Not all LLM plugins support all model types:

Plugin          Text Generation   Embeddings   Object Generation
OpenAI          Yes               Yes          Yes
Anthropic       Yes               No           Yes
Google GenAI    Yes               Yes          Yes
Ollama          Yes               Yes          Yes
OpenRouter      Yes               No           Yes
Local AI        Yes               Yes          Yes

Plugin Loading Order

The order in which plugins are loaded matters significantly. From the default character configuration:

plugins: [
  // Core plugins first
  '@elizaos/plugin-sql',

  // Text-only plugins (no embedding support)
  ...(process.env.ANTHROPIC_API_KEY ? ['@elizaos/plugin-anthropic'] : []),
  ...(process.env.OPENROUTER_API_KEY ? ['@elizaos/plugin-openrouter'] : []),

  // Embedding-capable plugins last (lowest priority for embedding fallback)
  ...(process.env.OPENAI_API_KEY ? ['@elizaos/plugin-openai'] : []),
  ...(process.env.OLLAMA_API_ENDPOINT ? ['@elizaos/plugin-ollama'] : []),
  ...(process.env.GOOGLE_GENERATIVE_AI_API_KEY ? ['@elizaos/plugin-google-genai'] : []),
  
  // Fallback when no other LLM is configured
  ...(!process.env.GOOGLE_GENERATIVE_AI_API_KEY &&
      !process.env.OLLAMA_API_ENDPOINT &&
      !process.env.OPENAI_API_KEY
    ? ['@elizaos/plugin-local-ai']
    : []),
]

Understanding the Order

  1. Text-only plugins first - Anthropic and OpenRouter are loaded early, so they take priority for text generation but contribute no embedding handlers
  2. Embedding-capable plugins last - OpenAI, Ollama, and Google GenAI act as fallbacks for embedding operations
  3. Local AI as the ultimate fallback - Loaded only when no cloud provider is configured (a resolved example follows)
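
For example, with only ANTHROPIC_API_KEY and OPENAI_API_KEY set, the plugins array above resolves to:

plugins: [
  '@elizaos/plugin-sql',
  '@elizaos/plugin-anthropic',  // text generation
  '@elizaos/plugin-openai'      // embeddings fallback
]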

Model Registration

Each LLM plugin registers its models with the runtime during initialization:

// Example from a plugin's init method
runtime.registerModel({
  type: ModelType.TEXT_GENERATION,
  handler: generateText,
  provider: 'openai',
  priority: 1
});

Priority System

Models are selected based on the following rules, sketched in code after this list:

  1. Explicit provider - If specified, uses that provider’s model
  2. Priority - Higher priority models are preferred
  3. Registration order - First registered wins for same priority
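
A simplified sketch of this selection logic, for illustration only; the actual runtime implementation differs in detail:

// Illustration of the precedence rules above – not the runtime's actual code
interface RegisteredModel {
  type: string;                  // e.g. 'TEXT_GENERATION', 'EMBEDDING'
  provider: string;              // e.g. 'openai', 'anthropic'
  priority: number;
  handler: (params: unknown) => Promise<unknown>;
}

function selectModel(
  registered: RegisteredModel[], // kept in registration order
  type: string,
  provider?: string
): RegisteredModel | undefined {
  const candidates = registered.filter(
    (m) => m.type === type && (!provider || m.provider === provider)
  );
  // Higher priority first; sort is stable, so ties keep registration order
  return candidates.sort((a, b) => b.priority - a.priority)[0];
}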

Embedding Fallback Strategy

Since not all plugins support embeddings, ElizaOS uses a fallback strategy:

// If primary plugin (e.g., Anthropic) doesn't support embeddings,
// the runtime will automatically use the next available embedding provider
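
Concretely, an embedding request resolves to the next provider that registered an EMBEDDING handler, with no change to your calling code. A hedged example (parameter shape may vary by version), assuming Anthropic is loaded first and OpenAI second:

// plugins: ['@elizaos/plugin-anthropic', '@elizaos/plugin-openai']
// Anthropic registers no EMBEDDING handler, so this call is served by OpenAI
const embedding = await runtime.useModel(ModelType.EMBEDDING, {
  text: 'What is the plugin loading order?',
});
// embedding is a numeric vector ready for memory storage and similarity search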

Common Patterns

Anthropic + OpenAI Fallback

{
  "plugins": [
    "@elizaos/plugin-anthropic",  // Primary for text
    "@elizaos/plugin-openai"       // Fallback for embeddings
  ]
}

OpenRouter + Local Embeddings

{
  "plugins": [
    "@elizaos/plugin-openrouter",  // Cloud text generation
    "@elizaos/plugin-ollama"        // Local embeddings
  ]
}

Configuration

Environment Variables

Each plugin requires specific environment variables:

# .env file

# OpenAI
OPENAI_API_KEY=sk-...
OPENAI_SMALL_MODEL=gpt-4o-mini          # Optional: any available model
OPENAI_LARGE_MODEL=gpt-4o               # Optional: any available model

# Anthropic  
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_SMALL_MODEL=claude-3-haiku-20240307    # Optional: any Claude model
ANTHROPIC_LARGE_MODEL=claude-3-5-sonnet-latest   # Optional: any Claude model

# Google GenAI
GOOGLE_GENERATIVE_AI_API_KEY=...
GOOGLE_SMALL_MODEL=gemini-2.0-flash-001  # Optional: any Gemini model
GOOGLE_LARGE_MODEL=gemini-2.5-pro-preview-03-25  # Optional: any Gemini model

# Ollama
OLLAMA_API_ENDPOINT=http://localhost:11434/api
OLLAMA_SMALL_MODEL=llama3.2              # Optional: any local model
OLLAMA_LARGE_MODEL=llama3.1:70b          # Optional: any local model
OLLAMA_EMBEDDING_MODEL=nomic-embed-text  # Optional: any embedding model

# OpenRouter
OPENROUTER_API_KEY=sk-or-...
OPENROUTER_SMALL_MODEL=google/gemini-2.0-flash-001  # Optional: any available model
OPENROUTER_LARGE_MODEL=anthropic/claude-3-opus      # Optional: any available model

# Local AI (no API key needed)

Important: The model names shown are examples. You can use any model available from each provider.

Character-Specific Secrets

You can also configure API keys per character:

{
  "name": "MyAgent",
  "settings": {
    "secrets": {
      "OPENAI_API_KEY": "sk-...",
      "ANTHROPIC_API_KEY": "sk-ant-..."
    }
  }
}
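
Plugins generally read these values through the runtime rather than from process.env directly, which is what makes character-level secrets take effect. A minimal sketch, assuming the standard getSetting helper:

// Inside a plugin's init – runtime.getSetting typically resolves character
// secrets before falling back to environment variables (details vary by version)
const apiKey = runtime.getSetting('OPENAI_API_KEY');
if (!apiKey) {
  throw new Error('OPENAI_API_KEY is not configured for this character');
}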

Available Plugins

Cloud Providers

  • @elizaos/plugin-openai
  • @elizaos/plugin-anthropic
  • @elizaos/plugin-google-genai
  • @elizaos/plugin-openrouter

Local/Self-Hosted

  • @elizaos/plugin-ollama
  • @elizaos/plugin-local-ai

Best Practices

1. Always Configure Embeddings

Even if your primary model doesn’t support embeddings, always include a fallback:

{
  "plugins": [
    "@elizaos/plugin-anthropic",
    "@elizaos/plugin-openai"  // For embeddings
  ]
}

2. Order Matters

Place your preferred providers first, but ensure embedding capability somewhere in the chain.

3. Test Your Configuration

Verify all model types work:

// The runtime will log which provider is used for each operation
[AgentRuntime][MyAgent] Using model TEXT_GENERATION from provider anthropic
[AgentRuntime][MyAgent] Using model EMBEDDING from provider openai
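
If you prefer to check programmatically instead of reading logs, a short smoke test along these lines can confirm both operations (adjust the constant names to your @elizaos/core version):

// Hypothetical smoke test – adapt the constants to your core version
const text = await runtime.useModel(ModelType.TEXT_GENERATION, {
  prompt: 'Reply with the single word "ok".',
});
console.log('TEXT_GENERATION ->', text);

const vector = await runtime.useModel(ModelType.EMBEDDING, {
  text: 'embedding smoke test',
});
console.log('EMBEDDING dimension ->', vector.length);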

4. Monitor Costs

Different providers have different pricing. Consider:

  • Using local models (Ollama) for development
  • Mixing providers (e.g., OpenRouter for text, local for embeddings)
  • Setting up usage alerts with your providers

Troubleshooting

“No model found for type EMBEDDING”

Your configured plugins don’t support embeddings. Add an embedding-capable plugin:

{
  "plugins": [
    "@elizaos/plugin-anthropic",
    "@elizaos/plugin-openai"  // Add this
  ]
}

“Missing API Key”

Ensure your environment variables are set:

# Check current environment
echo $OPENAI_API_KEY

# Or use the CLI
elizaos env edit-local

Models Not Loading

Check plugin initialization in logs:

Success: Plugin @elizaos/plugin-openai initialized successfully

Migration from v0.x

In ElizaOS v0.x, models were configured directly in character files:

// ❌ OLD (v0.x) - No longer works
{
  "modelProvider": "openai",
  "model": "gpt-4"
}

// ✅ NEW (v1.x) - Use plugins
{
  "plugins": ["@elizaos/plugin-openai"]
}

The modelProvider field is now ignored. All model configuration happens through plugins.
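
Putting the pieces together, a minimal v1.x character file that covers both text generation and embeddings (keys and model choices are placeholders) looks like this:

{
  "name": "MyAgent",
  "plugins": [
    "@elizaos/plugin-sql",
    "@elizaos/plugin-anthropic",
    "@elizaos/plugin-openai"
  ],
  "settings": {
    "secrets": {
      "ANTHROPIC_API_KEY": "sk-ant-...",
      "OPENAI_API_KEY": "sk-..."
    }
  }
}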