Multi-Provider Orchestration
One workflow definition. Any LLM provider. Zero lock-in.
The LLM Market in 2025
Enterprise LLM adoption has shifted dramatically. Based on Menlo Ventures research:
| Share | Provider | Example models |
| --- | --- | --- |
| 40% | Anthropic | claude-opus-4, claude-sonnet-4-5 |
| 27% | OpenAI | gpt-4o, gpt-4-turbo |
| 21% | Google | gemini-2.0-flash, gemini-1.5-pro |
| Growing | Local (Ollama) | llama3.2, mistral |
Key insight: No single provider dominates. Enterprise teams use 2-3 providers on average. Your workflow tool should support this reality.
How It Works in Nika
```yaml
# Define providers once at the top
providers:
  anthropic:
    type: anthropic
    apiKeyEnv: ANTHROPIC_API_KEY
  openai:
    type: openai
    apiKeyEnv: OPENAI_API_KEY
  local:
    type: ollama
    baseUrl: http://localhost:11434

# Use different models per task
tasks:
  - id: analyze
    agent:
      prompt: "Deep code analysis"
      model: claude-opus-4      # Best for reasoning
      provider: anthropic
  - id: generate-docs
    agent:
      prompt: "Write documentation"
      model: gpt-4o             # Good for writing
      provider: openai
  - id: quick-check
    agent:
      prompt: "Quick syntax check"
      model: llama3.2           # Free, fast, local
      provider: local
```

Each task can use a different provider and model. The runner handles authentication, API differences, and response normalization transparently.
Why This Matters
- **No Vendor Lock-in.** Switch providers without rewriting workflows: same YAML, different model (see the sketch after this list).
- **Cost Optimization.** Use cheaper models for simple tasks and powerful ones for complex reasoning.
- **Resilience.** If one provider is down, fail over to another automatically.
- **Best Tool for the Job.** Claude for coding, GPT for creative writing, Gemini for multimodal, local models for privacy.
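For example, moving the `generate-docs` task from OpenAI to Anthropic touches only two fields; the prompt and task structure stay the same. The model choice below reuses names from the table above and is illustrative:

```yaml
# Before: documentation generated with OpenAI
- id: generate-docs
  agent:
    prompt: "Write documentation"
    model: gpt-4o
    provider: openai

# After: same task pointed at Anthropic; only model and provider change
- id: generate-docs
  agent:
    prompt: "Write documentation"
    model: claude-sonnet-4-5
    provider: anthropic
```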
Future: Intelligent Routing (Research)
Based on the "Pick and Spin" research framework (arXiv, Dec 2024), we're exploring automatic model routing that optimizes for:
```yaml
# Potential future syntax (not implemented yet)
agent:
  prompt: "Complex analysis task"
  routing:
    strategy: auto          # Let Nika choose
    optimize:
      - relevance           # Best model for this task
      - latency             # Fastest response
      - cost                # Cheapest option
    fallback: [claude-sonnet-4-5, gpt-4o]

# Research shows this can achieve:
# - 21.6% higher success rates
# - 30% lower latency
# - 33% lower cost per query
```

Status: This is research territory. We're evaluating whether automatic routing adds enough value to justify the complexity. Your feedback matters.
Compatibility: Model Context Protocol (MCP)
Anthropic's MCP (announced Nov 2024) is becoming an industry standard for LLM tool integration. Nika supports MCP for tool invocation:
```yaml
# Invoke MCP-compatible tools
tasks:
  - id: search-docs
    invoke:
      reference: "mcp::docs-search::query"
      args:
        query: "authentication flow"

# Works with any MCP server:
# - Anthropic's reference servers
# - Third-party integrations
# - Your custom MCP servers
```
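MCP results are most useful when fed into a downstream agent task. The sketch below is hypothetical: the `${{ tasks.search-docs.output }}` output-reference syntax is assumed for illustration and is not confirmed by the examples above.

```yaml
# Hypothetical: pipe an MCP tool result into an agent task.
# The output-reference interpolation below is an assumption, not documented above.
tasks:
  - id: search-docs
    invoke:
      reference: "mcp::docs-search::query"
      args:
        query: "authentication flow"
  - id: summarize
    agent:
      prompt: "Summarize these search results: ${{ tasks.search-docs.output }}"
      model: claude-sonnet-4-5
      provider: anthropic
```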
Multi-provider support is live.
Use any model in your workflows.