| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| Parent folder | | | |
| README.md | 2026-05-04 | 2.3 kB | |
| Ruflo 3.6.27 -- Ollama provider (Tier-2) closes #1725 source code.tar.gz | 2026-05-04 | 95.6 MB | |
| Ruflo 3.6.27 -- Ollama provider (Tier-2) closes #1725 source code.zip | 2026-05-04 | 102.1 MB | |
| Totals: 3 Items | | 197.7 MB | 1 |
Single-issue release closing [#1725]. Validated end-to-end against ruvultra running ollama on a Tailscale tailnet.
## What's new
Ollama is now a first-class provider: Tier-2 in the three-tier model routing defined in ADR-026. End users on Anthropic Max plans (which don't expose an `ANTHROPIC_API_KEY`), as well as Ollama Cloud subscribers, can finally use `workflow_execute` and any other agent path that routes through `callAnthropicMessages`.
## Configure
```bash
# Ollama Cloud
ruflo providers configure -p ollama -k $OLLAMA_API_KEY
ruflo providers test -p ollama                 # round-trips the key

# Self-hosted Ollama (this release validated against ruvultra over Tailscale)
export OLLAMA_API_KEY=local                    # sentinel — skip auth header
export OLLAMA_BASE_URL=http://ruvultra:11434   # any Ollama endpoint
export RUFLO_PROVIDER=ollama                   # or unset ANTHROPIC_API_KEY
```
## Provider selection logic
`callAnthropicMessages` now picks Ollama when:
- `RUFLO_PROVIDER=ollama` (explicit), OR
- `ANTHROPIC_API_KEY` unset AND `OLLAMA_API_KEY` set (auto-fallback)
Otherwise it falls back to Anthropic (i.e. whenever `ANTHROPIC_API_KEY` is set and `RUFLO_PROVIDER` doesn't force Ollama). The response shape is normalized to `AnthropicCallResult` either way — callers see `{ success, model, output, usage, durationMs }` regardless of provider.
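The selection rule above can be sketched as a small shell predicate. This is an illustration only, not the actual `callAnthropicMessages` source; the env var names are the ones documented above.

```bash
# Sketch of the provider-selection rule (illustration, not ruflo source):
pick_provider() {
  if [ "${RUFLO_PROVIDER:-}" = "ollama" ]; then
    echo ollama       # explicit override wins
  elif [ -z "${ANTHROPIC_API_KEY:-}" ] && [ -n "${OLLAMA_API_KEY:-}" ]; then
    echo ollama       # auto-fallback: no Anthropic key, Ollama key present
  else
    echo anthropic    # default whenever ANTHROPIC_API_KEY is set
  fi
}

unset RUFLO_PROVIDER ANTHROPIC_API_KEY
export OLLAMA_API_KEY=local
pick_provider    # → ollama
```

Note the explicit `RUFLO_PROVIDER=ollama` path is checked first, so it works even when an Anthropic key is present.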
## Validation against ruvultra (Tailscale)
- Installed ollama 0.23.0 on ruvultra (Ubuntu 24.04)
- Daemon bound to `0.0.0.0:11434`
- Pulled `llama3.2:1b`
- From local Mac, called `callAnthropicMessages` over Tailscale → 611ms round-trip, 49 tokens
- Response normalized correctly
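Stripped of the ruflo plumbing, that smoke test boils down to one completion call against the daemon. A hedged reproduction sketch — the host, port, and model come from the checklist above, and `/api/generate` is Ollama's standard completion route:

```bash
# Compose the request the validation run exercised.
OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ruvultra:11434}
GEN_URL="$OLLAMA_BASE_URL/api/generate"
REQ='{"model":"llama3.2:1b","prompt":"ping","stream":false}'
echo "POST $GEN_URL"
# Run the actual call from a machine on the tailnet:
# curl -s "$GEN_URL" -d "$REQ"
```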
## Packages
| Package | Version |
|---|---|
| `@claude-flow/cli` | 3.6.27 |
| `claude-flow` (umbrella) | 3.6.27 |
| `ruflo` (umbrella) | 3.6.27 |
## Verify
```bash
ruflo verify
→ 55 fixes / 55 verified
```
## Known gap (upstream)
`agentic-flow config-wizard` still hardcodes 4 providers (anthropic / openrouter / gemini / onnx). That's an upstream wedge in the `agentic-flow` npm package — out of scope for this repo. Workaround: set `OLLAMA_API_KEY` directly in the environment (not via `agentic-flow config set`); ruflo's runtime picks it up regardless of what the wizard thinks.
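Concretely, the workaround is just the env route from the Configure section; the values below are the sentinels used in this release's docs:

```bash
# Bypass the wizard entirely: export the key where ruflo's runtime reads it.
export OLLAMA_API_KEY=local        # sentinel value — skips the auth header
export RUFLO_PROVIDER=ollama       # force explicit selection
# ruflo providers test -p ollama   # round-trip once configured
```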