Support OpenAI as a fleet member LLM backend
Apra Fleet is an open-source MCP server
Brought to you by:
apralabs
Originally created by: kumaakh
Add support for OpenAI (GPT-4, o-series) as an alternative LLM backend for fleet members, alongside Claude.
OpenAI models are widely available and some users may prefer them for specific tasks or already have API access. Supporting OpenAI broadens the fleet's reach and allows mixed-model teams. Users should be able to mix and match — e.g., some members running Claude, others running OpenAI — within the same fleet.
Required changes:

- `execute_prompt` should route to the appropriate backend CLI/API (Codex CLI or OpenAI API)
- `provision_auth` should support OpenAI API key provisioning (`OPENAI_API_KEY`)
- The `update_claude` tool needs generalization (`update_agent_cli` or similar)
- Error pattern matching (`prompt-errors.ts`) needs OpenAI-specific patterns
- Member registration needs an `llmProvider` field
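To make the routing idea concrete, here is a minimal sketch of how `execute_prompt` could dispatch on a member's `llmProvider`. The type and function names (`FleetMember`, `resolveBackendCommand`) and the record shape are illustrative assumptions, not the actual Apra Fleet API; only the provider values and the `OPENAI_API_KEY` variable come from the ticket.

```typescript
// Hypothetical sketch: choosing a backend CLI and auth env var per member.
// 'claude' and 'codex' are the two providers discussed in this ticket.
type LlmProvider = 'claude' | 'codex';

interface FleetMember {
  name: string;           // assumed field, for illustration only
  llmProvider: LlmProvider;
}

// Maps a member's provider to the CLI to invoke and the API key
// that provision_auth would need to supply.
function resolveBackendCommand(member: FleetMember): { cli: string; authEnvVar: string } {
  switch (member.llmProvider) {
    case 'claude':
      return { cli: 'claude', authEnvVar: 'ANTHROPIC_API_KEY' };
    case 'codex':
      return { cli: 'codex', authEnvVar: 'OPENAI_API_KEY' };
  }
}

console.log(resolveBackendCommand({ name: 'worker-1', llmProvider: 'codex' }));
// → { cli: 'codex', authEnvVar: 'OPENAI_API_KEY' }
```

Keeping the dispatch in one place like this also makes the generalized `update_agent_cli` tool straightforward: it can reuse the same provider-to-CLI mapping.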
Originally posted by: kumaakh
Covered by Codex CLI support (already shipped). Codex CLI is OpenAI's official agentic CLI tool; the fleet registers these members with `llmProvider: 'codex'`. OpenAI backend support is available today via the codex provider.
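For illustration, a mixed-model fleet under this resolution might register members like the records below. Only the `llmProvider` values (`'claude'`, `'codex'`) are taken from the ticket; the other field names and the overall shape are hypothetical.

```typescript
// Hypothetical member records for a mixed-model fleet.
interface MemberConfig {
  name: string;                       // assumed field
  llmProvider: 'claude' | 'codex';    // provider values from this ticket
}

const fleet: MemberConfig[] = [
  { name: 'reviewer', llmProvider: 'claude' }, // Claude-backed member
  { name: 'builder', llmProvider: 'codex' },   // OpenAI-backed via Codex CLI
];

// Members running on the OpenAI backend via the codex provider:
const codexMembers = fleet
  .filter((m) => m.llmProvider === 'codex')
  .map((m) => m.name);
console.log(codexMembers); // → [ 'builder' ]
```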
Ticket changed by: kumaakh