Support Gemini as a fleet member LLM backend
Apra Fleet is an open-source MCP server
Brought to you by:
apralabs
Originally created by: kumaakh
Add support for Google Gemini as an alternative LLM backend for fleet members, alongside Claude.
Not all users have access to Claude, and some may prefer Gemini for certain workloads (cost, availability, capability mix). Supporting multiple LLM backends makes the fleet more accessible and flexible. Users should be able to mix and match — e.g., some members running Claude, others running Gemini — within the same fleet.
The work spans several areas:
- execute_prompt should route to the appropriate backend CLI/API
- provision_auth should support Gemini API key provisioning (GEMINI_API_KEY)
- the update_claude tool needs generalization (update_agent_cli or similar)
- error detection (prompt-errors.ts) needs Gemini-specific patterns
- member configuration needs an llmProvider field
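A minimal sketch of what per-member backend selection could look like. The type and function names (FleetMemberConfig, BackendInfo, resolveBackend) and the CLI binary names are illustrative assumptions, not the actual Apra Fleet API; GEMINI_API_KEY comes from this ticket, and ANTHROPIC_API_KEY is the standard Anthropic env var:

```typescript
// Hypothetical sketch: each fleet member declares an llmProvider, and
// execute_prompt / provision_auth resolve the backend from it.
type LlmProvider = "claude" | "gemini";

interface FleetMemberConfig {
  name: string;
  llmProvider: LlmProvider; // which LLM backend this member uses
}

interface BackendInfo {
  cli: string;        // CLI binary execute_prompt would invoke (assumed names)
  authEnvVar: string; // env var provision_auth would populate
}

function resolveBackend(member: FleetMemberConfig): BackendInfo {
  switch (member.llmProvider) {
    case "claude":
      return { cli: "claude", authEnvVar: "ANTHROPIC_API_KEY" };
    case "gemini":
      return { cli: "gemini", authEnvVar: "GEMINI_API_KEY" };
  }
}

// Members with different providers can then coexist in one fleet:
const fleet: FleetMemberConfig[] = [
  { name: "planner", llmProvider: "claude" },
  { name: "reviewer", llmProvider: "gemini" },
];
```

With a shape like this, the update_claude generalization falls out naturally: an update_agent_cli tool would look up resolveBackend(member).cli instead of hard-coding the Claude binary.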
Ticket changed by: kumaakh