An OpenAI-compatible chat pane for Spyder 6.x.
Supports OpenAI, Ollama, LM Studio, and any other server that exposes an OpenAI-compatible /v1/chat/completions endpoint.
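For readers unfamiliar with the wire format, this is a minimal sketch of the kind of request such a server accepts. It uses only the Python standard library; the base URL, model name, and helper names are illustrative, not part of the plugin's API:

```python
import json
from urllib import request

def build_chat_request(base_url, model, messages, api_key=None):
    """Build an OpenAI-style chat completions request (URL, headers, JSON body)."""
    headers = {"Content-Type": "application/json"}
    if api_key:  # local servers (Ollama, LM Studio) typically need no key
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return f"{base_url}/v1/chat/completions", headers, body

def chat(base_url, model, messages, api_key=None):
    """Send the request and return the assistant's reply text."""
    url, headers, body = build_chat_request(base_url, model, messages, api_key)
    req = request.Request(url, data=body, headers=headers)
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Any server that answers `chat("http://localhost:11434", "llama3", [{"role": "user", "content": "hi"}])` in this shape should work with the plugin.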
Install from PyPI into the same environment as the Spyder IDE:
(spyder) $ pip install spyder-ai-chat
Or from source:
# clone / download and unzip the project source code, then:
(spyder) $ cd spyder_ai_chat
(spyder) $ pip install -e .
Features
- Chat UI | Scrollable conversation history with colour-coded user / assistant bubbles
- Streaming | Token-by-token streaming so you see the reply as it's generated
- System prompt | Optional system prompt field to set the assistant's persona
- Model selector | Editable combo-box — switch models without leaving Spyder
- Configurable URL | Works with OpenAI cloud, Ollama, LM Studio, etc.
- Optional API key | Leave blank for local models that don't require authentication
- Stop button | Cancel streaming mid-reply
- Clear | Reset conversation history with one click
- Auto-complete | Editor fill-in-the-middle (FIM) completion backed by local or cloud LLMs
- Agentic mode | Fully autonomous mode in which the LLM can read, edit, and delete files and directories (with optional confirmation), integrated with the Spyder IDE editor
- Token stats | Window showing token-utilization statistics
- Git bar | Git toolbar with LLM-assisted git command support
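The streaming feature above relies on the server-sent-event format that OpenAI-compatible servers emit when `stream: true` is set. A small sketch of how such a stream can be decoded token by token (the function name is illustrative, not taken from the plugin's source):

```python
import json

def iter_stream_tokens(lines):
    """Yield content deltas from OpenAI-style SSE lines, as produced by
    /v1/chat/completions when the request sets "stream": true."""
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data: "):
            continue  # skip keep-alives and blank separators
        data = line[len("data: "):]
        if data == "[DONE]":  # sentinel that terminates the stream
            break
        delta = json.loads(data)["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Example wire data as sent by a compatible server:
sse = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_stream_tokens(sse)))  # prints "Hello"
```

Rendering each yielded delta as it arrives is what makes the reply appear token by token, and stopping the iteration early is enough to implement a Stop button.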
License
MIT License