Browse free open source Python MCP Gateways and projects below. Use the toggles on the left to filter open source Python MCP Gateways by OS, license, language, programming language, and project status.

  • 1
    Agent Stack

    Deploy and share agents with open infrastructure

    Agent Stack is an open infrastructure platform designed to take AI agents from prototype to production, regardless of how they were built. It includes a runtime environment, a multi-tenant web UI, an agent catalog, and a deployment flow intended to avoid vendor lock-in and give teams greater autonomy. Under the hood it is built on the Agent2Agent (A2A) protocol, enabling interoperability across different agent ecosystems, runtime services, and frameworks. Agents built with frameworks such as LangChain or CrewAI can be hosted, managed, and shared through a unified interface. It also offers multi-model, multi-provider support (OpenAI, Anthropic, Gemini, IBM WatsonX, Ollama, etc.), letting users compare performance and cost across models. For developers and organizations building AI-agent products or automations, Agent Stack provides scaffolding that handles the plumbing so they can focus on application logic and domain concerns.
    Downloads: 1 This Week
    See Project
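    Because the A2A wire format is plain JSON-RPC over HTTP, the sketch below shows roughly what sending a message to an agent hosted behind an A2A-compatible runtime like Agent Stack could look like. The gateway URL, agent path, and exact method and payload shape are assumptions based on a reading of the A2A specification rather than Agent Stack's documented API, and should be verified against the current spec and the project's docs.

    ```python
    import requests

    # Hypothetical endpoint for an agent hosted on an A2A-compatible runtime;
    # the real URL and path depend on how Agent Stack exposes the agent.
    AGENT_URL = "http://localhost:8000/agents/research-assistant"

    # JSON-RPC request following (one reading of) the A2A "message/send" method;
    # the method name and params shape should be checked against the A2A spec.
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "parts": [{"kind": "text", "text": "Summarize today's AI news."}],
            }
        },
    }

    response = requests.post(AGENT_URL, json=payload, timeout=60)
    response.raise_for_status()
    print(response.json())
    ```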
  • 2
    ContextForge MCP Gateway

    A Model Context Protocol (MCP) Gateway & Registry

    MCP Context Forge is a feature-rich gateway and registry that federates Model Context Protocol (MCP) servers and traditional REST services behind a single, governed endpoint. It exposes an MCP-compliant interface to clients while handling discovery, authentication, rate limiting, retries, and observability on the server side. The gateway scales horizontally, supports multi-cluster deployments on Kubernetes, and uses Redis for federation and caching across instances. Operators can define virtual servers, wire multiple transports, and optionally enable an admin UI for management and monitoring. Packaged for quick starts via PyPI and Docker, it targets production reliability with health checks, metrics, and structured logs. The project positions itself as an integration hub so agentic apps can “connect once, use many” backends with consistent policy and lifecycle control.
    Downloads: 0 This Week
    See Project
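    Because the gateway exposes a standard MCP-compliant interface, any MCP client should be able to connect to one of its virtual servers. Below is a minimal sketch using the official `mcp` Python SDK over SSE; the gateway URL and virtual-server path are hypothetical placeholders, and the actual endpoint layout and authentication requirements should be taken from the ContextForge documentation.

    ```python
    import asyncio

    from mcp import ClientSession
    from mcp.client.sse import sse_client

    # Hypothetical gateway endpoint; ContextForge's actual path scheme for
    # virtual servers may differ (see the project docs).
    GATEWAY_SSE_URL = "http://localhost:4444/servers/my-virtual-server/sse"

    async def main() -> None:
        # Open an SSE transport to the gateway, then run the MCP handshake.
        async with sse_client(GATEWAY_SSE_URL) as (read_stream, write_stream):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                # List the tools the gateway federates behind this virtual server.
                tools = await session.list_tools()
                for tool in tools.tools:
                    print(tool.name, "-", tool.description)

    asyncio.run(main())
    ```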
  • 3
    LiteLLM

    Lightweight package to simplify LLM API calls

    Call all LLM APIs using the OpenAI format (Anthropic, Hugging Face, Cohere, Azure OpenAI, etc.). LiteLLM supports streaming the model response back: pass stream=True to receive a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Hugging Face models.
    Downloads: 0 This Week
    See Project
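    A minimal streaming call through LiteLLM's OpenAI-style interface is sketched below. The model name is only an example of the provider/model strings LiteLLM accepts, and provider credentials are assumed to be set as environment variables (e.g. OPENAI_API_KEY).

    ```python
    import litellm

    # Credentials are read from environment variables (e.g. OPENAI_API_KEY);
    # swap the model string for any provider/model LiteLLM supports.
    response = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Explain what an MCP gateway does in one sentence."}],
        stream=True,  # return a streaming iterator of OpenAI-style chunks
    )

    for chunk in response:
        # Each chunk mirrors the OpenAI streaming delta format.
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="")
    print()
    ```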