LLM Gateways for Windows


Browse free open source LLM gateways and related projects for Windows below. Open source LLM gateways can be filtered by OS, license, language, programming language, and project status.

  • 1
    Bifrost

    The Fastest LLM Gateway with built-in OTel observability

    Bifrost is an LLM gateway that provides a unified, OpenAI-compatible API front end for many different model providers. It abstracts away the complexity of working directly with multiple backend providers (OpenAI, Anthropic, AWS Bedrock, Google Vertex, etc.), letting you plug in providers and switch between them without touching your client code (a minimal client sketch appears after this listing). It is built for high performance: in benchmark tests at 5,000 requests per second, it reportedly adds only microseconds of overhead while completing every request successfully. Bifrost supports automatic fallback (failover between providers), load balancing across API keys and providers, and semantic caching to reduce latency and cost. It also includes built-in observability (metrics, tracing, and logging) and governance features such as rate limiting, access control, and cost budgeting. The architecture is modular, with a core engine, plugin layers, and transport layers (HTTP APIs).
    Downloads: 3 This Week
  • 2
    Kong

    The Cloud-Native API Gateway

    Kong is a next-generation, cloud-native API platform for multi-cloud and hybrid organizations. When building for the web, mobile, or the Internet of Things, you need common functionality to run your software, and Kong provides it. Kong acts as a gateway, connecting microservice requests and APIs natively while also providing load balancing, logging, monitoring, authentication, rate limiting, and much more through plugins (an Admin API sketch appears after this listing). Kong is highly extensible as well as platform agnostic, connecting APIs across different environments, platforms, and patterns. Achieve architectural freedom with Kong today.
    Downloads: 1 This Week
  • 3
    MagicAPI AI Gateway

    Built for demanding AI workflows

    Billed as the world's fastest AI gateway proxy, MagicAPI AI Gateway is written in Rust and optimized for maximum performance. This high-performance API gateway routes requests to various AI providers (OpenAI, Groq) with streaming support, making it well suited to developers who need reliable, blazing-fast AI API access (a streaming sketch appears after this listing).
    Downloads: 1 This Week
  • 4
    APIPark

    APIPark is the #1 open-source AI Gateway and Developer Portal

    APIPark is an open-source, all-in-one AI gateway and API developer portal that helps developers and enterprises easily manage, integrate, and deploy AI services. No matter which AI model you use, APIPark provides a one-stop integration solution: it unifies the management of authentication credentials, tracks the cost of API calls, and standardizes the request data format across all AI models, so switching models or modifying prompts does not affect your apps or microservices, simplifying AI usage and reducing maintenance costs. You can quickly combine AI models and prompts into new APIs; for example, with OpenAI GPT-4 and custom prompts you can create sentiment analysis, translation, or data analysis APIs. API lifecycle management standardizes how APIs are managed, including traffic forwarding, load balancing, and versioning of publicly accessible APIs, which improves API quality and maintainability.
    Downloads: 0 This Week
  • 5
    Chat Nio

    Next Generation AI One-Stop Internationalization Solution

    Chat Nio is described as a next-generation, all-in-one AI platform that serves as an end-to-end solution for both B2B and B2C use cases. It supports dozens of underlying AI providers (OpenAI, Claude, Stable Diffusion, DALL·E, Midjourney, many Chinese models, and more), giving users flexibility in selecting and switching backends. It offers a full stack: model management, channel/provider integration, a model marketplace, caching, subscription and billing support, dashboard analytics, and a web/admin UI. The platform supports model caching, so repeated or similar queries may be accelerated, and provides elastic billing and subscription mechanisms for monetizing usage. Chat Nio (also referred to as CoAI) additionally supports "cloud sync," allowing user settings and data to sync across deployments, and "preset" configurations to streamline the user experience.
    Downloads: 0 This Week
  • 6
    LangDB AI Gateway

    Govern, secure, and optimize your AI traffic

    AI Gateway is a high-performance, open-source API gateway optimized for managing and monitoring LLM traffic at scale. Developed by the LangDB team, AI Gateway acts as an intermediary between clients and backend LLMs, providing advanced features like caching, rate limiting, prompt management, and observability. It helps teams secure and optimize their LLM deployments, whether using local models or external APIs like OpenAI or Anthropic. With native support for multi-tenant environments and low-latency inference routing, AI Gateway is an essential tool for companies building production-grade generative AI services.
    Downloads: 0 This Week
  • 7
    Portkey AI Gateway

    A blazing-fast AI Gateway with integrated guardrails

    Portkey AI Gateway aims to offer a blazing-fast, secure, and flexible gateway for interacting with a wide variety of models while enforcing guardrails. It presents a single, friendly API through which you can route to 200+ LLMs, applying configurable input/output guardrails to enforce policies or restrict certain content (a routing sketch appears after this listing). It supports automatic retries, fallbacks, load balancing across providers or keys, and request timeouts to avoid latency spikes. The gateway is multimodal: it can handle text, vision, audio, and image models under a common interface. It also offers governance features: role-based access control, compliance with standards such as SOC 2, HIPAA, and GDPR, secure key management, and logging/analytics of usage, latency, errors, and cost. The system integrates with agent frameworks such as LangChain and AutoGen, enabling more complex AI applications to be built on top of it. It is lightweight and optimized for low latency with a small footprint.
    Downloads: 0 This Week
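Most of the gateways above expose an OpenAI-compatible endpoint, so a standard OpenAI client can reach any configured provider just by changing its base URL. Below is a minimal sketch of that pattern against a locally running Bifrost instance; the base URL, port, placeholder API key, and model name are assumptions for illustration, not Bifrost's documented defaults.

```python
# Minimal sketch: point the standard OpenAI Python SDK at an
# OpenAI-compatible LLM gateway such as Bifrost running locally.
# The base_url, api_key placeholder, and model name are assumed values;
# check the gateway's documentation for the real ones.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed local gateway endpoint
    api_key="gateway-managed",            # provider keys live in the gateway, not the client
)

response = client.chat.completions.create(
    model="claude-3-5-sonnet",  # the gateway maps this name to a configured provider
    messages=[{"role": "user", "content": "Summarize what an LLM gateway does."}],
)
print(response.choices[0].message.content)
```

Switching or failing over between providers then becomes a gateway-side configuration change; the client code above stays untouched.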
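Kong's plugin-based model (entry 2) is usually driven by registering services, routes, and plugins. The rough sketch below assumes a local Kong instance with its Admin API on the default port 8001; the service name, upstream URL, route path, and rate limit are examples only.

```python
# Rough sketch: register an upstream LLM API as a Kong service, expose it
# on a route, and throttle it with the rate-limiting plugin.
# Assumes a local Kong Admin API at :8001; names and limits are illustrative.
import requests

ADMIN = "http://localhost:8001"

# 1. Register the upstream service (the URL is an example).
requests.post(f"{ADMIN}/services", json={
    "name": "llm-backend",
    "url": "https://api.openai.com/v1",
}).raise_for_status()

# 2. Expose the service on a gateway path.
requests.post(f"{ADMIN}/services/llm-backend/routes", json={
    "name": "llm-route",
    "paths": ["/llm"],
}).raise_for_status()

# 3. Attach the rate-limiting plugin to the service.
requests.post(f"{ADMIN}/services/llm-backend/plugins", json={
    "name": "rate-limiting",
    "config": {"minute": 60},
}).raise_for_status()
```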
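MagicAPI AI Gateway (entry 3) highlights streaming support for proxied providers. The sketch below shows how a streamed completion would be consumed through such an OpenAI-compatible proxy; the gateway address, port, and model are assumptions, since the project's actual defaults are not given above.

```python
# Sketch: stream tokens through an OpenAI-compatible proxy such as the
# MagicAPI AI Gateway. The base_url, api_key placeholder, and model are assumed.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="proxy-managed")

stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Stream a two-sentence greeting."}],
    stream=True,  # the proxy forwards the provider's streamed chunks
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```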
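Portkey AI Gateway (entry 7) keeps the same OpenAI-style request shape while choosing the upstream provider per request. The sketch below illustrates that idea with a provider-selection header; the local port, the /v1 path, the header name, and the model identifiers are assumptions based on typical Portkey usage rather than details stated in the listing.

```python
# Sketch: route identical OpenAI-style requests to different upstream providers
# through a local Portkey AI Gateway by switching one header. The port (8787),
# the /v1 path, the "x-portkey-provider" header, and model names are assumed.
import os
from openai import OpenAI

def gateway_client(provider: str, api_key: str) -> OpenAI:
    """Build an OpenAI client whose traffic goes through the gateway."""
    return OpenAI(
        base_url="http://localhost:8787/v1",             # assumed local gateway address
        api_key=api_key,                                  # the upstream provider's own key
        default_headers={"x-portkey-provider": provider},
    )

# Same call shape, two different upstream providers.
for provider, key_env, model in [
    ("openai", "OPENAI_API_KEY", "gpt-4o-mini"),
    ("anthropic", "ANTHROPIC_API_KEY", "claude-3-5-haiku-latest"),
]:
    client = gateway_client(provider, os.environ.get(key_env, "test-key"))
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "One sentence on why guardrails matter."}],
    )
    print(provider, "->", reply.choices[0].message.content)
```

Retries, fallbacks, and guardrail checks would then be configured on the gateway side, so every client gets them without code changes.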