Showing 3 open source projects for "llm api"

  • 1
    MagicAPI AI Gateway

    Built for demanding AI workflows

    Billed as the world's fastest AI Gateway proxy, written in Rust and optimized for maximum performance. This high-performance API gateway routes requests to various AI providers (OpenAI, GROQ) with streaming support, making it well suited for developers who need reliable, low-latency access to AI APIs. A usage sketch follows the listings below.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 2
    APIPark

    APIPark is the #1 open-source AI Gateway and Developer Portal

    APIPark is an open-source, all-in-one AI gateway and API developer portal that helps developers and enterprises manage, integrate, and deploy AI services. No matter which AI model you use, APIPark provides a one-stop integration solution: it unifies the management of authentication credentials, tracks the cost of API calls, and standardizes the request data format across all AI models. Switching AI models or modifying prompts therefore won’t affect your app or microservices,... A hypothetical integration sketch follows the listings below.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 3
    Kong

    The Cloud-Native API Gateway

    Kong is a next-generation cloud-native API platform for multi-cloud and hybrid organizations. When building for the web, mobile, or the Internet of Things, you need common functionality to run your software, and Kong provides it. Kong acts as a gateway, connecting microservice requests and APIs natively while also providing load balancing, logging, monitoring, authentication, rate limiting, and much more through plugins. Kong is highly extensible as well as platform agnostic,... A configuration sketch follows the listings below.
    Downloads: 0 This Week
    Last Update:
    See Project
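For MagicAPI AI Gateway, a minimal sketch of what routing an OpenAI-style chat request through a locally running gateway could look like. The gateway address, proxy path, and key handling are illustrative assumptions, not documented values.

```python
# Minimal sketch: call an OpenAI-compatible chat endpoint through a local
# MagicAPI AI Gateway instance. The gateway address (localhost:3000) and the
# /v1/chat/completions proxy path are assumptions for illustration only.
import os
import requests

GATEWAY_URL = "http://localhost:3000"  # assumed local gateway address

payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,  # the gateway advertises streaming support
}

# The gateway forwards the request (and the provider API key) upstream to OpenAI.
resp = requests.post(
    f"{GATEWAY_URL}/v1/chat/completions",  # assumed proxy path
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json=payload,
    stream=True,
    timeout=60,
)
resp.raise_for_status()

# Print server-sent-event chunks as they arrive from the stream.
for line in resp.iter_lines(decode_unicode=True):
    if line:
        print(line)
```

Because the gateway sits in front of the provider, swapping OpenAI for GROQ would, under this assumption, only change the target path and key, not the calling code.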
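For APIPark, a hypothetical sketch of the "one-stop integration" idea described above: the application talks to a single gateway route with one gateway-issued key, while provider credentials, request format, and cost tracking stay on the gateway side. The host, path, and header names here are assumptions, not APIPark's documented API.

```python
# Hypothetical sketch of calling models through an APIPark-style unified
# gateway. Host, route, and key names are illustrative assumptions.
import requests

GATEWAY = "http://apipark.example.internal"  # assumed gateway host
GATEWAY_KEY = "apipark-issued-key"           # single key managed by the portal

def ask(model: str, prompt: str) -> dict:
    """Send a prompt; switching `model` should not change the calling code."""
    resp = requests.post(
        f"{GATEWAY}/ai/chat",  # hypothetical unified route
        headers={"Authorization": f"Bearer {GATEWAY_KEY}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# The same call works whether the portal maps `model` to OpenAI, a local
# model, or another provider; cost tracking happens on the gateway side.
print(ask("gpt-4o-mini", "Summarize APIPark in one sentence."))
```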
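For Kong, a minimal sketch against its Admin API (default port 8001), assuming a local Kong instance: register an upstream service, expose it on a route, and attach the bundled rate-limiting plugin. The upstream URL and object names are placeholders.

```python
# Minimal sketch against Kong's Admin API, assuming Kong runs locally with
# the Admin API on its default port 8001.
import requests

ADMIN = "http://localhost:8001"

# 1. Register the upstream service Kong should proxy to (placeholder URL).
requests.post(
    f"{ADMIN}/services",
    json={"name": "llm-backend", "url": "http://llm-backend.internal:8080"},
    timeout=10,
).raise_for_status()

# 2. Expose the service under a public path.
requests.post(
    f"{ADMIN}/services/llm-backend/routes",
    json={"name": "llm-route", "paths": ["/llm"]},
    timeout=10,
).raise_for_status()

# 3. Rate-limit the service via a plugin, with no changes to the backend code.
requests.post(
    f"{ADMIN}/services/llm-backend/plugins",
    json={"name": "rate-limiting", "config": {"minute": 60, "policy": "local"}},
    timeout=10,
).raise_for_status()

# Client traffic now flows through Kong's proxy port, e.g. http://localhost:8000/llm
```

The same pattern extends to Kong's other bundled plugins (authentication, logging, monitoring) by changing the plugin name and config in step 3.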