Showing 2 open source projects for "llm api"

  • 1
    Kong

    The Cloud-Native API Gateway

    Kong is a next-generation cloud-native API platform for multi-cloud and hybrid organizations. When building for the web, mobile, or the Internet of Things, you need common functionality to run your software, and Kong provides it. Kong acts as a gateway, connecting microservice requests and APIs natively while also providing load balancing, logging, monitoring, authentication, rate limiting, and much more through plugins. Kong is highly extensible as well as platform agnostic,...
    Downloads: 0 This Week
    Last Update:
    See Project
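    For illustration only, here is a minimal sketch of driving Kong's Admin API (served on port 8001 by default) from Python to register a service, a route, and the bundled rate-limiting plugin. The service name, upstream URL, and path below are hypothetical placeholders and are not part of the listing above.

    ```python
    import requests

    ADMIN = "http://localhost:8001"  # Kong Admin API (default port); adjust for your deployment

    # Register an upstream service behind the gateway (name and URL are placeholders).
    requests.post(f"{ADMIN}/services",
                  data={"name": "llm-api", "url": "http://upstream-llm:8080"}).raise_for_status()

    # Expose the service on a public path.
    requests.post(f"{ADMIN}/services/llm-api/routes",
                  data={"name": "llm-route", "paths[]": "/v1/chat"}).raise_for_status()

    # Enable one of Kong's bundled plugins, here rate limiting at 60 requests per minute.
    requests.post(f"{ADMIN}/services/llm-api/plugins",
                  data={"name": "rate-limiting", "config.minute": 60}).raise_for_status()
    ```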
  • 2
    MCP Hub

    An MCP client for Neovim that seamlessly integrates MCP servers

    mcphub.nvim is an MCP (Model Context Protocol) client plugin for Neovim that seamlessly integrates MCP servers into your editing workflow, with an intuitive interface for managing, testing, and using MCP servers alongside your favorite chat plugins. To create your first MCP-capable agent, you need only 6 lines of code. It works with any LangChain-supported LLM that supports tool calling (OpenAI, Anthropic, Groq, Llama, etc.). Explore MCP capabilities and generate starter code with the interactive code...
    Downloads: 3 This Week
    Last Update:
    See Project
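    As a rough sketch of what an MCP client does under the hood (independent of the Neovim plugin itself), the reference MCP Python SDK can launch an MCP server over stdio and list the tools it exposes. The server command below is an illustrative example, not something shipped with mcphub.nvim.

    ```python
    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch an example MCP server over stdio (the command is illustrative).
    params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-everything"],
    )

    async def main() -> None:
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()          # MCP handshake
                tools = await session.list_tools()  # discover the server's tools
                for tool in tools.tools:
                    print(f"{tool.name}: {tool.description}")

    asyncio.run(main())
    ```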