Showing 2 open source projects for "work manage"

  • 1
    Gemini Fullstack LangGraph Quickstart

    Get started with building fullstack agents using Gemini 2.5 and LangGraph

    ...The repository provides both a browser-based chat interface and a command-line script (cli_research.py) for executing research queries directly. For production deployment, the backend integrates with Redis and PostgreSQL to manage persistent memory, streaming outputs, and background task coordination (a minimal LangGraph sketch of the agent-graph idea follows the listing).
    Downloads: 4 This Week
    Last Update:
    See Project
  • 2
    Sandstorm

    One API call to pull a Claude agent, completely sandboxed

    ...The core idea is to provide “one API call” access to a robust Claude agent loop that runs inside a secure sandbox: you can upload files, connect tools, and run long-running tasks, all managed behind a simple REST-style interface, with the sandbox torn down when the work is done (a hypothetical sketch of this call pattern follows the listing). This approach lowers the friction of building autonomous agents by removing the need to provision servers, orchestrate distributed agents, or manage persistent tooling; agents can be spun up in parallel without manual setup and shut down when complete. The sandbox environment isolates agent execution for security and predictability, and project updates continue to harden observability, fault handling, and configuration validation.
    Downloads: 1 This Week
    Last Update:
    See Project
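
The agent-graph architecture described for the Gemini Fullstack LangGraph Quickstart can be illustrated with a minimal sketch. The snippet below wires a single-node LangGraph graph to a Gemini model; the state shape, node name, model identifier, and the use of the langchain-google-genai package are assumptions made for illustration and are not taken from the repository itself.

    # Minimal LangGraph + Gemini sketch; names, state shape, and model are illustrative assumptions.
    from typing import TypedDict

    from langchain_google_genai import ChatGoogleGenerativeAI
    from langgraph.graph import StateGraph, START, END


    class ResearchState(TypedDict):
        question: str
        answer: str


    # Requires a GOOGLE_API_KEY environment variable; the model name is assumed.
    llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash")


    def answer_node(state: ResearchState) -> dict:
        # Ask Gemini to answer the research question and store the reply in state.
        reply = llm.invoke(state["question"])
        return {"answer": reply.content}


    builder = StateGraph(ResearchState)
    builder.add_node("answer", answer_node)
    builder.add_edge(START, "answer")
    builder.add_edge("answer", END)
    graph = builder.compile()

    if __name__ == "__main__":
        result = graph.invoke({"question": "What is LangGraph?", "answer": ""})
        print(result["answer"])

The actual quickstart builds a more elaborate research graph and, per the description above, layers Redis and PostgreSQL on top for persistence and background task coordination, but the graph-building and invocation flow follows the same idea.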
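
Similarly, the “one API call” pattern described for Sandstorm can be sketched as a single REST request. The endpoint URL, request fields, and response handling below are hypothetical placeholders chosen for illustration; they are not taken from Sandstorm's actual API, which should come from the project's own documentation.

    # Hypothetical sketch of the "one API call" pattern; the endpoint and
    # payload fields are placeholders, not Sandstorm's real API.
    import requests

    API_URL = "https://sandstorm.example.com/v1/agents"  # placeholder URL


    def run_sandboxed_task(prompt: str, api_key: str) -> dict:
        # Submit a task to a sandboxed agent and return the service's JSON reply.
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            json={"prompt": prompt},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()


    if __name__ == "__main__":
        print(run_sandboxed_task("Summarize the uploaded files", api_key="sk-example"))

The point of the pattern is that everything after the request, including sandbox provisioning, the agent loop, and teardown when the work is done, happens behind that single call.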