7 Integrations with NVIDIA Cloud Functions

View the software that integrates with NVIDIA Cloud Functions below. Compare the best NVIDIA Cloud Functions integrations by features, ratings, user reviews, and pricing. Here are the current NVIDIA Cloud Functions integrations in 2026:

  • 1
    Grafana (Grafana Labs)

    Grafana Labs provides an open and composable observability stack built around Grafana, the leading open source technology for dashboards and visualization. Recognized as a 2025 Gartner® Magic Quadrant™ Leader for Observability Platforms and positioned furthest to the right for Completeness of Vision, Grafana Labs supports over 25M users and 5,000+ customers—including Bloomberg, Citigroup, Dell Technologies, Salesforce, and TomTom. The LGTM Stack combines Grafana for visualization, Mimir for metrics, Loki for logs, and Tempo for traces. Grafana Cloud, the fully managed offering, accelerates time to value with turnkey solutions for Kubernetes monitoring, incident response, load testing, and more. It features Adaptive Metrics for cost-efficient data aggregation and native OpenTelemetry support. Built on open standards, Grafana empowers teams to visualize and correlate data from any source—without vendor lock-in—whether self-managed or in the cloud. Grafana Cloud scales with you, securely.
    Starting Price: $0
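
For a concrete sense of how Grafana might sit alongside NVIDIA Cloud Functions workloads, here is a minimal sketch that pushes a dashboard to a Grafana instance over its HTTP API from Python. The instance URL, service-account token, and dashboard title are illustrative placeholders, not values from this listing.

```python
# Minimal sketch: create or update a Grafana dashboard via the HTTP API.
# GRAFANA_URL and GRAFANA_TOKEN are placeholders for your own instance
# and service-account token.
import requests

GRAFANA_URL = "https://grafana.example.com"   # placeholder
GRAFANA_TOKEN = "glsa_..."                    # placeholder service-account token

payload = {
    "dashboard": {
        "id": None,                           # None lets Grafana assign an ID
        "title": "NVCF inference overview",   # illustrative title
        "panels": [],                         # add panels for your metrics
    },
    "overwrite": True,
}

resp = requests.post(
    f"{GRAFANA_URL}/api/dashboards/db",
    json=payload,
    headers={"Authorization": f"Bearer {GRAFANA_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # includes the uid and URL of the saved dashboard
```
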
  • 2
    Docker

    Docker takes away repetitive, mundane configuration tasks and is used throughout the development lifecycle for fast, easy, and portable application development on the desktop and in the cloud. Docker’s comprehensive end-to-end platform includes UIs, CLIs, APIs, and security features that are engineered to work together across the entire application delivery lifecycle. Get a head start on your coding by leveraging Docker images to efficiently develop your own applications on Windows and Mac. Create multi-container applications using Docker Compose. Integrate with your favorite tools throughout your development pipeline; Docker works with the development tools you already use, including VS Code, CircleCI, and GitHub. Package applications as portable container images that run consistently in any environment, from on-premises Kubernetes to AWS ECS, Azure ACI, Google GKE, and more. Leverage Docker Trusted Content, including Docker Official Images and images from Docker Verified Publishers.
    Starting Price: $7 per month
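
To make the build-and-run workflow described above concrete, here is a minimal sketch using the Docker SDK for Python (pip install docker). The image tag and command are illustrative placeholders, and the sketch assumes a Dockerfile exists in the current directory.

```python
# Minimal sketch: build an image and run a container with the Docker SDK
# for Python. Requires a local Docker daemon and a Dockerfile in ".".
import docker

client = docker.from_env()  # connects to the local Docker daemon

# Build an image from the Dockerfile in the current directory (placeholder tag).
image, _build_logs = client.images.build(path=".", tag="my-app:dev")

# Run the image, capture stdout, and remove the container when it exits.
# The command assumes the image has Python installed; it is a placeholder.
output = client.containers.run(image, command="python --version", remove=True)
print(output.decode().strip())
```
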
  • 3
    Kubernetes

    Kubernetes (K8s) is an open-source system for automating the deployment, scaling, and management of containerized applications. It groups the containers that make up an application into logical units for easy management and discovery. Kubernetes builds on 15 years of experience running production workloads at Google, combined with best-of-breed ideas and practices from the community. Designed on the same principles that allow Google to run billions of containers a week, Kubernetes can scale without increasing your ops team. Whether you are testing locally or running a global enterprise, Kubernetes' flexibility grows with you to deliver your applications consistently and easily, no matter how complex your needs are. Kubernetes is open source, giving you the freedom to take advantage of on-premises, hybrid, or public cloud infrastructure and letting you effortlessly move workloads to where they matter to you.
    Starting Price: Free
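
As an illustration of the deployment automation described above, here is a minimal sketch using the official Kubernetes Python client (pip install kubernetes) to create a two-replica Deployment. The deployment name, labels, and container image are placeholders chosen for this example.

```python
# Minimal sketch: create a Deployment with the official Kubernetes Python client.
# Assumes a working ~/.kube/config; all names and the image are placeholders.
from kubernetes import client, config

config.load_kube_config()  # use your local kubeconfig

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="inference-demo"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "inference-demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "inference-demo"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="server",
                        image="nginx:stable",  # placeholder container image
                        ports=[client.V1ContainerPort(container_port=80)],
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```
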
  • 4
    Datadog

    Datadog is the monitoring, security and analytics platform for developers, IT operations teams, security engineers and business users in the cloud age. Our SaaS platform integrates and automates infrastructure monitoring, application performance monitoring and log management to provide unified, real-time observability of our customers' entire technology stack. Datadog is used by organizations of all sizes and across a wide range of industries to enable digital transformation and cloud migration, drive collaboration among development, operations, security and business teams, accelerate time to market for applications, reduce time to problem resolution, secure applications and infrastructure, understand user behavior and track key business metrics.
    Starting Price: $15.00/host/month
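
As a small example of how custom metrics from an inference workload might reach Datadog, here is a sketch using the datadog Python package's DogStatsD client. It assumes a Datadog Agent is listening locally on the default StatsD port; the metric names and tags are placeholders.

```python
# Minimal sketch: emit custom metrics to a local Datadog Agent via DogStatsD.
# Requires `pip install datadog` and an Agent listening on 127.0.0.1:8125.
from datadog import initialize, statsd

initialize(statsd_host="127.0.0.1", statsd_port=8125)

# Placeholder metric names and tags for an inference workload.
statsd.increment("inference.requests", tags=["function:example", "env:dev"])
statsd.gauge("inference.queue_depth", 3, tags=["function:example"])
statsd.histogram("inference.latency_ms", 42.0, tags=["function:example"])
```
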
  • 5
    Helm

    Helm is a software synthesizer; you use it to create electronic music on your computer. Helm runs on GNU/Linux, Mac OSX, and Windows, either as a standalone synthesizer or as an LV2, VST, VST3, or AU plugin, and comes in both 32-bit and 64-bit versions. Helm is free as in freedom: you control this software, it doesn't control you. You are free to run Helm anywhere without the pains of DRM, and you can study and change the source code and redistribute exact or modified copies of Helm. In terms of money, Helm is pay what you want, so you are free to pay nothing. Any sound that comes out of Helm belongs to the person who played it; you are the copyright holder of any sound you create with Helm. You can turn some modules on and off using the little power buttons in their top-left corners. The SUB module is one of the three sound producers in Helm; it controls a single oscillator that by default plays one octave below the currently played note.
  • 6
    NVIDIA DGX Cloud
    NVIDIA DGX Cloud offers a fully managed, end-to-end AI platform that leverages the power of NVIDIA’s advanced hardware and cloud computing services. This platform allows businesses and organizations to scale AI workloads seamlessly, providing tools for machine learning, deep learning, and high-performance computing (HPC). DGX Cloud integrates seamlessly with leading cloud providers, delivering the performance and flexibility required to handle the most demanding AI applications. This service is ideal for businesses looking to enhance their AI capabilities without the need to manage physical infrastructure.
  • 7
    NVIDIA DGX Cloud Serverless Inference
    NVIDIA DGX Cloud Serverless Inference is a high-performance, serverless AI inference solution that accelerates AI innovation with auto-scaling, cost-efficient GPU utilization, multi-cloud flexibility, and seamless scalability. With NVIDIA DGX Cloud Serverless Inference, you can scale down to zero instances during periods of inactivity to optimize resource utilization and reduce costs. There's no extra cost for cold-boot start times, and the system is optimized to minimize them. NVIDIA DGX Cloud Serverless Inference is powered by NVIDIA Cloud Functions (NVCF), which offers robust observability features. It allows you to integrate your preferred monitoring tools, such as Splunk, for comprehensive insights into your AI workloads. NVCF offers flexible deployment options for NIM microservices while allowing you to bring your own containers, models, and Helm charts.
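
To ground the description above, here is a hedged sketch of calling a deployed function over HTTPS from Python. The invocation URL pattern, function ID, and request body below are assumptions made for illustration; check the NVIDIA Cloud Functions documentation for your function's actual endpoint and payload schema.

```python
# Hedged sketch: invoke a deployed NVIDIA Cloud Function over HTTPS.
# The endpoint path, function ID, and payload shape are assumptions for
# illustration only; consult the NVCF docs for the real values.
import os
import requests

FUNCTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder function ID
URL = f"https://api.nvcf.nvidia.com/v2/nvcf/pexec/functions/{FUNCTION_ID}"  # assumed path

resp = requests.post(
    URL,
    headers={
        "Authorization": f"Bearer {os.environ['NGC_API_KEY']}",  # your NGC API key
        "Accept": "application/json",
    },
    json={"prompt": "hello"},  # payload is function-specific; placeholder only
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```
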