
About Amazon Elastic Inference

Amazon Elastic Inference lets you attach low-cost, GPU-powered acceleration to Amazon EC2 instances, Amazon SageMaker instances, and Amazon ECS tasks, reducing the cost of running deep learning inference by up to 75%. It supports TensorFlow, Apache MXNet, PyTorch, and ONNX models.

Inference is the process of making predictions with a trained model, and in deep learning applications it can account for up to 90% of total operational costs, for two reasons. First, standalone GPU instances are designed for model training, not inference: training jobs batch-process hundreds of data samples in parallel, whereas inference jobs usually process a single input in real time and therefore consume only a small fraction of the GPU's compute, making standalone GPU inference cost-inefficient. Second, standalone CPU instances are not specialized for matrix operations and are often too slow for deep learning inference.
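
To make the attachment model concrete, here is a rough sketch of pairing a cheap CPU host with an Elastic Inference accelerator when configuring a SageMaker endpoint via boto3. The model name, config name, and instance/accelerator sizes are placeholder assumptions; the sketch only builds the request dictionary rather than calling AWS.

```python
# Hedged sketch: an Elastic Inference accelerator is attached per production
# variant via the AcceleratorType field of a SageMaker endpoint config.
# All names and sizes below are illustrative placeholders.
def build_endpoint_config(model_name: str, accelerator: str = "ml.eia2.medium") -> dict:
    """Return a CreateEndpointConfig-style request that pairs a low-cost
    CPU instance with an attached Elastic Inference accelerator."""
    return {
        "EndpointConfigName": f"{model_name}-eia-config",
        "ProductionVariants": [
            {
                "VariantName": "AllTraffic",
                "ModelName": model_name,
                "InitialInstanceCount": 1,
                "InstanceType": "ml.c5.large",   # inexpensive CPU host
                "AcceleratorType": accelerator,  # EI accelerator attached to it
            }
        ],
    }

config = build_endpoint_config("my-tf-model")
# A real deployment would pass this to the SageMaker API, e.g.:
#   boto3.client("sagemaker").create_endpoint_config(**config)
print(config["ProductionVariants"][0]["AcceleratorType"])
```

The point of the split is visible in the config: the host instance handles the application and pre/post-processing on CPU, while the accelerator supplies just enough GPU compute for the forward pass.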

About NVIDIA DGX Cloud Serverless Inference

NVIDIA DGX Cloud Serverless Inference is a high-performance, serverless AI inference solution offering auto-scaling, cost-efficient GPU utilization, multi-cloud flexibility, and seamless scalability. It can scale down to zero instances during periods of inactivity to optimize resource utilization and reduce costs; there is no extra charge for cold-boot start times, and the system is optimized to minimize them. The service is powered by NVIDIA Cloud Functions (NVCF), which provides robust observability and lets you integrate your preferred monitoring tools, such as Splunk, for comprehensive insight into your AI workloads. NVCF offers flexible deployment options for NIM microservices and also lets you bring your own containers, models, and Helm charts.
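
As a rough sketch of how a client might invoke a function deployed through NVCF over HTTPS: the endpoint URL pattern, header names, function ID, and payload shape below are all assumptions for illustration, not confirmed API details, and the sketch assembles the request rather than sending it.

```python
import json

# Hypothetical sketch of an NVCF-style function invocation. The URL
# pattern, headers, and IDs are illustrative assumptions only.
def build_invoke_request(function_id: str, api_key: str, payload: dict) -> dict:
    """Assemble the pieces of an HTTPS request that would invoke a
    deployed serverless inference function by its function ID."""
    return {
        "method": "POST",
        "url": f"https://api.nvcf.nvidia.com/v2/nvcf/pexec/functions/{function_id}",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # NGC API key (placeholder)
            "Content-Type": "application/json",
        },
        "body": json.dumps({"inputs": payload}),
    }

req = build_invoke_request("example-function-id", "NGC_API_KEY", {"prompt": "hello"})
# A real client would send this with any HTTP library, e.g. requests.post(...)
print(req["method"], req["url"])
```

Because the platform scales to zero, a request like this may hit a cold instance; the serverless layer, not the client, is responsible for spinning capacity back up.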

Platforms Supported (Amazon Elastic Inference)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported (NVIDIA DGX Cloud Serverless Inference)

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (Amazon Elastic Inference)

IT teams that need an advanced Infrastructure as a Service solution

Audience (NVIDIA DGX Cloud Serverless Inference)

Enterprises requiring a solution for deploying AI inference workloads across multi-cloud environments without the complexity of managing underlying infrastructure

Support (Amazon Elastic Inference)

Phone Support
24/7 Live Support
Online

Support (NVIDIA DGX Cloud Serverless Inference)

Phone Support
24/7 Live Support
Online

API (Amazon Elastic Inference)

Offers API

API (NVIDIA DGX Cloud Serverless Inference)

Offers API

Pricing (Amazon Elastic Inference)

No information available.

Pricing (NVIDIA DGX Cloud Serverless Inference)

No information available.

Training (Amazon Elastic Inference)

Documentation
Webinars
Live Online
In Person

Training (NVIDIA DGX Cloud Serverless Inference)

Documentation
Webinars
Live Online
In Person

Company Information (Amazon Elastic Inference)

Amazon
Founded: 2006
United States
aws.amazon.com/machine-learning/elastic-inference/

Company Information (NVIDIA DGX Cloud Serverless Inference)

NVIDIA
Founded: 1993
United States
developer.nvidia.com/dgx-cloud/serverless-inference

Alternatives

AWS Neuron (Amazon Web Services)

Integrations

Amazon Web Services (AWS)
Amazon EC2
Amazon EC2 G4 Instances
CoreWeave
Google Cloud Platform
Helm
Llama
MXNet
Microsoft Azure
NVIDIA AI Foundations
NVIDIA Cloud Functions
NVIDIA DGX Cloud
NVIDIA NIM
Nebius
Oracle Cloud Infrastructure
PyTorch
Splunk Cloud Platform
TensorFlow
Yotta
