MicroStack (Canonical) vs. NVIDIA Magnum IO (NVIDIA)
About MicroStack
Install and run OpenStack on Linux in minutes. MicroStack is an upstream, multi-node OpenStack deployment delivered as a single snap package, and it can run directly on your workstation. Although made for developers, it is also suitable for edge, IoT, and appliances. Grab MicroStack from the Snap Store and get a full OpenStack system running in minutes. It runs safely on your laptop with state-of-the-art isolation and delivers pure upstream OpenStack, including all key components: Keystone, Nova, Neutron, Glance, and Cinder. Everything you are likely to want to try on a small, standard OpenStack is built in, so you can use MicroStack in your CI/CD pipelines and get on with your day. MicroStack requires at least 8 GB of RAM and a multi-core processor.
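The quick start described above amounts to a handful of commands. A minimal sketch follows, assuming a Linux host with snapd; the channel (`--beta`) and flags reflect how MicroStack was documented at the time and may have changed since, so it is guarded behind an opt-in environment variable rather than run unconditionally:

```shell
# Opt-in guard: this installs system software, so only run when
# RUN_MICROSTACK_INSTALL=1 is set explicitly.
if [ "${RUN_MICROSTACK_INSTALL:-0}" = "1" ]; then
    # Install the MicroStack snap from the beta channel.
    sudo snap install microstack --beta --devmode
    # Initialise a single-node control plane non-interactively.
    sudo microstack init --auto --control
    # Launch a test VM from the bundled CirrOS image.
    microstack launch cirros --name test-instance
fi
```

After `init` completes, the Horizon dashboard and the usual OpenStack CLI endpoints are available locally.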
About NVIDIA Magnum IO
NVIDIA Magnum IO is the architecture for parallel, intelligent data center I/O. It maximizes storage, network, and multi-node, multi-GPU communications for the world's most important applications, such as large language models, recommender systems, imaging, simulation, and scientific research. Magnum IO combines storage I/O, network I/O, in-network compute, and I/O management to simplify and speed up data movement, access, and management for multi-GPU, multi-node systems. It supports NVIDIA CUDA-X libraries and makes the best use of a range of NVIDIA GPU and networking hardware topologies to achieve high throughput and low latency. In multi-GPU, multi-node systems, slow single-thread CPU performance sits in the critical path of data access from local or remote storage devices. With storage I/O acceleration, the GPU bypasses the CPU and system memory and accesses remote storage directly via 8x 200 Gb/s NICs, achieving up to 1.6 Tb/s of raw storage bandwidth.
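The quoted aggregate is simple to check: 8 NICs at 200 Gb/s each come to 1.6 terabits (not terabytes) per second, which is 200 GB/s in bytes. A short sketch of the unit conversion:

```python
# Sanity-check the aggregate NIC bandwidth: 8 NICs x 200 Gb/s each.
nics = 8
gbps_per_nic = 200                  # gigabits per second per NIC

total_gbps = nics * gbps_per_nic    # aggregate, in gigabits per second
total_tbps = total_gbps / 1000      # in terabits per second
total_gBps = total_gbps / 8         # in gigabytes per second (8 bits/byte)

print(f"{total_gbps} Gb/s = {total_tbps} Tb/s = {total_gBps} GB/s")
# → 1600 Gb/s = 1.6 Tb/s = 200.0 GB/s
```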
Platforms Supported
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
Audience: MicroStack
Developers and IT teams interested in an upstream, multi-node OpenStack deployment for edge, IoT, and appliances

Audience: NVIDIA Magnum IO
AI researchers, data scientists, and HPC developers who need to eliminate I/O bottlenecks in multi-GPU, multi-node environments
Support
Phone Support
24/7 Live Support
Online
API
Offers API
Pricing
No information available.
Free Version
Free Trial
Training
Documentation
Webinars
Live Online
In Person
Company Information: Canonical
Founded: 2010
United Kingdom
microstack.run/
Company Information: NVIDIA
Founded: 1993
United States
www.nvidia.com/en-us/data-center/magnum-io/
Integrations
Airstack
Apache Spark
Chef
Cisco AnyConnect
Google Cloud Platform
IBM Cloud
Keystone
Kubernetes
NVIDIA NetQ
NVIDIA virtual GPU