Audience

Developers interested in large language models

About GPT-NeoX

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

This repository records EleutherAI's library for training large-scale language models on GPUs. Our current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed as well as some novel optimizations. We aim to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models, and accelerate research into large-scale training.
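As a sketch of how such training runs are typically configured, the following is an illustrative YAML fragment in the style of GPT-NeoX's config files (the actual keys and values in the repository's `configs/` directory may differ; this is an assumption-labeled example, not an official configuration):

```yaml
# Illustrative GPT-NeoX-style training config (hypothetical values).
# Degree of parallelism across GPUs:
"pipe-parallel-size": 1        # pipeline parallelism (DeepSpeed)
"model-parallel-size": 1       # tensor/model parallelism (Megatron-style)

# Model architecture (sized roughly like a small ~125M-parameter model):
"num-layers": 12
"hidden-size": 768
"num-attention-heads": 12
"seq-length": 2048
"max-position-embeddings": 2048
```

A run would then be launched through the repository's DeepSpeed-based launcher, along the lines of `python ./deepy.py train.py ./configs/<your-config>.yml` (the exact invocation may vary between versions; consult the repository's documentation).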

Pricing

Starting Price:
Free
Pricing Details:
Open source
Free Version:
Available.

Integrations


Company Information

EleutherAI
Founded: 2020
github.com/EleutherAI/gpt-neox


Product Details

Platforms Supported
SaaS
On-Premises
Training
Documentation

GPT-NeoX Frequently Asked Questions

Q: What kinds of users and organization types does GPT-NeoX work with?
Q: What languages does GPT-NeoX support in its product?
Q: What other applications or services does GPT-NeoX integrate with?
Q: What type of training does GPT-NeoX provide?
Q: How much does GPT-NeoX cost?

GPT-NeoX Product Features