Audience

Developers interested in large language models

About GPT-NeoX

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

This repository contains EleutherAI's library for training large-scale language models on GPUs. The current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed, as well as some novel optimizations. The aim is to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models and to accelerate research into large-scale training.
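For readers unfamiliar with the model-parallel approach mentioned above, the sketch below illustrates Megatron-style tensor parallelism in PyTorch: a transformer MLP block whose weight matrices are sharded across GPUs, with partial results combined by an all-reduce. The class names and structure are illustrative assumptions for this listing, not GPT-NeoX's actual API, and the sketch assumes torch.distributed has already been initialized (e.g., via torch.distributed.init_process_group).

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.distributed as dist


class ColumnParallelLinear(nn.Module):
    """Splits the output dimension of a linear layer across GPUs (hypothetical name)."""

    def __init__(self, in_features: int, out_features: int, world_size: int):
        super().__init__()
        assert out_features % world_size == 0
        # Each rank stores only its shard of the weight matrix.
        self.weight = nn.Parameter(torch.empty(out_features // world_size, in_features))
        nn.init.normal_(self.weight, std=0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The output is already partitioned along the feature dimension,
        # so no communication is needed in this forward pass.
        return x @ self.weight.t()


class RowParallelLinear(nn.Module):
    """Splits the input dimension across GPUs and all-reduces the partial sums (hypothetical name)."""

    def __init__(self, in_features: int, out_features: int, world_size: int):
        super().__init__()
        assert in_features % world_size == 0
        self.weight = nn.Parameter(torch.empty(out_features, in_features // world_size))
        nn.init.normal_(self.weight, std=0.02)

    def forward(self, x_shard: torch.Tensor) -> torch.Tensor:
        partial = x_shard @ self.weight.t()
        # Sum the partial results from all model-parallel ranks.
        dist.all_reduce(partial, op=dist.ReduceOp.SUM)
        return partial


class ParallelMLP(nn.Module):
    """Transformer MLP block in which each GPU stores and computes 1/world_size of the parameters."""

    def __init__(self, hidden: int, world_size: int):
        super().__init__()
        self.up = ColumnParallelLinear(hidden, 4 * hidden, world_size)
        self.down = RowParallelLinear(4 * hidden, hidden, world_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Column-parallel output shards feed the row-parallel input shards directly,
        # so the only communication is the single all-reduce inside `down`.
        return self.down(F.gelu(self.up(x)))

In the actual library this kind of sharding is combined with DeepSpeed features such as pipeline parallelism and ZeRO optimizer partitioning, which is what makes training at large scale practical on clusters of GPUs.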

Pricing

Starting Price: Free
Pricing Details: Open source
Free Version: Free version available

Company Information

EleutherAI
Founded: 2020
github.com/EleutherAI/gpt-neox

Product Details

Platforms Supported: Cloud, On-Premises
Training: Documentation

GPT-NeoX Frequently Asked Questions

Q: What kinds of users and organization types does GPT-NeoX work with?
Q: What languages does GPT-NeoX support in its product?
Q: What other applications or services does GPT-NeoX integrate with?
Q: What type of training does GPT-NeoX provide?
Q: How much does GPT-NeoX cost?

GPT-NeoX Product Features