An implementation of model- and data-parallel GPT-3-like models using the mesh-tensorflow library. If you're just here to play with our pre-trained models, we strongly recommend you try out the HuggingFace Transformers integration.

Training and inference are officially supported on TPU and should work on GPU as well. This repository will be (mostly) archived as we move focus to our GPU-specific repo, GPT-NeoX. Note that while GPT-Neo can technically run a training step at 200B+ parameters, it is very inefficient at those scales. This, along with the fact that many GPUs became available to us, prompted us to move development over to GPT-NeoX.

All evaluations were done using our evaluation harness. Some results for GPT-2 and GPT-3 are inconsistent with the values reported in the respective papers. We are currently investigating why, and would greatly appreciate feedback and further testing of our eval harness.
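As a minimal sketch of the HuggingFace route mentioned above (assuming the `transformers` library is installed; `EleutherAI/gpt-neo-125M` is the smallest published checkpoint, and the 1.3B/2.7B variants use the same interface):

```python
from transformers import pipeline

# Load the smallest GPT-Neo checkpoint; the larger variants
# (gpt-neo-1.3B, gpt-neo-2.7B) work identically but need more memory.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

out = generator("EleutherAI has", max_length=20, do_sample=True)
print(out[0]["generated_text"])
```

The pipeline downloads the checkpoint on first use and returns a list of dicts, each carrying the prompt plus the sampled continuation under `generated_text`.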

Features

  • Sign up for Google Cloud Platform, and create a storage bucket
  • You can also choose to train GPTNeo locally on your GPUs
  • Download one of our pre-trained models
  • Generating text is as simple as running the main.py script
  • Create your Tokenizer
  • Tokenize your dataset

License

MIT License


Additional Project Details

Programming Language

Python

Related Categories

Python Large Language Models (LLM), Python ChatGPT Apps, Python Generative AI, Python AI Models

Registered

2023-03-23