This repository contains the code and model weights for GPT-2, a large-scale unsupervised language model described in the OpenAI paper “Language Models are Unsupervised Multitask Learners.” The intent is to provide a starting point for researchers and engineers to experiment with GPT-2: generate text, fine-tune on custom datasets, explore model behavior, or study its internal workings. The repository includes scripts for sampling, training, and downloading pre-trained models, along with utilities for tokenization and model handling.
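
As a rough quick-start, text generation with a downloaded checkpoint typically looks like the sketch below. It assumes the 117M weights have already been fetched into models/117M (for example with the repository's download script) and that the repository's encoder, model, and sample modules are importable; those module names, the function signatures, and the TensorFlow 1.x session API are assumptions about the usual layout of this code base rather than a documented interface.

    # Generation sketch (module and function names are assumed, not guaranteed).
    # Run with the repository's source directory on PYTHONPATH and the 117M
    # checkpoint already downloaded into models/117M.
    import json
    import os

    import tensorflow as tf  # assumes the TensorFlow 1.x session API

    import encoder
    import model
    import sample

    model_name = "117M"
    models_dir = "models"

    # Byte-pair-encoding tokenizer plus the model's hyperparameters.
    enc = encoder.get_encoder(model_name, models_dir)
    hparams = model.default_hparams()
    with open(os.path.join(models_dir, model_name, "hparams.json")) as f:
        hparams.override_from_dict(json.load(f))

    with tf.Session(graph=tf.Graph()) as sess:
        context = tf.placeholder(tf.int32, [1, None])
        # Build a sampling graph that continues the prompt for 40 tokens
        # using top-k sampling.
        output = sample.sample_sequence(
            hparams=hparams,
            length=40,
            context=context,
            batch_size=1,
            temperature=1.0,
            top_k=40,
        )

        # Restore the pretrained weights from the downloaded checkpoint.
        saver = tf.train.Saver()
        ckpt = tf.train.latest_checkpoint(os.path.join(models_dir, model_name))
        saver.restore(sess, ckpt)

        prompt_tokens = enc.encode("The purpose of this repository is")
        out = sess.run(output, feed_dict={context: [prompt_tokens]})
        # Drop the prompt tokens and decode only the generated continuation.
        print(enc.decode(out[0, len(prompt_tokens):]))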

Features

  • Pretrained model weights for multiple GPT-2 sizes (e.g. 117M, 345M, up to 1.5B parameters)
  • Sampling / generation scripts (conditional, unconditional, interactive)
  • Tokenizer and encoding / decoding utilities (a short encode/decode sketch follows this list)
  • Training / fine-tuning script support (for smaller models)
  • Memory-saving gradient optimizations to reduce GPU memory use during training
  • Utilities to download / manage model checkpoints via script
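
The tokenizer utilities mentioned above (a byte-level BPE encoder) can be exercised on their own. The sketch below assumes an encoder module exposing get_encoder and the vocabulary files that ship alongside a downloaded checkpoint; both the module name and the function name are assumptions about the usual layout rather than a guaranteed API.

    # Tokenizer round trip (names assumed from the typical layout of this repo).
    import encoder

    # Loads the BPE vocabulary files stored with the 117M checkpoint.
    enc = encoder.get_encoder("117M", "models")

    text = "Language models are unsupervised multitask learners."
    token_ids = enc.encode(text)   # text -> list of integer BPE token ids
    print(token_ids)
    print(enc.decode(token_ids))   # token ids -> text

Because the encoding is byte-level BPE, decoding the encoded ids reproduces the original text exactly, which is useful when checking how a fine-tuning dataset will be tokenized.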

License

MIT License

Additional Project Details

Programming Language

Python

Related Categories

Python Artificial Intelligence Software

Registered

4 days ago