char-rnn is a classic codebase for training multi-layer recurrent neural networks on raw text, building character-level language models that learn to predict the next character in a sequence. It supports common recurrent architectures, including vanilla RNNs as well as LSTM and GRU variants, letting users compare behavior and output quality across model types. The workflow is straightforward: provide a single text file, train the model to minimize next-character prediction loss, then sample from the trained network to generate new text one character at a time in the style of the dataset. The project is designed for experimentation, offering tunable settings for depth, hidden size, dropout, sequence length, and sampling temperature to control creativity and coherence. It is frequently used as a learning project for understanding sequence modeling, recurrent training dynamics, and the practical details of text generation.
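The core idea — learn next-character statistics from a text file, then generate by repeatedly predicting the next character — can be illustrated with a tiny sketch. This is plain Python with a simple bigram count model, not the project's Lua/Torch RNN; it only demonstrates the prediction objective, and the function names are invented for this example.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each character, how often each possible next character follows it."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(text, text[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, ch):
    """Return the most frequently observed character following `ch`."""
    return counts[ch].most_common(1)[0][0]

model = train_bigram("hello hello hello")
print(predict_next(model, "h"))  # 'e' — the only character ever seen after 'h'
```

A real RNN replaces the count table with a learned hidden state, so predictions can depend on much more context than the single previous character.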

Features

  • Character-level language model training from a single text file
  • Multi-layer RNN, LSTM, and GRU architecture options
  • Text sampling and generation with controllable randomness
  • Configurable model depth, hidden size, and regularization
  • Training workflow designed for quick experimentation
  • Classic reference implementation for sequence modeling
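The "controllable randomness" listed above is conventionally implemented as a sampling temperature: the model's output scores are divided by a temperature before being normalized into probabilities, so low temperatures sharpen the distribution (more conservative text) and high temperatures flatten it (more creative, less coherent text). A minimal sketch of that mechanism, in plain Python rather than the project's Lua:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then normalize into a probability distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]                       # raw next-character scores
cold = softmax_with_temperature(logits, 0.5)   # sharper: the top choice dominates
hot = softmax_with_temperature(logits, 2.0)    # flatter: closer to uniform
print(cold[0], hot[0])
```

Sampling a character from the `cold` distribution almost always picks the top-scoring one; sampling from `hot` explores the alternatives far more often.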


Additional Project Details

Programming Language: Lua
Related Categories: Lua Neural Network Libraries
Registered: 2026-03-02