dLLM is an open-source framework designed to simplify the development, training, and evaluation of diffusion-based large language models. Unlike traditional autoregressive models that generate text sequentially token by token, diffusion language models generate text through an iterative denoising process that refines masked tokens over multiple steps. This approach allows models to reason over the entire sequence simultaneously and potentially produce more coherent outputs with bidirectional context. The project provides an integrated pipeline that standardizes how diffusion language models are trained, evaluated, and deployed, helping researchers reproduce experiments and compare results more easily. The framework includes scalable training infrastructure inspired by modern deep learning toolkits and supports integrations with widely used libraries for distributed training.
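The iterative denoising process described above can be sketched framework-agnostically: start from a fully masked sequence, and at each step let the model propose a token and a confidence for every position, committing only the most confident predictions so the rest are re-predicted with more context in later steps. This is a minimal toy sketch of that decoding loop, not dLLM's actual API; `toy_denoise` and the stub predictor are hypothetical illustrations standing in for a real diffusion language model.

```python
MASK = "<mask>"

def toy_denoise(length: int, steps: int, predict) -> list[str]:
    """Iteratively refine a fully masked sequence, diffusion-LM style.

    `predict` stands in for the model: given the current sequence, it
    returns one (token, confidence) pair per position.
    """
    seq = [MASK] * length
    proposals = predict(seq)
    for step in range(steps):
        proposals = predict(seq)  # re-predict with the tokens committed so far
        masked = [i for i, tok in enumerate(seq) if tok == MASK]
        if not masked:
            break
        # Commit only the most confident fraction this step; the last step
        # (steps - step == 1) commits everything that is still masked.
        quota = max(1, len(masked) // (steps - step))
        for i in sorted(masked, key=lambda i: -proposals[i][1])[:quota]:
            seq[i] = proposals[i][0]
    # Safety net: fill any position that somehow stayed masked.
    return [proposals[i][0] if tok == MASK else tok for i, tok in enumerate(seq)]

def stub_predict(seq):
    """Deterministic stand-in for a model: fixed tokens, varied confidences."""
    vocab = ["the", "cat", "sat", "on", "mat"]
    return [(vocab[i % len(vocab)], (i * 7919) % 13) for i in range(len(seq))]
```

Because every position is revisited until it is committed, the loop reads (and can revise its view of) both left and right context at each step, which is the bidirectional property the autoregressive left-to-right decoder lacks.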

Features

  • Unified framework for training, inference, and evaluation of diffusion language models
  • Support for distributed training frameworks such as DeepSpeed and FSDP
  • Integration with modern machine learning libraries and training pipelines
  • Tools for experimenting with diffusion-based text generation architectures
  • Reproducible workflows for benchmarking diffusion language models
  • Example implementations demonstrating diffusion-based chat and text generation
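On the training side, masked-diffusion language models are commonly trained by corrupting a clean sequence (replacing a random subset of positions with a mask token) and computing the cross-entropy loss only over the masked positions. The sketch below shows that objective in plain Python under those assumptions; `mask_tokens` and `masked_cross_entropy` are illustrative helper names, not part of dLLM's API, and a real pipeline would of course use tensor operations rather than Python lists.

```python
import math
import random

def mask_tokens(tokens, mask_id, mask_prob, rng):
    """Corrupt a sequence for diffusion-style training: replace a random
    subset of positions with the mask token, remembering the originals."""
    noisy, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            noisy.append(mask_id)
            targets.append(tok)   # supervise only what was masked
        else:
            noisy.append(tok)
            targets.append(None)  # unmasked positions carry no loss
    return noisy, targets

def masked_cross_entropy(logits, targets):
    """Average negative log-likelihood over masked positions only.

    `logits` is one list of raw scores (over the vocabulary) per position.
    """
    losses = []
    for pos_logits, tgt in zip(logits, targets):
        if tgt is None:
            continue
        z = max(pos_logits)  # subtract the max for numerical stability
        log_norm = z + math.log(sum(math.exp(l - z) for l in pos_logits))
        losses.append(log_norm - pos_logits[tgt])
    return sum(losses) / max(1, len(losses))
```

With uniform logits over a vocabulary of size V, the loss per masked position is exactly log V, which is a handy sanity check when wiring up a real training loop.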

License

Apache License 2.0

Additional Project Details

Programming Language

TypeScript

Related Categories

TypeScript
Large Language Models (LLM)

Registered

2026-03-06