Chinese LLaMA-2 & Alpaca-2 large language models (Phase II project)
Chinese LLaMA & Alpaca large language models + local CPU/GPU training and deployment
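Merged checkpoints from this family of projects can be loaded like any other causal language model; below is a minimal sketch, assuming a locally merged checkpoint path (a placeholder) and the Hugging Face transformers API, that uses the GPU when available and otherwise falls back to the CPU.

```python
# Minimal sketch: loading a merged Chinese-Alpaca checkpoint for local inference.
# The model path is a placeholder; CPU vs. GPU is selected automatically.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/merged-chinese-alpaca"  # placeholder: a locally merged checkpoint
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

prompt = "请介绍一下大语言模型。"  # "Please introduce large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```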
Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon, etc.)
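As a rough illustration of what parameter-efficient fine-tuning of such models involves (this is not the framework's own CLI or configuration format), here is a sketch using the Hugging Face peft library; the base model name and hyperparameters are placeholders.

```python
# Minimal LoRA fine-tuning setup sketch using the Hugging Face peft library;
# the base model and hyperparameters are illustrative, not this framework's defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base = "meta-llama/Llama-2-7b-hf"  # placeholder base model
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                      # rank of the low-rank update matrices
    lora_alpha=16,            # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter weights are trainable
# ...then train with the usual transformers Trainer or a custom loop.
```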
Run any Llama 2 model locally with a Gradio UI on GPU or CPU from anywhere
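A minimal sketch of the same idea, assuming a llama-cpp-python backend with a placeholder GGUF checkpoint and Gradio's ChatInterface; setting n_gpu_layers=0 keeps inference on the CPU.

```python
# Minimal sketch of a local Llama 2 chat UI: llama-cpp-python backend + Gradio front end.
# The GGUF model path is a placeholder; n_gpu_layers=0 keeps everything on the CPU.
import gradio as gr
from llama_cpp import Llama

llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048, n_gpu_layers=0)

def chat(message, history):
    # history arrives as (user, assistant) pairs in Gradio's classic tuple-style chat format
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    reply = llm.create_chat_completion(messages=messages)
    return reply["choices"][0]["message"]["content"]

gr.ChatInterface(chat).launch()
```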
Train a 26M-parameter GPT from scratch in just 2h
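For a sense of scale, a back-of-the-envelope count shows how a GPT-style decoder lands in the tens-of-millions range; the dimensions below are illustrative assumptions, not the project's actual configuration, and the exact total depends on vocabulary size, MLP variant, and position encoding.

```python
# Back-of-the-envelope parameter count for a small GPT-style decoder.
# The dimensions below are illustrative assumptions, not this project's actual config.
def gpt_param_count(vocab_size, d_model, n_layers, n_ctx=512, tied_embeddings=True):
    embed = vocab_size * d_model + n_ctx * d_model        # token + learned position embeddings
    attn_per_layer = 4 * d_model * d_model                # Q, K, V and output projections
    mlp_per_layer = 2 * d_model * (4 * d_model)           # up- and down-projection, 4x expansion
    blocks = n_layers * (attn_per_layer + mlp_per_layer)
    head = 0 if tied_embeddings else vocab_size * d_model # output head reuses token embeddings if tied
    return embed + blocks + head

# e.g. an 8-layer, d_model=512 model with a 6.4k vocabulary lands near the quoted scale:
print(f"{gpt_param_count(6_400, 512, 8):,}")  # 28,704,768 weights (biases and norms excluded)
```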
Implementation of "Tree of Thoughts: Deliberate Problem Solving with Large Language Models"
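The core search is easy to sketch: expand each partial solution into several candidate thoughts, score them, and keep only the most promising ones per level. The breadth-first variant is shown below; propose_thoughts and score_thought are hypothetical stand-ins for the underlying LLM calls.

```python
# Sketch of the breadth-first search variant of Tree of Thoughts:
# at each depth, expand every partial solution into candidate "thoughts",
# score them with an evaluator, and keep only the best few.
# propose_thoughts() and score_thought() are hypothetical stand-ins for LLM calls.

def propose_thoughts(state: str, k: int) -> list[str]:
    """Ask the model for k candidate next steps given the partial solution."""
    raise NotImplementedError  # would call an LLM in a real implementation

def score_thought(state: str) -> float:
    """Ask the model (or a heuristic) how promising a partial solution looks."""
    raise NotImplementedError  # would call an LLM in a real implementation

def tree_of_thoughts_bfs(problem: str, depth: int = 3, branch: int = 5, beam: int = 3) -> str:
    frontier = [problem]                       # each state is the problem plus the thoughts so far
    for _ in range(depth):
        candidates = []
        for state in frontier:
            for thought in propose_thoughts(state, branch):
                candidates.append(state + "\n" + thought)
        # keep the `beam` highest-scoring partial solutions for the next level
        frontier = sorted(candidates, key=score_thought, reverse=True)[:beam]
    return max(frontier, key=score_thought)    # best complete line of reasoning
```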
Operating LLMs in production
An unofficial Python package that returns the response of Google Bard
An implementation of model-parallel GPT-2 and GPT-3-style models
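Model (tensor) parallelism splits a layer's weight matrix across devices so that each device stores and multiplies only a shard. The sketch below, which is not any particular library's API, demonstrates the arithmetic with a column-wise split on a single machine.

```python
# Illustration of the core idea behind model (tensor) parallelism:
# a linear layer's weight matrix is split column-wise into shards, each shard
# computes its slice of the output, and the slices are concatenated.
# Real implementations place each shard on a different device/process;
# here both shards live on the CPU purely to show the arithmetic is unchanged.
import torch

d_in, d_out, shards = 512, 2048, 2
x = torch.randn(4, d_in)                     # a batch of 4 activations
w = torch.randn(d_in, d_out)                 # full weight of one linear layer

w_shards = torch.chunk(w, shards, dim=1)     # column-wise split: each shard is (d_in, d_out/shards)
y_shards = [x @ w_s for w_s in w_shards]     # each "device" computes its slice independently
y_parallel = torch.cat(y_shards, dim=1)      # gather the slices back together

assert torch.allclose(y_parallel, x @ w, atol=1e-5)  # identical to the unsharded computation
```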