Audience

Developers interested in small language models

About TinyLlama

The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. With proper optimization, this can be achieved within a span of "just" 90 days using 16 A100-40G GPUs. TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, so it can be used as a drop-in replacement in many open-source projects built upon Llama. Moreover, at only 1.1B parameters, TinyLlama is compact enough to serve a multitude of applications that demand a restricted computation and memory footprint.
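To make the "1.1B parameters" figure concrete, here is a minimal sketch that estimates the parameter count from Llama-2-style hyperparameters. The values below are assumed from the TinyLlama repository's published configuration (hidden size 2048, 22 layers, 32 attention heads, 4 key/value heads, intermediate size 5632, vocabulary 32000); norm weights are ignored as negligible.

```python
# Hedged sketch: estimate TinyLlama's parameter count from its
# assumed Llama-2-style hyperparameters (per the TinyLlama repo).
hidden = 2048        # model (embedding) dimension
layers = 22          # number of transformer blocks
heads = 32           # query heads
kv_heads = 4         # key/value heads (grouped-query attention)
intermediate = 5632  # SwiGLU MLP inner dimension
vocab = 32000        # Llama 2 tokenizer vocabulary size

head_dim = hidden // heads
kv_dim = kv_heads * head_dim

# Per-layer attention: Q and O projections are hidden x hidden;
# K and V project hidden -> kv_dim under grouped-query attention.
attn = 2 * hidden * hidden + 2 * hidden * kv_dim
# SwiGLU MLP: gate and up (hidden -> intermediate), down (intermediate -> hidden).
mlp = 3 * hidden * intermediate

# Input embedding plus an untied output head.
embeddings = 2 * vocab * hidden

total = layers * (attn + mlp) + embeddings
print(f"~{total / 1e9:.2f}B parameters")  # ~1.10B parameters
```

The count lands at roughly 1.10 billion, consistent with the 1.1B figure in the description.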

Pricing

Starting Price:
Free
Pricing Details:
Open source
Free Version:
Free Version available.

Integrations

No integrations listed.

Company Information

TinyLlama
github.com/jzhang38/TinyLlama

Product Details

Platforms Supported
Windows
Mac
Linux
Training
Documentation
