With just 2 clicks (not including the Colab auth process), the 1.5B pretrained Chinese model demo is ready to go. The contents of this repository are intended for academic research purposes only, and we do not provide any conclusive remarks.

Features

  • Simplified GPT2 training scripts (based on Grover, with TPU support)
  • Ported BERT tokenizer, compatible with multilingual corpora
  • 1.5B GPT2 pretrained Chinese model (~15 GB corpus, 100k steps)
  • Batteries-included Colab demo (see the usage sketch after this list)
  • 1.5B GPT2 pretrained Chinese model (~30 GB corpus, 220k steps)
  • Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC)
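As a rough illustration of how a pretrained Chinese GPT2 checkpoint could be used for generation, here is a minimal sketch using Hugging Face Transformers. The model id "gpt2-ml-chinese" is a hypothetical placeholder, not something this project publishes; the released checkpoints are TensorFlow/Grover-based and would need conversion to this format first. The canonical way to try the model remains the Colab demo.

    # Minimal sketch (not the project's official API): text generation with a
    # ported Chinese GPT2 checkpoint via Hugging Face Transformers.
    from transformers import BertTokenizer, GPT2LMHeadModel

    MODEL_ID = "gpt2-ml-chinese"  # hypothetical identifier, not a published model id

    # The project uses a ported BERT-style tokenizer, so BertTokenizer is used
    # here instead of the byte-level GPT-2 tokenizer.
    tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
    model = GPT2LMHeadModel.from_pretrained(MODEL_ID)

    # Encode a Chinese prompt and sample a continuation.
    inputs = tokenizer("今天天气不错", return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_length=64,
        do_sample=True,
        top_k=40,
        temperature=0.9,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The sampling parameters above are illustrative only; any of the standard Transformers generation settings can be substituted.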

License

Apache License V2.0


Additional Project Details

  • Programming Language: Python
  • Related Categories: Python AI Text Generators, Python Generative AI
  • Registered: 2023-03-23