Our light reimplementation of BERT for Keras: a cleaner, lighter version of the Transformer model library, re-implemented by the author with the goal of combining Transformers and Keras in code that is as clean as possible. The project was started to make modification and customization convenient, so it may be updated frequently.

Features

  • Compatible with keras and tf.keras
  • Rich examples
  • Load the pre-trained weights of bert/roberta/albert for fine-tuning (see the sketch after this list)
  • Implement the attention masks required by language models and seq2seq (also sketched below)
  • Pre-training code from scratch
  • Support for TPU and multi-GPU pre-training; see pretraining
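
As a quick orientation to the bullets above, here is a minimal sketch of loading pre-trained weights and selecting the attention masks. It assumes the bert4keras entry points build_transformer_model and Tokenizer and its application argument; the checkpoint paths are placeholders, not fixed locations.

```python
import numpy as np

from bert4keras.models import build_transformer_model
from bert4keras.tokenizers import Tokenizer

# Placeholder paths (assumption): point these at a downloaded BERT checkpoint.
config_path = 'uncased_L-12_H-768_A-12/bert_config.json'
checkpoint_path = 'uncased_L-12_H-768_A-12/bert_model.ckpt'
dict_path = 'uncased_L-12_H-768_A-12/vocab.txt'

# Build the tokenizer and load the pre-trained weights into a Keras model.
tokenizer = Tokenizer(dict_path, do_lower_case=True)
model = build_transformer_model(config_path, checkpoint_path)

# Encode one sentence and extract features; the output can feed a
# task-specific head for fine-tuning.
token_ids, segment_ids = tokenizer.encode('language model')
features = model.predict([np.array([token_ids]), np.array([segment_ids])])
print(features.shape)  # (1, sequence_length, hidden_size)

# The attention masks are selected via the `application` argument:
# 'lm' applies a left-to-right (causal) mask so BERT works as a language
# model; 'unilm' applies the UniLM-style seq2seq mask (bidirectional over
# the first segment, causal over the second).
lm_model = build_transformer_model(config_path, checkpoint_path,
                                   application='lm')
seq2seq_model = build_transformer_model(config_path, checkpoint_path,
                                        application='unilm')
```

To the best of my understanding, the roberta and albert variants are selected the same way via the model argument of build_transformer_model (e.g. model='albert'); treat the exact argument values as assumptions to verify against the version in use.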

License

Apache License 2.0

Additional Project Details

Programming Language: Python
Related Categories: Python Large Language Models (LLM), Python Generative AI
Registered: 2023-03-24