GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)
GLM-130B is an open bilingual (English and Chinese) dense language model with 130 billion parameters, released by the Tsinghua KEG Lab and collaborators as part of the General Language Model (GLM) series. Its autoregressive blank-infilling objective supports both left-to-right generation and blank filling, making it versatile across NLP tasks, and it is engineered so that inference with all 130B parameters fits on a single A100 (40G × 8) or V100 (32G × 8) server. Trained on over 400 billion tokens (roughly 200B English and 200B Chinese), it outperforms GPT-3 175B, OPT-175B, and BLOOM-176B on a wide range of popular benchmarks.
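The two inference modes map directly onto prompt construction: a [MASK] token inside the context asks the model to fill a short blank, while a [gMASK] at the end of the context asks for open-ended left-to-right continuation. GLM-130B itself is served through the scripts in the official repository rather than a stock transformers class, but the smaller GLM checkpoints published on Hugging Face (e.g. THUDM/glm-10b) expose the same blank-filling interface. The sketch below follows that model card; the custom helpers (build_inputs_for_generation, eop_token_id) come from GLM's trust_remote_code implementation, not stock transformers, and are assumptions for the 130B setting.

```python
# A minimal blank-filling sketch, assuming a smaller GLM checkpoint
# (THUDM/glm-10b); GLM-130B uses the same [MASK]/[gMASK] prompt format
# via the official repo's scripts. build_inputs_for_generation and
# eop_token_id are GLM's custom additions loaded with trust_remote_code.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model = model.half().cuda().eval()

# Blank filling: the model predicts the span hidden behind [MASK].
prompt = "Tsinghua University is located in [MASK], China."
inputs = tokenizer(prompt, return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=64)
inputs = inputs.to("cuda")

outputs = model.generate(**inputs, max_length=64,
                         eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))

# For left-to-right generation, end the context with [gMASK] instead, e.g.
# "Who is the greatest artist? The greatest artist is [gMASK]"
```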