XLM (Cross-lingual Language Model) is a family of multilingual pretraining methods that align representations across languages to enable strong zero-shot transfer. It popularized Masked Language Modeling (MLM) across many languages and Translation Language Modeling (TLM), which trains jointly on parallel sentence pairs to tighten cross-lingual alignment. With a shared subword vocabulary, XLM learns language-agnostic features that transfer to classification tasks such as XNLI and to sequence-labeling tasks such as NER and POS tagging, without target-language supervision. The repository provides preprocessing pipelines, training code, and fine-tuning scripts so you can reproduce benchmark results or adapt models to your own multilingual corpora. Pretrained checkpoints cover dozens of languages and multiple model sizes, balancing quality and compute requirements.
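To make the MLM objective concrete, here is a minimal sketch of the BERT-style corruption it applies: roughly 15% of positions are selected as prediction targets, and of those, 80% are replaced with a mask token, 10% with a random token, and 10% left unchanged. This is an illustrative stand-in (`mlm_mask`, the token list, and the `<mask>` string are assumptions for the example); the actual codebase operates on BPE subword ids over batched tensors.

```python
import random

def mlm_mask(tokens, mask_token="<mask>", vocab=None, p=0.15, seed=0):
    """Illustrative MLM corruption: pick ~p of positions as targets;
    of those, 80% become mask_token, 10% a random vocab token,
    10% stay unchanged. Returns (corrupted tokens, {pos: original})."""
    rng = random.Random(seed)
    vocab = vocab or tokens          # fall back to the sentence itself
    corrupted, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < p:
            targets[i] = tok         # the model must predict this token
            r = rng.random()
            if r < 0.8:
                corrupted[i] = mask_token
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: leave the token in place
    return corrupted, targets
```

During pretraining, the loss is computed only at the target positions, which is why the function returns them alongside the corrupted stream.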

Features

  • Multilingual pretraining with MLM and translation-aware TLM objectives
  • Shared subword vocabulary for cross-language alignment
  • Strong zero-shot transfer on XNLI, NER, POS, and related tasks
  • End-to-end scripts for preprocessing, training, and fine-tuning
  • Pretrained models across many languages and sizes
  • Tools for evaluation and adaptation to custom multilingual data
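The TLM objective listed above extends MLM to parallel data: a sentence and its translation are concatenated into one stream, so a masked word in one language can be predicted by attending to its translation in the other. A minimal sketch of how such an input is assembled (following the paper's scheme of restarting position ids at the target sentence and tagging each half with a language id; the separator string and function name here are assumptions for illustration):

```python
def tlm_example(src_tokens, tgt_tokens, sep="</s>"):
    """Build a TLM input stream from a parallel sentence pair.
    Position ids restart at the target half, and language ids
    mark which language each token belongs to (0 = src, 1 = tgt)."""
    stream = [sep] + src_tokens + [sep] + tgt_tokens + [sep]
    positions = list(range(len(src_tokens) + 2)) \
              + list(range(len(tgt_tokens) + 1))   # reset for target half
    langs = [0] * (len(src_tokens) + 2) + [1] * (len(tgt_tokens) + 1)
    return stream, positions, langs
```

Masking is then applied over the whole stream exactly as in MLM, which is what encourages the encoder to align representations across the two languages.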

License

MIT License



Additional Project Details

Programming Language

Python

Related Categories

Python Natural Language Processing (NLP) Tool

Registered

2025-10-07