v5.1.2 - Sentence Transformers joins Hugging Face: model saving/loading improvements and loss compatibility

This patch release celebrates the transition of Sentence Transformers to Hugging Face and improves model saving, loading defaults, and loss compatibility.

Install this version with

:::bash
# Training + Inference
pip install sentence-transformers[train]==5.1.2

# Inference only, use one of:
pip install sentence-transformers==5.1.2
pip install sentence-transformers[onnx-gpu]==5.1.2
pip install sentence-transformers[onnx]==5.1.2
pip install sentence-transformers[openvino]==5.1.2
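
You can then verify the installed version, for example:

:::python
import sentence_transformers

print(sentence_transformers.__version__)  # should print 5.1.2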

Sentence Transformers is joining Hugging Face!

Today, Sentence Transformers is moving from the Ubiquitous Knowledge Processing (UKP) Lab at Technische Universität Darmstadt to Hugging Face. This formalizes the existing maintenance structure, as Tom Aarsen (that's me!) from Hugging Face has been maintaining the project for the past two years. The project's development roadmap, license, support, and commitment to the community remain unchanged. Read the full announcement for more details!


Minor changes re. saving and loading

  • Improve saving models with StaticEmbedding (#3524) and Dense (#3528) modules.
  • Fix training with CPU when "stronger" devices (CUDA, MPS) are available (#3525)
  • Default to 'xpu' device over 'cpu' if the former is available (#3537)
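
If you want to override the automatic device selection (for example, to force CPU even though CUDA, MPS, or XPU is available), you can pass `device` explicitly. A minimal sketch, assuming the `all-MiniLM-L6-v2` checkpoint as a stand-in model:

:::python
from sentence_transformers import SentenceTransformer

# Without `device`, an available accelerator (CUDA, MPS, or now XPU) is picked
# automatically before falling back to CPU; passing it pins the choice.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2", device="cpu")

embeddings = model.encode(["Sentence Transformers now lives at Hugging Face."])
print(embeddings.shape)  # (1, 384)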

Minor changes re. losses

  • Change errors/warnings for MatryoshkaLoss to prevent easy-to-make mistakes, e.g. forgetting to use the original dimension (#3530)
  • Introduce compatibility between MSELoss and MatryoshkaLoss (#3538), shown in the sketch below
  • Also use mini-batches for positives with MegaBatchMarginLoss (#3550)
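
To illustrate the newly supported combination, here is a minimal sketch of wrapping MSELoss in MatryoshkaLoss; the model and the chosen dimensions are illustrative placeholders, not taken from the release notes:

:::python
from sentence_transformers import SentenceTransformer, losses

# Student model whose embeddings are regressed onto teacher embeddings
# (MSELoss expects the training labels to be precomputed teacher embeddings).
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

base_loss = losses.MSELoss(model)

# MatryoshkaLoss re-applies the wrapped loss at several truncated embedding sizes,
# so the model stays useful when its 384-dimensional output is shortened.
loss = losses.MatryoshkaLoss(model, base_loss, matryoshka_dims=[384, 256, 128, 64])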

All Changes

New Contributors

Full Changelog: https://github.com/UKPLab/sentence-transformers/compare/v5.1.1...v5.1.2
