TorchRec is a PyTorch domain library that provides the common sparsity and parallelism primitives needed for large-scale recommender systems (RecSys). It allows authors to train models with large embedding tables sharded across many GPUs, and its parallelism primitives enable easy authoring of large, performant multi-device/multi-node models using hybrid data-parallelism/model-parallelism. The TorchRec sharder can shard embedding tables with different strategies, including data-parallel, table-wise, row-wise, table-wise-row-wise, and column-wise sharding, and the TorchRec planner can automatically generate optimized sharding plans for models. Pipelined training overlaps dataloading device transfer (copy to GPU), inter-device communications (input_dist), and computation (forward, backward) for increased performance. The library also ships optimized RecSys kernels powered by FBGEMM, quantization support for reduced precision training and inference, and common modules for RecSys.
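
The snippet below is a minimal, CPU-only sketch of the module-level API described above, using torchrec.EmbeddingBagConfig, torchrec.EmbeddingBagCollection, and KeyedJaggedTensor. The table names, sizes, and ids are made up for illustration and may need adjusting for your torchrec version.

    import torch
    import torchrec
    from torchrec.sparse.jagged_tensor import KeyedJaggedTensor

    # Two embedding tables, each backing one sparse feature (illustrative sizes).
    ebc = torchrec.EmbeddingBagCollection(
        device=torch.device("cpu"),
        tables=[
            torchrec.EmbeddingBagConfig(
                name="product_table",
                embedding_dim=64,
                num_embeddings=4096,
                feature_names=["product"],
                pooling=torchrec.PoolingType.SUM,
            ),
            torchrec.EmbeddingBagConfig(
                name="user_table",
                embedding_dim=64,
                num_embeddings=4096,
                feature_names=["user"],
                pooling=torchrec.PoolingType.SUM,
            ),
        ],
    )

    # Sparse input: a variable number of ids per feature per example,
    # expressed as a jagged tensor keyed by feature name (batch size 2).
    features = KeyedJaggedTensor.from_lengths_sync(
        keys=["product", "user"],
        values=torch.tensor([101, 202, 303, 404, 505]),
        lengths=torch.tensor([2, 1, 1, 1]),
    )

    pooled = ebc(features)                     # KeyedTensor of pooled embeddings
    print(pooled.to_dict()["product"].shape)   # torch.Size([2, 64])

In a real training job this collection would typically be wrapped in DistributedModelParallel so that each table is sharded across devices, as sketched further below.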

Features

  • Built to provide common sparsity & parallelism primitives needed for large-scale recommender systems
  • The TorchRec planner can automatically generate optimized sharding plans for models (see the sharding sketch after this list)
  • TorchRec requires Python >= 3.7 and CUDA >= 11.0
  • Experimental binaries for Linux (Python 3.7, 3.8, and 3.9) can be installed via pip wheels
  • TorchRec is BSD licensed
  • Quantization support for reduced precision training and inference
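
For the planner and sharder mentioned above, the following is a hedged sketch (not a drop-in recipe) of wrapping an EmbeddingBagCollection in DistributedModelParallel with an automatically generated plan. It assumes a multi-GPU host, a recent torchrec/fbgemm build, and launch via torchrun; the table configuration is illustrative.

    import os
    import torch
    import torch.distributed as dist
    import torchrec
    from torchrec.distributed.embeddingbag import EmbeddingBagCollectionSharder
    from torchrec.distributed.model_parallel import DistributedModelParallel
    from torchrec.distributed.planner import EmbeddingShardingPlanner, Topology

    # Expected to be launched as: torchrun --nproc_per_node=<num_gpus> train.py
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    device = torch.device(f"cuda:{local_rank}")
    torch.cuda.set_device(device)

    # Define the table on the meta device; DMP materializes only each rank's shard.
    ebc = torchrec.EmbeddingBagCollection(
        device=torch.device("meta"),
        tables=[
            torchrec.EmbeddingBagConfig(
                name="large_table",
                embedding_dim=128,
                num_embeddings=1_000_000,
                feature_names=["feature"],
            ),
        ],
    )

    # The planner proposes a sharding plan (table-wise, row-wise, column-wise, ...)
    # for the given topology; DistributedModelParallel then applies it.
    planner = EmbeddingShardingPlanner(
        topology=Topology(world_size=dist.get_world_size(), compute_device="cuda"),
    )
    plan = planner.collective_plan(
        ebc, [EmbeddingBagCollectionSharder()], dist.GroupMember.WORLD
    )
    model = DistributedModelParallel(module=ebc, device=device, plan=plan)
    print(model.plan)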

Categories

Machine Learning

License

BSD License

Additional Project Details

Programming Language

Python

Related Categories

Python Machine Learning Software

Registered

2022-08-19