MoCo is an open source PyTorch implementation developed by Facebook AI Research (FAIR) for the papers “Momentum Contrast for Unsupervised Visual Representation Learning” (He et al., 2019) and “Improved Baselines with Momentum Contrastive Learning” (Chen et al., 2020). It introduces Momentum Contrast (MoCo), a scalable approach to self-supervised learning that enables visual representation learning without labeled data. The core idea of MoCo is to maintain a dynamic dictionary as a queue of encoded samples, with keys produced by a momentum-updated encoder. This decouples the dictionary size from the mini-batch size and keeps the keys consistent across iterations, enabling efficient contrastive learning with a large pool of negatives. The repository includes implementations of both MoCo v1 and MoCo v2, the latter improving training stability and performance through an MLP projection head and stronger data augmentation. Training is optimized for distributed multi-GPU environments, using DistributedDataParallel for speed and simplicity.
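The momentum-updated key encoder described above can be sketched in a few lines of PyTorch. This is an illustrative toy (linear layers in place of ResNet-50); only the update rule and the momentum coefficient m = 0.999 come from the paper.

```python
import copy

import torch
import torch.nn as nn

# Toy encoders standing in for ResNet-50; sizes are illustrative.
encoder_q = nn.Linear(8, 4)           # query encoder, trained by backprop
encoder_k = copy.deepcopy(encoder_q)  # key encoder, momentum-updated only
for p in encoder_k.parameters():
    p.requires_grad = False           # no gradients flow to the key encoder

m = 0.999  # momentum coefficient from the paper

@torch.no_grad()
def momentum_update(encoder_q, encoder_k, m):
    # theta_k <- m * theta_k + (1 - m) * theta_q
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1.0 - m)

momentum_update(encoder_q, encoder_k, m)
```

Because m is close to 1, the key encoder evolves slowly, which is what keeps the queued keys consistent with each other.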

Features

  • PyTorch implementation of MoCo v1 and v2 for unsupervised learning
  • Momentum encoder mechanism for scalable contrastive representation learning
  • Supports distributed multi-GPU training via DistributedDataParallel
  • Pre-trained ResNet-50 models available for evaluation and transfer learning
  • Includes linear evaluation and object detection transfer examples
  • Minimal modification from official PyTorch ImageNet code for easy integration
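The contrastive objective behind these features can be sketched as an InfoNCE loss over one positive key and a queue of negatives. The shapes, queue size, and temperature below are illustrative placeholders, not the repository's ImageNet defaults.

```python
import torch
import torch.nn.functional as F

# Illustrative sizes; the repo uses dim=128 features but a much larger queue.
batch, dim, queue_size, tau = 16, 128, 1024, 0.07

q = F.normalize(torch.randn(batch, dim), dim=1)           # query features
k = F.normalize(torch.randn(batch, dim), dim=1)           # positive keys (momentum encoder)
queue = F.normalize(torch.randn(dim, queue_size), dim=0)  # negatives from past batches

l_pos = torch.einsum("nc,nc->n", q, k).unsqueeze(-1)  # (N, 1) positive logits
l_neg = torch.einsum("nc,ck->nk", q, queue)           # (N, K) negative logits
logits = torch.cat([l_pos, l_neg], dim=1) / tau       # temperature-scaled
labels = torch.zeros(batch, dtype=torch.long)         # positive is at index 0
loss = F.cross_entropy(logits, labels)

# Dequeue the oldest keys and enqueue the current batch of keys.
queue = torch.cat([queue[:, batch:], k.t()], dim=1)
```

Treating the positive as class 0 of a (K+1)-way softmax is what lets a standard cross-entropy loss serve as the contrastive objective.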


License

MIT License

Additional Project Details

Operating Systems

Mac

Programming Language

Python

Related Categories

Python Deep Learning Frameworks

Registered

2025-10-07