Torchreid is a library for deep-learning person re-identification, written in PyTorch and developed for our ICCV’19 project, Omni-Scale Feature Learning for Person Re-Identification.

A unified interface for training and testing a model is provided in "deep-person-reid/scripts/"; see "scripts/main.py" and "scripts/default_config.py" for details. The folder "configs/" contains predefined configs that can be used as starting points. The code automatically downloads (if needed) and loads the ImageNet pretrained weights. After training, the model is saved to the log directory, e.g. "log/osnet_x1_0_market1501_softmax_cosinelr/model.pth.tar-250", and the tensorboard file can be found in the same folder.

For the cross-domain setting, unlike the same-domain setting, random_erase is replaced with color_jitter; this can improve generalization performance on the unseen target dataset.
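
For reference, the same OSNet/Market-1501 workflow can also be driven through Torchreid's documented Python API instead of the config files. The following is a minimal sketch; the hyperparameters (batch size, learning rate, evaluation frequency) are illustrative and not copied from the predefined configs.

    import torchreid

    # Build train/test data loaders; Market-1501 is downloaded automatically if absent.
    datamanager = torchreid.data.ImageDataManager(
        root='reid-data',
        sources='market1501',
        targets='market1501',
        height=256,
        width=128,
        batch_size_train=64,
        batch_size_test=100,
        transforms=['random_flip', 'random_erase']
    )

    # OSNet x1.0; ImageNet pretrained weights are downloaded and loaded automatically.
    model = torchreid.models.build_model(
        name='osnet_x1_0',
        num_classes=datamanager.num_train_pids,
        loss='softmax',
        pretrained=True
    ).cuda()

    optimizer = torchreid.optim.build_optimizer(model, optim='amsgrad', lr=0.0015)
    scheduler = torchreid.optim.build_lr_scheduler(optimizer, lr_scheduler='cosine', max_epoch=250)

    # Softmax (classification) engine; checkpoints and the tensorboard file
    # are written under save_dir, as described above.
    engine = torchreid.engine.ImageSoftmaxEngine(
        datamanager, model, optimizer=optimizer, scheduler=scheduler, label_smooth=True
    )
    engine.run(
        save_dir='log/osnet_x1_0_market1501_softmax_cosinelr',
        max_epoch=250,
        eval_freq=10,
        print_freq=20
    )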

Features

  • Multi-GPU training
  • Support for both image- and video-reid
  • End-to-end training and evaluation
  • Incredibly easy preparation of reid datasets
  • Multi-dataset training
  • Cross-dataset evaluation (both shown in the sketch after this list)
  • Standard protocol used by most research papers
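
Multi-dataset training and cross-dataset evaluation are configured through the data manager's sources and targets arguments. Below is a minimal sketch that trains on two source datasets and evaluates on an unseen target, using the color_jitter transform mentioned above; the specific dataset combination is illustrative only.

    import torchreid

    # Cross-domain setup: train on two sources, evaluate on a different, unseen target.
    # Dataset keys are those registered in torchreid.data.
    datamanager = torchreid.data.ImageDataManager(
        root='reid-data',
        sources=['market1501', 'dukemtmcreid'],
        targets='msmt17',
        height=256,
        width=128,
        transforms=['random_flip', 'color_jitter']  # color_jitter instead of random_erase
    )

The rest of the training code (model, optimizer, engine) is the same as in the single-dataset sketch above; evaluation is then reported on the specified target dataset.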

License

MIT License

Additional Project Details

  • Operating Systems: Windows
  • Programming Language: Python
  • Related Categories: Python Machine Learning Software, Python Deep Learning Frameworks
  • Registered: 2022-08-04