Showing 2 open source projects for "training"

  • 1
    CycleGAN and pix2pix in PyTorch

    Image-to-Image Translation in PyTorch

    ...This repo gives developers and researchers a convenient, modern PyTorch platform to train and test these methods, supporting both paired datasets (input-to-output) and unpaired datasets (domain-to-domain) with minimal changes. The code provides standard training and inference pipelines and, as of recent updates, is compatible with current Python and PyTorch releases (e.g. Python 3.11, PyTorch 2.4) and supports distributed/multi-GPU training for scalable workflows. Because of its flexibility, it applies to many tasks: style transfer between domains (season changes, art-to-photo), mapping sketches/edges to real images, image colorization, day-to-night conversion, photo enhancement, and more. A minimal sketch of the cycle-consistency idea behind the unpaired setting follows this list.
    Downloads: 0 This Week
  • 2
    UnsupervisedMT

    Phrase-Based & Neural Unsupervised Machine Translation

    ...Beyond the core EMNLP 2018 setup, the codebase exposes optional capabilities such as multi-language training, language-model pretraining with shared parameters, and adversarial training. A sketch of the denoising objective at the heart of this recipe follows this list.
    Downloads: 3 This Week
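The key idea that lets the first project train on unpaired data is CycleGAN's cycle-consistency loss: translating a sample A to B and back to A should reconstruct the original. Below is a minimal, hypothetical PyTorch sketch of that loss; the single-layer convolutional "generators" are toy stand-ins for the repo's actual ResNet generators and PatchGAN discriminators, and the adversarial terms are omitted.

```python
import torch
import torch.nn as nn

# Toy stand-ins for the repo's ResNet generators (hypothetical, for
# illustration only): G_AB maps domain A -> B, G_BA maps B -> A.
G_AB = nn.Conv2d(3, 3, kernel_size=3, padding=1)
G_BA = nn.Conv2d(3, 3, kernel_size=3, padding=1)

l1 = nn.L1Loss()
real_A = torch.randn(1, 3, 256, 256)  # unpaired sample from domain A
real_B = torch.randn(1, 3, 256, 256)  # unpaired sample from domain B

# Forward cycle A -> B -> A and backward cycle B -> A -> B.
rec_A = G_BA(G_AB(real_A))
rec_B = G_AB(G_BA(real_B))

# Cycle-consistency loss: each reconstruction should match its input.
# This constraint is what makes learning possible without paired data;
# the full method adds GAN losses on G_AB(real_A) and G_BA(real_B).
cycle_loss = l1(rec_A, real_A) + l1(rec_B, real_B)
cycle_loss.backward()
```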
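For the second project, a central ingredient of the EMNLP 2018 recipe is a denoising objective: corrupt a monolingual sentence with word drops and local shuffles, then train a shared encoder-decoder to reconstruct the clean version; back-translation then substitutes model-generated translations for the synthetic noise. The sketch below shows only the noise step, with made-up parameter names and toy token ids rather than the repo's API.

```python
import random

def add_noise(tokens, p_drop=0.1, k_shuffle=3):
    """Corrupt a token sequence: drop words, then shuffle them locally.

    p_drop and k_shuffle are illustrative defaults, not the repo's settings.
    """
    # Drop each token independently; keep at least one token.
    kept = [t for t in tokens if random.random() > p_drop] or tokens[:1]
    # Local shuffle: perturb each index by up to k_shuffle and re-sort,
    # so no token moves more than k_shuffle positions.
    keys = [i + random.uniform(0, k_shuffle) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]

sentence = [5, 17, 42, 8, 23]  # toy token ids
noisy = add_noise(sentence)
# A real system would feed `noisy` to the encoder-decoder and minimize
# cross-entropy against the clean `sentence`.
print(noisy)
```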