Showing 89 open source projects for "model train design"

  • 1
    OpenVINO Training Extensions

    Trainable models and NN optimization tools

    OpenVINO™ Training Extensions provide a convenient environment to train Deep Learning models and convert them using the OpenVINO™ toolkit for optimized inference. When ote_cli is installed in the virtual environment, you can use the ote command-line interface to perform actions on templates for the chosen task type, such as running, training, evaluating, and exporting. ote train trains a model (a particular model template) on a dataset and saves the results in two files. ote optimize optimizes a pre-trained model using NNCF or POT, depending on the model format. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 2
    SageMaker Training Toolkit

    Train machine learning models within Docker containers

    Train machine learning models within a Docker container using Amazon SageMaker. Amazon SageMaker is a fully managed service for data science and machine learning (ML) workflows. You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models. To train a model, you can include your training script and dependencies in a Docker container that runs your training code.
    Downloads: 0 This Week
    Last Update:
    See Project
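
To make the "script in a container" flow above concrete, here is a minimal sketch of a training entry point the toolkit could invoke inside the container. It relies on the conventional SM_* environment variables SageMaker injects (SM_MODEL_DIR, SM_CHANNEL_TRAIN, SM_HPS); the CSV layout, the scikit-learn model, and the hyperparameter name C are illustrative placeholders, not part of the toolkit.

```python
# train.py - sketch of a script run inside a SageMaker training container.
# The SM_* environment variables are the ones SageMaker conventionally sets;
# the CSV filename, model choice, and hyperparameter are illustrative assumptions.
import json
import os
import pickle

import pandas as pd
from sklearn.linear_model import LogisticRegression


def main():
    # Directories and hyperparameters provided by the container environment.
    model_dir = os.environ.get("SM_MODEL_DIR", "/opt/ml/model")
    train_dir = os.environ.get("SM_CHANNEL_TRAIN", "/opt/ml/input/data/train")
    hyperparams = json.loads(os.environ.get("SM_HPS", "{}"))

    # Hypothetical dataset layout: a CSV with a "label" column.
    df = pd.read_csv(os.path.join(train_dir, "train.csv"))
    X, y = df.drop(columns=["label"]), df["label"]

    model = LogisticRegression(C=float(hyperparams.get("C", 1.0)))
    model.fit(X, y)

    # Anything written to model_dir is captured as the training artifact.
    with open(os.path.join(model_dir, "model.pkl"), "wb") as f:
        pickle.dump(model, f)


if __name__ == "__main__":
    main()
```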
  • 3
    Denoising Diffusion Probabilistic Model

    Implementation of Denoising Diffusion Probabilistic Model in Pytorch

    Implementation of Denoising Diffusion Probabilistic Model in Pytorch. It is a new approach to generative modeling with the potential to rival GANs. It uses denoising score matching to estimate the gradient of the data distribution, followed by Langevin sampling to sample from the true distribution. If you simply want to pass in a folder name and the desired image dimensions, you can use the Trainer class to easily train a model, as in the sketch below.
    Downloads: 0 This Week
    Last Update:
    See Project
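
A minimal training sketch based on the Trainer workflow mentioned above, assuming the package's Unet, GaussianDiffusion, and Trainer classes as documented in its README; the image folder path and every hyperparameter below are placeholders.

```python
# Train on a folder of images, then sample from the learned distribution.
from denoising_diffusion_pytorch import Unet, GaussianDiffusion, Trainer

model = Unet(dim=64, dim_mults=(1, 2, 4, 8))               # denoising backbone
diffusion = GaussianDiffusion(model, image_size=128, timesteps=1000)

trainer = Trainer(
    diffusion,
    "path/to/your/images",        # folder of training images (placeholder)
    train_batch_size=32,
    train_lr=8e-5,
    train_num_steps=100_000,
    gradient_accumulate_every=2,
    ema_decay=0.995,
)
trainer.train()

# After training, draw new images.
samples = diffusion.sample(batch_size=4)
```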
  • 4
    Determined

    Determined, a deep learning training platform

    ...Deploy your model using Determined's built-in model registry. Easily share on-premise or cloud GPUs with your team. Determined’s cluster scheduling offers first-class support for deep learning and seamless spot instance support. Check out examples of how you can use Determined to train popular deep learning models at scale.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 5
    Autodistill

    Images to inference with no labeling

    Autodistill uses big, slower foundation models to train small, faster supervised models. Using autodistill, you can go from unlabeled images to inference on a custom model running at the edge with no human intervention in between. You can use Autodistill on your own hardware, or use the Roboflow hosted version of Autodistill to label images in the cloud.
    Downloads: 0 This Week
    Last Update:
    See Project
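
A rough sketch of the "label with a foundation model, then train a small model" flow described above. The plugin packages (autodistill_grounded_sam, autodistill_yolov8), the exact parameter names, and the prompt and paths are assumptions about Autodistill's plugin ecosystem rather than verified API details.

```python
# Sketch of the label-then-distill flow; package names, parameters, prompts
# and paths below are illustrative assumptions.
from autodistill.detection import CaptionOntology
from autodistill_grounded_sam import GroundedSAM
from autodistill_yolov8 import YOLOv8

# 1. A large foundation model labels raw images using a text-prompt ontology.
base_model = GroundedSAM(
    ontology=CaptionOntology({"a shipping container": "container"})
)
base_model.label(input_folder="./images", output_folder="./dataset")

# 2. A small supervised model is trained on the auto-labeled dataset.
target_model = YOLOv8("yolov8n.pt")
target_model.train("./dataset/data.yaml", epochs=50)
```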
  • 6
    GluonTS

    Probabilistic time series modeling in Python

    GluonTS is a Python package for probabilistic time series modeling, focusing on deep learning-based models. GluonTS requires Python 3.6 or newer, and the easiest way to install it is via pip. We train a DeepAR model and make predictions using the simple "airpassengers" dataset. The dataset consists of a single time series containing monthly international passenger counts between 1949 and 1960, a total of 144 values (12 years * 12 months). We split the dataset into train and test parts by removing the last three years (36 months) from the training data. ...
    Downloads: 0 This Week
    Last Update:
    See Project
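
A sketch of the airpassengers example summarized above, using GluonTS's PyTorch DeepAREstimator. The CSV path and target column name are placeholders; the 36-month offset mirrors the train/test split described in the text.

```python
# Train DeepAR on a single monthly series and hold out the last 36 months.
import pandas as pd
from gluonts.dataset.pandas import PandasDataset
from gluonts.dataset.split import split
from gluonts.torch import DeepAREstimator

# Placeholder file: 144 monthly passenger counts indexed by date.
df = pd.read_csv("airpassengers.csv", index_col=0, parse_dates=True)
dataset = PandasDataset(df, target="passengers")

# Keep the last three years (36 months) out of the training data.
training_data, test_gen = split(dataset, offset=-36)

predictor = DeepAREstimator(
    prediction_length=12, freq="M", trainer_kwargs={"max_epochs": 5}
).train(training_data)

forecasts = list(predictor.predict(training_data))
```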
  • 7
    Colossal-AI

    Making large AI models cheaper, faster and more accessible

    The Transformer architecture has improved the performance of deep learning models in domains such as Computer Vision and Natural Language Processing. Together with better performance come larger model sizes, which push against the memory limits of current accelerator hardware such as GPUs. It is rarely practical to train large models such as Vision Transformer, BERT, and GPT on a single GPU or a single machine, so there is an urgent demand to train models in a distributed environment. However, distributed training, especially model parallelism, often requires domain expertise in computer systems and architecture. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8
    Hivemind

    Decentralized deep learning in PyTorch. Built to train models

    ...Decentralized parameter averaging: iteratively aggregate updates from multiple workers without the need to synchronize across the entire network. Train neural networks of arbitrary size: parts of their layers are distributed across the participants with the Decentralized Mixture-of-Experts. If you have successfully trained a model or created a downstream repository with the help of our library, feel free to submit a pull request that adds your project to the list.
    Downloads: 0 This Week
    Last Update:
    See Project
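
A hedged sketch of what plugging Hivemind into an ordinary PyTorch loop can look like: a DHT node for peer discovery plus hivemind.Optimizer wrapping a regular optimizer, following the project's quickstart as far as recalled here. The run_id, batch sizes, and toy model are placeholders.

```python
# Decentralized averaging wrapped around a normal PyTorch training step.
import torch
import hivemind

model = torch.nn.Linear(16, 2)
base_opt = torch.optim.SGD(model.parameters(), lr=0.01)

# Start (or join) a DHT that peers use to find each other.
dht = hivemind.DHT(start=True)

opt = hivemind.Optimizer(
    dht=dht,
    run_id="demo_run",          # peers sharing this run_id train together
    optimizer=base_opt,
    batch_size_per_step=32,     # samples processed locally per step
    target_batch_size=4096,     # global batch per collaborative update
    use_local_updates=True,
)

for _ in range(10):
    x, y = torch.randn(32, 16), torch.randint(0, 2, (32,))
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()
    opt.zero_grad()
```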
  • 9
    Tokenizers

    Fast State-of-the-Art Tokenizers optimized for Research and Production

    ...Even with destructive normalization, it’s always possible to get the part of the original sentence that corresponds to any token. Does all the pre-processing: truncation, padding, and adding the special tokens your model needs.
    Downloads: 1 This Week
    Last Update:
    See Project
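
A small sketch of the pre-processing features mentioned above (truncation, padding, special tokens) and of recovering original-text spans via offsets; the bert-base-uncased checkpoint is just an example.

```python
# Load a pretrained tokenizer and configure truncation/padding.
from tokenizers import Tokenizer

tok = Tokenizer.from_pretrained("bert-base-uncased")
tok.enable_truncation(max_length=16)
tok.enable_padding(pad_id=0, pad_token="[PAD]")

enc = tok.encode("Tokenizers keep track of the original text.")
print(enc.tokens)    # wordpieces plus special tokens like [CLS] / [SEP]
print(enc.offsets)   # (start, end) spans into the original sentence per token
```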
  • 10
    Transformers

    State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX

    Transformers provides APIs and tools to easily download and train state-of-the-art pre-trained models. Using pre-trained models can reduce your compute costs, carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities. Text, for tasks like text classification, information extraction, question answering, summarization, translation, text generation, in over 100 languages. ...
    Downloads: 6 This Week
    Last Update:
    See Project
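
As a minimal illustration of the pre-trained-model workflow above, the high-level pipeline API handles model download, tokenization, and inference in a couple of lines; the task and example text are arbitrary.

```python
# One of the text tasks mentioned above, using the library's default
# pretrained model for the task.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Pre-trained models save a lot of compute."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```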
  • 11
    deepfakes_faceswap

    Deepfakes Software For All

    Faceswap is the leading free and open source multi-platform deepfakes software. When faceswapping was first developed and published, the technology was groundbreaking; it was a huge step in AI development. It was also completely ignored outside of academia because the code was confusing and fragmentary. It required a thorough understanding of complicated AI techniques and took a lot of effort to figure out, until one individual brought it together into a single, cohesive collection.
    Downloads: 3 This Week
    Last Update:
    See Project
  • 12
    AIMET

    AIMET is a library that provides advanced quantization and compression

    Qualcomm Innovation Center (QuIC) is at the forefront of enabling low-power inference at the edge through its pioneering model-efficiency research. QuIC has a mission to help migrate the ecosystem toward fixed-point inference. With this goal, QuIC presents the AI Model Efficiency Toolkit (AIMET) - a library that provides advanced quantization and compression techniques for trained neural network models. AIMET enables neural networks to run more efficiently on fixed-point AI hardware...
    Downloads: 1 This Week
    Last Update:
    See Project
  • 13
    PyG

    Graph Neural Network Library for PyTorch

    ...All it takes is 10-20 lines of code to get started with training a GNN model, as in the sketch below.
    Downloads: 0 This Week
    Last Update:
    See Project
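
A sketch of the "10-20 lines" claim referenced above: a two-layer GCN trained on the Cora citation graph. The dataset root, layer sizes, and training schedule are illustrative choices.

```python
# Two-layer GCN for node classification on Cora.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

dataset = Planetoid(root="/tmp/Cora", name="Cora")
data = dataset[0]


class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, 16)
        self.conv2 = GCNConv(16, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)


model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for _ in range(200):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```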
  • 14
    Imagen - Pytorch

    Implementation of Imagen, Google's Text-to-Image Neural Network

    ...It is the new SOTA for text-to-image synthesis. Architecturally, it is actually much simpler than DALL-E2. It consists of a cascading DDPM conditioned on text embeddings from a large pre-trained T5 model (attention network). It also contains dynamic clipping for improved classifier-free guidance, noise level conditioning, and a memory-efficient unet design. It appears that neither CLIP nor a prior network is needed after all. And so research continues. For simpler training, you can directly supply text strings instead of precomputing text encodings. ...
    Downloads: 0 This Week
    Last Update:
    See Project
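
A heavily simplified, hedged sketch of training with raw text strings (no precomputed T5 encodings), loosely following the package's Unet/Imagen API as described in its README; the single-unet cascade, all sizes, and the toy batch are assumptions for illustration only.

```python
# Supply raw captions directly; text is encoded with T5 on the fly.
import torch
from imagen_pytorch import Unet, Imagen

unet = Unet(
    dim=32,
    cond_dim=128,
    dim_mults=(1, 2, 4),
    num_resnet_blocks=1,
    layer_attns=(False, True, True),
    layer_cross_attns=(False, True, True),
)

imagen = Imagen(
    unets=(unet,),
    image_sizes=(64,),
    timesteps=100,
    cond_drop_prob=0.1,   # enables classifier-free guidance at sampling time
)

images = torch.randn(4, 3, 64, 64)                       # stand-in training batch
texts = ["a red train", "a blue bridge",
         "a small house", "a mountain lake"]             # raw captions, no precomputed embeddings

loss = imagen(images, texts=texts, unet_number=1)
loss.backward()

# After (much) more training, sample with classifier-free guidance.
samples = imagen.sample(texts=["a steam train crossing a bridge"], cond_scale=3.0)
```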
  • 15
    Kubeflow

    Machine Learning Toolkit for Kubernetes

    Kubeflow is an open source Cloud Native machine learning platform based on Google’s internal machine learning pipelines. It seeks to make deployments of machine learning workflows on Kubernetes simple, portable and scalable. With Kubeflow you can deploy best-of-breed open-source systems for ML to diverse infrastructures. You can also take advantage of a number of great features, such as services for managing Jupyter notebooks and support for a TensorFlow Serving container. Wherever you...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 16
    Materials Discovery: GNoME

    AI discovers 520000 stable inorganic crystal structures for research

    ...Using GNoME, DeepMind identified 381,000 new stable materials, later expanding the dataset to include over 520,000 materials within 1 meV/atom of the convex hull as of August 2024. The repository provides datasets, model definitions, and interactive Colabs for exploring these materials, computing decomposition energies, and visualizing chemical families. Additionally, it includes JAX-based implementations of GNoME and Nequip—the latter being used to train interatomic potentials for dynamic simulations.
    Downloads: 6 This Week
    Last Update:
    See Project
  • 17
    BudouX

    Standalone, small, language-neutral

    ...It works with no dependency on third-party word segmenters such as the Google Cloud Natural Language API. It is small: it takes only around 15 KB, including its machine learning model, so it is reasonable to use even on the client side. It is language-neutral: you can train a model for any language by feeding a dataset to BudouX’s training script.
    Downloads: 0 This Week
    Last Update:
    See Project
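
A client-side usage sketch with the bundled Japanese model; a custom model trained for another language via BudouX's training script would be loaded and used the same way.

```python
# Segment a sentence into phrase-boundary-safe chunks with the default model.
import budoux

parser = budoux.load_default_japanese_parser()
print(parser.parse("今日は良い天気ですね。"))   # list of chunks safe to break lines at
```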
  • 18
    TextAttack

    Python framework for adversarial attacks, and data augmentation

    Generating adversarial examples for NLP models. TextAttack is a Python framework for adversarial attacks, data augmentation, and model training in NLP.
    Downloads: 0 This Week
    Last Update:
    See Project
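
A hedged sketch of running a built-in attack recipe against a Hugging Face classifier; the checkpoint, dataset, and recipe are examples, and the wrapper/recipe class names follow the project's documented API as best recalled here (treat them as assumptions).

```python
# Generate adversarial examples for a sentiment classifier with TextFooler.
import transformers
import textattack
from textattack.attack_recipes import TextFoolerJin2019

checkpoint = "textattack/bert-base-uncased-imdb"   # example fine-tuned model
model = transformers.AutoModelForSequenceClassification.from_pretrained(checkpoint)
tokenizer = transformers.AutoTokenizer.from_pretrained(checkpoint)
model_wrapper = textattack.models.wrappers.HuggingFaceModelWrapper(model, tokenizer)

attack = TextFoolerJin2019.build(model_wrapper)
dataset = textattack.datasets.HuggingFaceDataset("imdb", split="test")

attacker = textattack.Attacker(attack, dataset,
                               textattack.AttackArgs(num_examples=10))
attacker.attack_dataset()
```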
  • 19
    InterpretML

    Fit interpretable models. Explain blackbox machine learning

    In the beginning, machines learned in darkness, and data scientists struggled in the void to explain them. InterpretML is an open-source package that incorporates state-of-the-art machine-learning interpretability techniques under one roof. With this package, you can train interpretable glass box models and explain black box systems. InterpretML helps you understand your model's global behavior, or understand the reasons behind individual predictions.
    Downloads: 2 This Week
    Last Update:
    See Project
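
A short sketch of the glassbox workflow described above: fit an Explainable Boosting Machine and inspect its global behavior and individual predictions. The synthetic dataset is a placeholder.

```python
# Train a glassbox EBM and view global and local explanations.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from interpret.glassbox import ExplainableBoostingClassifier
from interpret import show

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ebm = ExplainableBoostingClassifier()
ebm.fit(X_train, y_train)

show(ebm.explain_global())                          # per-feature shape functions
show(ebm.explain_local(X_test[:5], y_test[:5]))     # reasons behind individual predictions
```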
  • 20
    Raster Vision

    Open source framework for deep learning satellite and aerial imagery

    Raster Vision is an open source framework for Python developers building computer vision models on satellite, aerial, and other large imagery sets (including oblique drone imagery). There is built-in support for chip classification, object detection, and semantic segmentation using PyTorch. Raster Vision allows engineers to quickly and repeatably configure pipelines that go through core components of a machine learning workflow: analyzing training data, creating training chips, training...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 21

    LightGBM

    Gradient boosting framework based on decision tree algorithms

    LightGBM or Light Gradient Boosting Machine is a high-performance, open source gradient boosting framework based on decision tree algorithms. Compared to other boosting frameworks, LightGBM offers several advantages in terms of speed, efficiency and accuracy. Parallel experiments have shown that LightGBM can attain linear speed-up through multiple machines for training in specific settings, all while consuming less memory. LightGBM supports parallel and GPU learning, and can handle...
    Downloads: 1 This Week
    Last Update:
    See Project
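
A minimal sketch of LightGBM's core Python training API on a synthetic binary-classification problem; the parameters shown are illustrative, not tuned.

```python
# Build Dataset objects, train a booster, and predict probabilities.
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

train_set = lgb.Dataset(X_train, label=y_train)
valid_set = lgb.Dataset(X_valid, label=y_valid, reference=train_set)

params = {"objective": "binary", "metric": "auc", "learning_rate": 0.05}
booster = lgb.train(params, train_set, num_boost_round=200, valid_sets=[valid_set])

preds = booster.predict(X_valid)   # probabilities for the positive class
```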
  • 22
    CML

    Continuous Machine Learning | CI/CD for ML

    Continuous Machine Learning (CML) is an open-source CLI tool for implementing continuous integration & delivery (CI/CD) with a focus on MLOps. Use it to automate development workflows, including machine provisioning, model training and evaluation, comparing ML experiments across project history, and monitoring changing datasets. CML can help train and evaluate models, and then generate a visual report with results and metrics, automatically on every pull request.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 23
    Opacus

    Training PyTorch models with differential privacy

    Opacus is a library that enables training PyTorch models with differential privacy. It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to track, online, the privacy budget expended at any given moment. Vectorized per-sample gradient computation that is 10x faster than microbatching. Supports most types of PyTorch models and can be used with minimal modification to the original neural network. Open source,...
    Downloads: 0 This Week
    Last Update:
    See Project
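
A sketch of attaching Opacus to an existing model, optimizer, and data loader via PrivacyEngine.make_private; the toy model, data, noise multiplier, and clipping norm are placeholders.

```python
# Make a standard PyTorch training loop differentially private.
import torch
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data = TensorDataset(torch.randn(512, 10), torch.randint(0, 2, (512,)))
loader = DataLoader(data, batch_size=64)

privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.1,   # Gaussian noise added to clipped gradients
    max_grad_norm=1.0,      # per-sample gradient clipping threshold
)

for x, y in loader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()

print("epsilon spent:", privacy_engine.get_epsilon(delta=1e-5))
```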
  • 24
    Metarank

    A low code Machine Learning service that personalizes articles

    Metarank is a service that can personalize any type of content: product listings, articles, recommendations, and search results in 3 easy steps with a few lines of code. It’s often considered "too risky" to spend 6+ months on an in-house moonshot project to reinvent the wheel without an experienced team or existing open-source tools. Metarank makes personalization easy not only for Amazon but for everyone else. Ingest historical item listings, clicks and item metadata so Metarank...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 25
    mlforecast

    Scalable machine learning for time series forecasting

    ...Instead of writing custom code to build lagged features, rolling statistics, and date-based predictors, mlforecast generates those automatically based on a simple configuration. It supports multi-series forecasting, meaning you can train one model that forecasts many time series at once (common in retail, demand forecasting, etc.), rather than one model per series. The library is built to scale: behind the scenes, it can leverage distributed computing frameworks (Spark, Dask, Ray) when datasets or the number of series grow large.
    Downloads: 0 This Week
    Last Update:
    See Project
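
A sketch of the declarative setup described above: lag and date features are configured once and a single model is fit across multiple series. Column names follow mlforecast's defaults (unique_id, ds, y); the LightGBM regressor and the toy monthly data are illustrative.

```python
# Declare features once, fit one model over many series, forecast ahead.
import numpy as np
import pandas as pd
import lightgbm as lgb
from mlforecast import MLForecast

# Two toy monthly series in long format (unique_id, ds, y).
dates = pd.date_range("2020-01-01", periods=36, freq="MS")
df = pd.concat([
    pd.DataFrame({"unique_id": sid, "ds": dates, "y": np.random.rand(36) + i})
    for i, sid in enumerate(["store_1", "store_2"])
])

fcst = MLForecast(
    models=[lgb.LGBMRegressor()],
    freq="MS",
    lags=[1, 12],                 # lagged target features
    date_features=["month"],      # calendar-based features
)
fcst.fit(df)
preds = fcst.predict(12)          # 12-step-ahead forecasts for every series
```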