Showing 309 open source projects for "train"

  • 1
    DeepH-pack

    Deep neural networks for density functional theory Hamiltonian

    DeepH-pack is the official implementation of the DeepH (Deep Hamiltonian) method described in the paper "Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation" and in the accompanying Research Briefing. DeepH-pack supports DFT results produced by ABACUS, OpenMX, FHI-aims, or SIESTA, and will support HONPAS.
  • 2
    Prime QA

    State-of-the-art Multilingual Question Answering research

    PrimeQA is a public open source repository that enables researchers and developers to train state-of-the-art models for question answering (QA). Using PrimeQA, a researcher can replicate the experiments outlined in a paper published at a recent NLP conference, and can also download pre-trained models (from an online repository) and run them on their own custom data. PrimeQA is built on top of the Transformers toolkit and uses datasets and models that are directly...
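
    PrimeQA's own training recipes are not excerpted here; as a hedged illustration of the kind of extractive QA workflow it builds on top of the Transformers toolkit, the sketch below uses the Hugging Face pipeline API directly (the reader checkpoint is an assumption, not a PrimeQA artifact):

        from transformers import pipeline

        # Extractive question answering with a pre-trained reader checkpoint.
        qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
        result = qa(
            question="What does PrimeQA focus on?",
            context="PrimeQA enables state-of-the-art multilingual question answering research.",
        )
        print(result["answer"], result["score"])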
  • 3
    PyBoy

    Game Boy emulator written in Python

    It is highly recommended to read the report for a light introduction to Game Boy emulation, but be aware that the Python implementation has changed a lot. The report is still relevant even if you want to contribute to another emulator or create your own. If you are looking to make a bot or AI, you can find all the external components in the PyBoy Documentation. There is also a short example on our Wiki page Scripts, AI and Bots as well as in the examples directory. If more features are...
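
    A minimal scripting sketch, assuming PyBoy's Python API (a constructor that takes a ROM path, a tick() method that advances one frame, and stop()); exact method names vary between PyBoy versions and the ROM path is a placeholder:

        from pyboy import PyBoy

        pyboy = PyBoy("roms/game.gb")   # placeholder ROM path
        for _ in range(600):            # roughly ten seconds at 60 frames per second
            pyboy.tick()                # advance the emulator by one frame
        pyboy.stop()                    # shut the emulator down cleanly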
  • 4
    Denoising Diffusion Probabilistic Model

    Implementation of Denoising Diffusion Probabilistic Model in PyTorch

    Implementation of the Denoising Diffusion Probabilistic Model in PyTorch. It is a new approach to generative modeling that may have the potential to rival GANs. It uses denoising score matching to estimate the gradient of the data distribution, followed by Langevin sampling to sample from the true distribution. If you simply want to pass in a folder name and the desired image dimensions, you can use the Trainer class to easily train a model, as sketched below.
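
    A training sketch using the Unet, GaussianDiffusion, and Trainer classes; the argument names and values are recalled from the project README and may differ between versions, and the image folder path is a placeholder:

        from denoising_diffusion_pytorch import Unet, GaussianDiffusion, Trainer

        model = Unet(dim=64, dim_mults=(1, 2, 4, 8))                  # U-Net denoiser backbone
        diffusion = GaussianDiffusion(model, image_size=128, timesteps=1000)
        trainer = Trainer(
            diffusion,
            "path/to/your/images",     # placeholder folder of training images
            train_batch_size=32,
            train_lr=8e-5,
            train_num_steps=700000,
        )
        trainer.train()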
  • 5
    Kubeflow

    Machine Learning Toolkit for Kubernetes

    Kubeflow is an open source Cloud Native machine learning platform based on Google’s internal machine learning pipelines. It seeks to make deployments of machine learning workflows on Kubernetes simple, portable and scalable. With Kubeflow you can deploy best-of-breed open-source systems for ML to diverse infrastructures. You can also take advantage of a number of great features, such as services for managing Jupyter notebooks and support for a TensorFlow Serving container. Wherever you...
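
    Kubeflow itself is deployed on a cluster rather than imported, but a short sketch of the Kubeflow Pipelines SDK shows what defining a workflow in Python looks like. This assumes the kfp v1-style API (dsl.ContainerOp was removed in kfp v2); the image, command, and file name are placeholders:

        import kfp
        from kfp import dsl

        @dsl.pipeline(name="train-demo", description="One-step training pipeline")
        def train_pipeline():
            dsl.ContainerOp(
                name="train",
                image="python:3.10",                                   # placeholder container image
                command=["python", "-c", "print('training step')"],   # placeholder training step
            )

        # Compile to a workflow spec that can be uploaded to a Kubeflow Pipelines instance.
        kfp.compiler.Compiler().compile(train_pipeline, "train_pipeline.yaml")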
  • 6
    DeepSpeed

    Deep learning optimization library: makes distributed training easy

    DeepSpeed is an easy-to-use deep learning optimization software suite that enables unprecedented scale and speed for deep learning training and inference. With DeepSpeed you can: 1) train or run inference on dense or sparse models with billions or trillions of parameters; 2) achieve excellent system throughput and efficiently scale to thousands of GPUs; 3) train or run inference on resource-constrained GPU systems; 4) achieve unprecedented low latency and high throughput for inference; 5) achieve extreme...
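
    A minimal sketch of wrapping a PyTorch model with deepspeed.initialize; the config values are illustrative assumptions (older releases pass the config as a JSON file), and in practice the script is launched with the deepspeed launcher across one or more GPUs:

        import torch
        import deepspeed

        model = torch.nn.Linear(512, 10)   # stand-in for a real network
        ds_config = {
            "train_batch_size": 32,
            "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
            "zero_optimization": {"stage": 2},   # ZeRO stage 2 optimizer-state partitioning
        }
        model_engine, optimizer, _, _ = deepspeed.initialize(
            model=model,
            model_parameters=model.parameters(),
            config=ds_config,
        )
        # In the training loop, model_engine(inputs), model_engine.backward(loss),
        # and model_engine.step() replace the usual forward/backward/optimizer calls.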
  • 7
    onepoint

    Assistant tool that integrates coding, writing, and reading functions

    Onepoint is an open-source AI assistant based on Electron, designed to be the ultimate desktop productivity tool. Its initial goal was to develop a smart floating window, similar to Apple's intelligent assistant, that takes up no desktop space or system resources and can be quickly accessed through global hotkeys for user convenience. With ChatGPT technology, users can continuously train onepoint to generate and reconstruct content with greater accuracy (on point), thereby improving...
  • 8
    Transformer Engine

    A library for accelerating Transformer models on NVIDIA GPUs

    ... that can be integrated with other deep-learning libraries to enable FP8 support for Transformers. As the number of parameters in Transformer models continues to grow, training and inference for architectures such as BERT, GPT, and T5 become very memory and compute-intensive. Most deep learning frameworks train with FP32 by default. This is not essential, however, to achieve full accuracy for many deep learning models.
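
    A minimal sketch of running a single layer in FP8 with Transformer Engine's PyTorch API, assuming an NVIDIA GPU with FP8 support; the layer sizes and scaling-recipe settings are illustrative:

        import torch
        import transformer_engine.pytorch as te
        from transformer_engine.common import recipe

        fp8_recipe = recipe.DelayedScaling(margin=0, fp8_format=recipe.Format.E4M3)
        layer = te.Linear(768, 768, bias=True).cuda()   # drop-in replacement for nn.Linear
        x = torch.randn(32, 768, device="cuda")

        with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
            y = layer(x)                                # forward pass runs in FP8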
  • 9
    Nixtla Neural Forecast

    Scalable and user-friendly neural forecasting algorithms.

    NeuralForecast offers a large collection of neural forecasting models focusing on their performance, usability, and robustness. The models range from classic networks like RNNs to the latest transformers: MLP, LSTM, GRU, RNN, TCN, TimesNet, BiTCN, DeepAR, NBEATS, NBEATSx, NHITS, TiDE, DeepNPTS, TSMixer, TSMixerx, MLPMultivariate, DLinear, NLinear, TFT, Informer, AutoFormer, FedFormer, PatchTST, iTransformer, StemGNN, and TimeLLM. There is a shared belief in Neural forecasting methods'...
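
    A minimal forecasting sketch, assuming NeuralForecast's long-format dataframe convention (unique_id, ds, y columns) and the AirPassengersDF sample that, as far as we recall, ships in neuralforecast.utils; horizons and step counts are illustrative:

        from neuralforecast import NeuralForecast
        from neuralforecast.models import NBEATS, NHITS
        from neuralforecast.utils import AirPassengersDF

        horizon = 12
        nf = NeuralForecast(
            models=[
                NBEATS(input_size=2 * horizon, h=horizon, max_steps=100),
                NHITS(input_size=2 * horizon, h=horizon, max_steps=100),
            ],
            freq="M",
        )
        nf.fit(df=AirPassengersDF)    # long-format dataframe: unique_id, ds, y
        forecasts = nf.predict()      # one forecast column per model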
  • 10
    FEDML Open Source

    The unified and scalable ML library for large-scale training

    A unified and scalable machine learning library for running training and deployment anywhere at any scale. TensorOpera AI is the next-gen cloud service for LLMs & generative AI. It helps developers launch complex model training, deployment, and federated learning anywhere on decentralized GPUs, multi-clouds, edge servers, and smartphones, easily, economically, and securely. Highly integrated with the TensorOpera open source library, TensorOpera AI provides holistic support of three...
  • 11
    snorkel

    A system for quickly generating training data with weak supervision

    The Snorkel team is now focusing their efforts on Snorkel Flow, an end-to-end AI application development platform based on the core ideas behind Snorkel. The Snorkel project started at Stanford in 2016 with a simple technical bet: that it would increasingly be the training data, not the models, algorithms, or infrastructure, that decided whether a machine learning project succeeded or failed. Given this premise, we set out to explore the radical idea that you could bring mathematical and...
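
    A minimal sketch of the weak-supervision workflow Snorkel is known for: write labeling functions, apply them to unlabeled data, and fit a LabelModel to combine their noisy votes. The spam/ham example and heuristics are made up for illustration:

        import pandas as pd
        from snorkel.labeling import labeling_function, PandasLFApplier
        from snorkel.labeling.model import LabelModel

        ABSTAIN, HAM, SPAM = -1, 0, 1

        @labeling_function()
        def lf_contains_link(x):
            # Messages with URLs are often spam.
            return SPAM if "http" in x.text.lower() else ABSTAIN

        @labeling_function()
        def lf_short_message(x):
            # Very short messages are usually harmless.
            return HAM if len(x.text.split()) < 5 else ABSTAIN

        df = pd.DataFrame({"text": [
            "check out http://spam.example", "thanks, see you tomorrow", "win money now http://x",
        ]})
        L_train = PandasLFApplier(lfs=[lf_contains_link, lf_short_message]).apply(df=df)

        label_model = LabelModel(cardinality=2, verbose=False)
        label_model.fit(L_train=L_train, n_epochs=100)
        preds = label_model.predict(L=L_train)   # probabilistic training labels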
  • 12
    Ludwig AI

    Low-code framework for building custom LLMs, neural networks

    Declarative deep learning framework built for scale and efficiency. Ludwig is a low-code framework for building custom AI models like LLMs and other deep neural networks. A declarative YAML configuration file is all you need to train a state-of-the-art LLM on your data. It supports multi-task and multi-modality learning. Comprehensive config validation detects invalid parameter combinations and prevents runtime failures. Automatic batch size selection, distributed training (DDP, DeepSpeed...
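
    A minimal sketch of Ludwig's programmatic API with an inline config (the same structure as the declarative YAML); the config keys follow the recent 0.x schema and may differ by version, and the column names and tiny dataframe are placeholders:

        import pandas as pd
        from ludwig.api import LudwigModel

        config = {
            "input_features": [{"name": "review", "type": "text"}],
            "output_features": [{"name": "sentiment", "type": "category"}],
            "trainer": {"epochs": 1},
        }
        df = pd.DataFrame({
            "review": ["great product", "terrible support", "works fine", "not worth it"],
            "sentiment": ["positive", "negative", "positive", "negative"],
        })
        model = LudwigModel(config)
        results = model.train(dataset=df)   # returns training statistics and output paths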
  • 13
    Chinese-LLaMA-Alpaca 2

    Chinese LLaMA-2 & Alpaca-2 Large Model Phase II Project

    This project is developed on the basis of Llama-2, the commercially usable large model released by Meta, and is the second phase of the Chinese LLaMA & Alpaca large model project. It open-sources the Chinese LLaMA-2 base model and the instruction fine-tuned Chinese Alpaca-2 model. These models expand and optimize the Chinese vocabulary on top of the original Llama-2, use large-scale Chinese data for incremental pre-training, and further improve the basic semantic and instruction understanding of...
  • 14
    Chinese-LLaMA-Alpaca-2 v2.0

    Chinese LLaMA & Alpaca large language model + local CPU/GPU training

    This project has open-sourced the Chinese LLaMA model and the instruction fine-tuned Alpaca large model to further promote open research on large models in the Chinese NLP community. Based on the original LLaMA, these models expand the Chinese vocabulary and use Chinese data for secondary pre-training, which further improves basic Chinese semantic understanding. In addition, the Chinese Alpaca model is further fine-tuned on Chinese instruction data, which...
  • 15
    CleanVision

    Automatically find issues in image datasets

    ... to train them, but it is hard to manually identify all of the low-quality data in a big dataset. CleanVision helps you automatically identify common types of data issues lurking in image datasets. This package currently detects issues in the raw images themselves, making it a useful tool for any computer vision task such as: classification, segmentation, object detection, pose estimation, keypoint detection, generative modeling, etc.
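
    A minimal sketch, assuming CleanVision's Imagelab interface (import paths have moved between releases) and a placeholder folder of images:

        from cleanvision import Imagelab

        imagelab = Imagelab(data_path="path/to/image_folder")   # placeholder image folder
        imagelab.find_issues()   # scan for blurry, dark, odd-sized, near-duplicate images, etc.
        imagelab.report()        # print a summary of the detected issues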
  • 16
    CPT

    CPT: A Pre-Trained Unbalanced Transformer

    ... initialize the new version of the models from the old checkpoints with vocabulary alignment. Token embeddings found in the old checkpoints are copied, and other newly added parameters are randomly initialized. We further train the new CPT & Chinese BART for 50K steps with batch size 2048, max sequence length 1024, peak learning rate 2e-5, and warmup ratio 0.1. Aiming to unify both NLU and NLG tasks, we propose a novel Chinese Pre-trained Unbalanced Transformer (CPT).
  • 17
    TextBox

    A text generation library with pre-trained language models

    TextBox 2.0 is an up-to-date text generation library based on Python and PyTorch focusing on building a unified and standardized pipeline for applying pre-trained language models to text generation. From a task perspective, we consider 13 common text generation tasks such as translation, story generation, and style transfer, and their corresponding 83 widely-used datasets. From a model perspective, we incorporate 47 pre-trained language models/modules covering the categories of general,...
  • 18
    YData Synthetic

    Synthetic data generators for tabular and time-series data

    A package to generate synthetic tabular and time-series data leveraging state-of-the-art generative models. Synthetic data is artificially generated data that is not collected from real-world events. It replicates the statistical components of real data without containing any identifiable information, ensuring individuals' privacy. This repository contains material related to Generative Adversarial Networks for synthetic data generation, in particular regular tabular data and time-series. It...
  • 19
    MMGeneration

    MMGeneration is a powerful toolkit for generative models

    ... interpolation, GAN projection, and GAN manipulations are integrated into our framework. It's time to play with your GANs! For the highly dynamic training in generative models, we adopt a new way to train dynamic models with MMDDP. A new design for complex loss modules is proposed for customizing the links between modules, which can achieve flexible combinations among different modules. Conditional GANs are supported in our toolkit. More methods and pre-trained weights will come soon.
  • 20
    Segmentation Models

    Segmentation models with pretrained backbones for PyTorch

    ... results (higher metric score and faster convergence). It is not necessary if you train the whole model, not only the decoder. PyTorch Image Models (a.k.a. timm) has a lot of pretrained models and an interface that allows using these models as encoders in smp; however, not all models are supported. The input channels parameter allows you to create models that process tensors with an arbitrary number of channels.
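
    A minimal sketch of the segmentation_models_pytorch (smp) interface described above; the encoder, channel, and class choices are illustrative:

        import torch
        import segmentation_models_pytorch as smp

        model = smp.Unet(
            encoder_name="resnet34",      # pretrained backbone used as the encoder
            encoder_weights="imagenet",   # encoder initialization
            in_channels=3,                # an arbitrary number of input channels is allowed
            classes=1,                    # number of output segmentation channels
        )
        mask_logits = model(torch.randn(1, 3, 256, 256))   # shape (1, 1, 256, 256)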
  • 21
    spacy-transformers

    Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

    ... trained transformer model if you install spacy-transformers. You can also do your own language model pretraining via the spacy pretrain command. You can even share your transformer or another contextual embedding model across multiple components, which can make long pipelines several times more efficient. To use transfer learning, you'll need at least a few annotated examples of what you're trying to predict.
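
    A minimal sketch using the transformer-backed English pipeline (installed separately, e.g. with python -m spacy download en_core_web_trf), in which a single transformer component is shared by the downstream components:

        import spacy

        nlp = spacy.load("en_core_web_trf")   # transformer-backed pipeline from spacy-transformers
        doc = nlp("spaCy shares one transformer across the tagger, parser, and NER components.")
        print([(ent.text, ent.label_) for ent in doc.ents])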
  • 22
    WriteFreely

    A clean, Markdown-based publishing platform made for writers

    ... writing. There's no news feed, notifications, or unnecessary likes or claps to take you away from your train of thought. You get a distraction-free writing environment, and readers can enjoy a clean reading experience. Reach outside your own site with federation via ActivityPub. WriteFreely lets anyone on Mastodon, Pleroma, or any ActivityPub-enabled service follow your blog, bookmark your posts, and share them with their followers.
  • 23
    diff2html

    Pretty diff to html javascript library (diff2html)

    Wrapper and helper adding syntax highlighting, synchronized scroll, and other nice features. You can use it without syntax highlighting or by passing your own implementation with the languages you prefer. Diff2Html can be used in various ways as listed in the distributions section.
  • 24
    Kui

    A hybrid command-line/UI development experience for cloud-native devs

    .... Raven can sync with your favorite cloud news reader including Feedbin, Inoreader and Self hosted RSS Services supporting Google Reader API. Save the articles you like for offline reading to enjoy whether you’re up in the Himalayas or in an underground train. Some companies track the content you’re viewing so they can try to sell you more stuff. That’s not cool. We respect your privacy.
  • 25
    ktrain

    ktrain is a Python library that makes deep learning and AI more accessible

    ktrain is a lightweight wrapper for the deep learning library TensorFlow Keras (and other libraries) that helps build, train, and deploy neural networks and other machine learning models. Inspired by ML framework extensions like fastai and ludwig, ktrain is designed to make deep learning and AI more accessible and easier to apply for both newcomers and experienced practitioners. With only a few lines...
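
    A minimal text-classification sketch, assuming ktrain's texts_from_array / text_classifier / get_learner helpers (argument names recalled from the ktrain docs and possibly version-dependent); the tiny in-memory dataset and hyperparameters are placeholders:

        import ktrain
        from ktrain import text

        x_train = ["great movie", "awful plot", "loved it", "boring and slow"]
        y_train = ["pos", "neg", "pos", "neg"]

        trn, val, preproc = text.texts_from_array(
            x_train=x_train, y_train=y_train,
            x_test=x_train, y_test=y_train,   # toy example reuses the training data
            class_names=["pos", "neg"],
            preprocess_mode="bert", maxlen=64,
        )
        model = text.text_classifier("bert", train_data=trn, preproc=preproc)
        learner = ktrain.get_learner(model, train_data=trn, val_data=val, batch_size=2)
        learner.fit_onecycle(2e-5, 1)   # one epoch with the 1cycle learning-rate schedule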