Showing 70 open source projects for "tuning"

  • 1
    Chinese-LLaMA-Alpaca 2

    Chinese LLaMA-2 & Alpaca-2 Large Model Phase II Project

    This project is developed on top of Llama-2, the commercially usable large model released by Meta, and is the second phase of the Chinese LLaMA & Alpaca large model project. It open-sources the Chinese LLaMA-2 base model and the Alpaca-2 instruction-tuned model. These models expand and optimize the Chinese vocabulary on top of the original Llama-2, use large-scale Chinese data for incremental pre-training, and further improve basic Chinese semantic and instruction understanding...
  • 2
    ModelScope

    Bring the notion of Model-as-a-Service to life

    ModelScope offers a unified experience to explore state-of-the-art models spanning domains such as CV, NLP, Speech, Multi-Modality, and Scientific Computation. Model contributors from different areas can integrate models into the ModelScope ecosystem through its layered APIs, allowing easy and unified access to their models. Once integrated, model inference, fine-tuning, and evaluation can be done with only a few lines of code.
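
    A hedged sketch of the "few lines of code" inference flow described above, assuming the modelscope package's pipeline() factory; the task name and model ID below follow the pattern of the ModelScope quick-start examples and should be treated as placeholders for entries on the ModelScope hub.

    from modelscope.pipelines import pipeline

    # Build an inference pipeline for a hub model; task name and model ID are placeholders.
    word_segmentation = pipeline(
        "word-segmentation",
        model="damo/nlp_structbert_word-segmentation_chinese-base",
    )
    print(word_segmentation("今天天气不错，适合出去游玩"))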
  • 3
    Open Source Vizier

    Python-based research interface for blackbox optimization

    Open Source (OSS) Vizier is a Python-based interface for blackbox optimization and research, based on Google’s original internal Vizier, one of the first hyperparameter tuning services designed to work at scale. It allows a user to set up an OSS Vizier server, which can host black-box optimization algorithms to serve multiple clients simultaneously in a fault-tolerant manner to tune their objective functions. It also defines abstractions and utilities for implementing new optimization algorithms...
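
    A hedged sketch of a client-side study loop, assuming the vizier package's service client API as recalled from the OSS Vizier documentation; module paths, method names, and the algorithm string may differ between releases, and the objective below is a stand-in for a real training run.

    from vizier.service import clients
    from vizier.service import pyvizier as vz

    # Define the search space and the metric to maximize.
    problem = vz.ProblemStatement()
    problem.search_space.root.add_float_param("learning_rate", 1e-4, 1e-1)
    problem.metric_information.append(
        vz.MetricInformation(name="accuracy", goal=vz.ObjectiveMetricGoal.MAXIMIZE)
    )

    study_config = vz.StudyConfig.from_problem(problem)
    study_config.algorithm = "GAUSSIAN_PROCESS_BANDIT"

    study = clients.Study.from_study_config(study_config, owner="demo", study_id="lr_tuning")
    for _ in range(10):
        for suggestion in study.suggest(count=1):
            lr = suggestion.parameters["learning_rate"]
            accuracy = 1.0 - (lr - 0.01) ** 2          # stand-in for a real training run
            suggestion.complete(vz.Measurement({"accuracy": accuracy}))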
  • 4
    Ray

    A unified framework for scalable computing

    Modern workloads like deep learning and hyperparameter tuning are compute-intensive and require distributed or parallel execution. Ray makes it effortless to parallelize single-machine code: go from a single CPU to multi-core, multi-GPU, or multi-node with minimal code changes. Accelerate your PyTorch and TensorFlow workloads with a more resource-efficient and flexible distributed execution framework powered by Ray. Accelerate your hyperparameter search workloads with Ray Tune. Find the best...
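
    A minimal sketch of the two workflows mentioned above, assuming ray (with the tune extra) is installed; the toy task and search space are illustrative, not taken from the Ray docs.

    import ray
    from ray import tune

    ray.init()  # start a local Ray runtime; connects to a cluster if one is configured

    @ray.remote
    def square(x):
        # An ordinary Python function executed as a parallel Ray task.
        return x * x

    print(ray.get([square.remote(i) for i in range(8)]))

    def objective(config):
        # One trial: evaluate a hyperparameter configuration and return its metrics.
        return {"score": (config["lr"] - 0.01) ** 2}

    tuner = tune.Tuner(objective, param_space={"lr": tune.loguniform(1e-4, 1e-1)})
    results = tuner.fit()
    print(results.get_best_result(metric="score", mode="min").config)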
  • 5
    auto-sklearn

    Automated machine learning with scikit-learn

    auto-sklearn is an automated machine learning toolkit and a drop-in replacement for a scikit-learn estimator. auto-sklearn frees a machine learning user from algorithm selection and hyperparameter tuning. It leverages recent advances in Bayesian optimization, meta-learning, and ensemble construction. Auto-sklearn 2.0 includes the latest research on automatically configuring the AutoML system itself and contains a multitude of improvements that speed up fitting the AutoML system. auto-sklearn...
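
    A short sketch of the drop-in estimator workflow described above; the dataset and time budget are illustrative.

    import autosklearn.classification
    import sklearn.datasets
    import sklearn.metrics
    from sklearn.model_selection import train_test_split

    X, y = sklearn.datasets.load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

    automl = autosklearn.classification.AutoSklearnClassifier(
        time_left_for_this_task=120,   # total search budget in seconds
        per_run_time_limit=30,         # budget per candidate pipeline
    )
    automl.fit(X_train, y_train)       # algorithm selection + hyperparameter tuning happen here
    y_pred = automl.predict(X_test)
    print("accuracy:", sklearn.metrics.accuracy_score(y_test, y_pred))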
  • 6
    Guidance

    A guidance language for controlling large language models

    Guidance is an efficient programming paradigm for steering language models. With Guidance, you can control how output is structured and get high-quality output for your use case—while reducing latency and cost vs. conventional prompting or fine-tuning. It allows users to constrain generation (e.g. with regex and CFGs) as well as to interleave control (conditionals, loops, tool use) and generation seamlessly.
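
    A hedged sketch of constrained generation with Guidance, assuming the post-0.1 API (models, gen, select); the model path is a placeholder for any local Transformers checkpoint.

    from guidance import models, gen, select

    lm = models.Transformers("gpt2")  # placeholder model; any HF causal LM works

    lm += "Is Python dynamically typed? Answer yes or no: "
    lm += select(["yes", "no"], name="answer")          # constrain output to two options

    lm += "\nGive a one-sentence reason: "
    lm += gen(name="reason", max_tokens=30, stop=".")   # free generation with a stop condition

    print(lm["answer"], lm["reason"])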
  • 7
    Backtrack Sampler

    An easy-to-understand framework for LLM samplers

    Backtrack Sampler is a framework for experimenting with custom sampling strategies for large language models (LLMs), with the ability to rewind and revise generated tokens. It allows developers to create and test their own token-generation strategies by providing a base structure for manipulating logits and probabilities, making it a flexible tool for those interested in fine-tuning the behavior of LLMs.
  • 8
    Lightning Bolts

    Toolbox of models, callbacks, and datasets for AI/ML researchers

    The Bolts package provides a variety of components that extend PyTorch Lightning, such as callbacks and datasets, for applied research and production. Torch ORT converts your model into an optimized ONNX graph, speeding up training and inference when using NVIDIA or AMD GPUs. Sparsity can be introduced during fine-tuning with SparseML, which ultimately allows the DeepSparse engine to deliver performance improvements at inference time.
  • 9
    Advanced Solutions Lab

    This repo contains notebooks for the Advanced Solutions Lab

    This repository contains Jupyter notebooks meant to be run on Vertex AI. This is maintained by Google Cloud’s Advanced Solutions Lab (ASL) team. Vertex AI is the next-generation AI Platform on the Google Cloud Platform. The material covered in this repo will take a software engineer with no exposure to machine learning to an advanced level.
  • 10
    Chronos Forecasting

    Pretrained (Language) Models for Probabilistic Time Series Forecasting

    Chronos is a family of pretrained time series forecasting models based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as...
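
    A hedged sketch of zero-shot forecasting with a pretrained Chronos checkpoint, assuming the chronos-forecasting package's ChronosPipeline API; the model ID and the short series below are illustrative.

    import torch
    from chronos import ChronosPipeline

    pipeline = ChronosPipeline.from_pretrained(
        "amazon/chronos-t5-small",
        device_map="cpu",
        torch_dtype=torch.float32,
    )

    # Historical context as a 1-D tensor; probabilistic forecasts come back as samples.
    context = torch.tensor([112., 118., 132., 129., 121., 135., 148., 148., 136., 119.])
    forecast = pipeline.predict(context, prediction_length=6)   # [series, samples, horizon]

    quantiles = torch.quantile(forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0)
    print(quantiles)   # low / median / high trajectories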
  • 11
    HDBSCAN

    A high performance implementation of HDBSCAN clustering

    HDBSCAN - Hierarchical Density-Based Spatial Clustering of Applications with Noise. Performs DBSCAN over varying epsilon values and integrates the result to find a clustering that gives the best stability over epsilon. This allows HDBSCAN to find clusters of varying densities (unlike DBSCAN), and be more robust to parameter selection. In practice this means that HDBSCAN returns a good clustering straight away with little or no parameter tuning -- and the primary parameter, minimum cluster size...
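
    A minimal sketch of clustering with hdbscan's scikit-learn-style estimator; the synthetic blobs are illustrative.

    import hdbscan
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(
        n_samples=1000, centers=5,
        cluster_std=[0.5, 1.0, 1.5, 0.5, 2.0],   # clusters of varying density
        random_state=0,
    )

    clusterer = hdbscan.HDBSCAN(min_cluster_size=15)   # the main knob mentioned above
    labels = clusterer.fit_predict(X)                  # -1 marks points treated as noise

    print("clusters found:", labels.max() + 1)
    print("noise points:", (labels == -1).sum())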
  • 12
    KerasTuner

    A Hyperparameter Tuning Library for Keras

    KerasTuner is an easy-to-use, scalable hyperparameter optimization framework that solves the pain points of hyperparameter search. Easily configure your search space with a define-by-run syntax, then leverage one of the available search algorithms to find the best hyperparameter values for your models. KerasTuner comes with Bayesian Optimization, Hyperband, and Random Search algorithms built-in, and is also designed to be easy for researchers to extend in order to experiment with new search...
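
    A short sketch of the define-by-run search described above; the model, data subset, and search budget are illustrative.

    import keras_tuner
    from tensorflow import keras

    def build_model(hp):
        # Hyperparameters are declared inline as the model is built.
        model = keras.Sequential([
            keras.layers.Input(shape=(28, 28)),
            keras.layers.Flatten(),
            keras.layers.Dense(hp.Int("units", min_value=32, max_value=512, step=32), activation="relu"),
            keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(
            optimizer=keras.optimizers.Adam(hp.Choice("lr", [1e-2, 1e-3, 1e-4])),
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"],
        )
        return model

    tuner = keras_tuner.RandomSearch(build_model, objective="val_accuracy", max_trials=5)

    (x_train, y_train), (x_val, y_val) = keras.datasets.mnist.load_data()
    tuner.search(x_train[:2000], y_train[:2000], epochs=2,
                 validation_data=(x_val[:500], y_val[:500]))
    best_model = tuner.get_best_models(num_models=1)[0]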
  • 13
    Transformer Reinforcement Learning X

    A repo for distributed training of language models with Reinforcement Learning

    trlX is a distributed training framework designed from the ground up to focus on fine-tuning large language models with reinforcement learning using either a provided reward function or a reward-labeled dataset. Training support for Hugging Face models is provided by Accelerate-backed trainers, allowing users to fine-tune causal and T5-based language models of up to 20B parameters, such as facebook/opt-6.7b, EleutherAI/gpt-neox-20b, and google/flan-t5-xxl. For models beyond 20B parameters, trlX...
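
    A hedged sketch of reward-function-driven fine-tuning with trlX, following the pattern from its README as best recalled; the exact trlx.train signature may vary between releases, and the toy reward function and prompts are purely illustrative.

    import trlx

    def reward_fn(samples, **kwargs):
        # One scalar reward per generated sample; here, a toy reward favoring longer outputs.
        return [float(len(sample)) for sample in samples]

    trainer = trlx.train(
        "gpt2",                      # base model to fine-tune with RL
        reward_fn=reward_fn,
        prompts=["The open source community", "Reinforcement learning is"],
    )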
  • 14
    TradeMaster

    TradeMaster is an open-source platform for quantitative trading

    TradeMaster is a first-of-its-kind, best-in-class open-source platform for quantitative trading (QT) empowered by reinforcement learning (RL), which covers the full pipeline for the design, implementation, evaluation and deployment of RL-based algorithms. TradeMaster is composed of 6 key modules: 1) multi-modality market data of different financial assets at multiple granularities; 2) whole data preprocessing pipeline; 3) a series of high-fidelity data-driven market simulators for mainstream...
  • 15
    unit-minions

    AI R&D Efficiency Improvement Research: Do-It-Yourself Training LoRA

    "AI R&D Efficiency Improvement Research: Do-It-Yourself Training LoRA", including Llama (Alpaca LoRA) model, ChatGLM (ChatGLM Tuning) related Lora training. Training content: user story generation, test code generation, code-assisted generation, text to SQL, text generation code.
  • 16
    Nixtla Neural Forecast

    Scalable and user-friendly neural forecasting algorithms.

    NeuralForecast offers a large collection of neural forecasting models focusing on their performance, usability, and robustness. The models range from classic networks like RNNs to the latest transformers: MLP, LSTM, GRU, RNN, TCN, TimesNet, BiTCN, DeepAR, NBEATS, NBEATSx, NHITS, TiDE, DeepNPTS, TSMixer, TSMixerx, MLPMultivariate, DLinear, NLinear, TFT, Informer, AutoFormer, FedFormer, PatchTST, iTransformer, StemGNN, and TimeLLM. There is a shared belief in Neural forecasting methods'...
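
    A short sketch of fitting one of the models listed above with NeuralForecast, assuming the standard long-format dataframe with unique_id/ds/y columns; the synthetic series and model settings are illustrative.

    import pandas as pd
    from neuralforecast import NeuralForecast
    from neuralforecast.models import NBEATS

    # One monthly series in the long format NeuralForecast expects.
    df = pd.DataFrame({
        "unique_id": "series_1",
        "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
        "y": range(48),
    })

    nf = NeuralForecast(models=[NBEATS(input_size=24, h=12, max_steps=50)], freq="MS")
    nf.fit(df)
    forecast = nf.predict()   # 12-step-ahead forecast per series
    print(forecast.head())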
  • 17
    MLJAR Studio

    Python package for AutoML on Tabular Data with Feature Engineering

    MLJAR AutoML preprocesses the data, constructs the machine learning models, and performs hyper-parameter tuning to find the best model. It is no black box: you can see exactly how the ML pipeline is constructed, with a detailed Markdown report for each ML model.
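
    A minimal sketch of the AutoML API referenced above, assuming the entry corresponds to the mljar-supervised package; the dataset and settings are illustrative.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from supervised.automl import AutoML

    X, y = load_breast_cancer(return_X_y=True, as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    automl = AutoML(mode="Explain", total_time_limit=300)  # writes a Markdown report per model
    automl.fit(X_train, y_train)
    print(automl.predict(X_test)[:10])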
  • 18
    Merlion

    A Machine Learning Framework for Time Series Intelligence

    Merlion is a Python library for time series intelligence. It provides an end-to-end machine learning framework that includes loading and transforming data, building and training models, post-processing model outputs, and evaluating model performance. It supports various time series learning tasks, including forecasting, anomaly detection, and change point detection for both univariate and multivariate time series. This library aims to provide engineers and researchers a one-stop solution to...
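
    A hedged sketch of anomaly detection with Merlion's default detector, following the pattern in its README as best recalled (class and argument names may differ by version); the synthetic series with an injected spike is illustrative.

    import numpy as np
    import pandas as pd
    from merlion.utils import TimeSeries
    from merlion.models.defaults import DefaultDetectorConfig, DefaultDetector

    index = pd.date_range("2024-01-01", periods=300, freq="h")
    values = np.sin(np.arange(300) / 12.0)
    values[250] += 5.0                                   # inject an obvious anomaly

    train = TimeSeries.from_pd(pd.Series(values[:200], index=index[:200]))
    test = TimeSeries.from_pd(pd.Series(values[200:], index=index[200:]))

    model = DefaultDetector(DefaultDetectorConfig())
    model.train(train_data=train)
    scores = model.get_anomaly_label(time_series=test)   # post-processed anomaly labels
    print(scores.to_pd().head())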
  • 19
    OSCAL

    Open Security Controls Assessment Language (OSCAL)

    OSCAL provides machine-readable formats that are generic enough to capture the breadth of data in scope (control specifications), while also being capable of ad-hoc tuning and extension to support the peculiarities of both (industry or sector) standards and new control types. The OSCAL website provides an overview of the OSCAL project, including an XML and JSON schema reference, examples, and other resources.
  • 20
    SuperAGI

    A dev-first open source autonomous AI agent framework

    An open-source autonomous AI framework to enable you to develop and deploy useful autonomous agents quickly & reliably. Join a community of developers constantly contributing to make SuperAGI better. Access your agents through a graphical user interface. Interact with agents by giving them input, permissions, etc. Agents typically learn and improve their performance over time with feedback loops. Run multiple agents simultaneously to improve efficiency and productivity. Connect to multiple...
  • 21
    talos

    Hyperparameter Optimization for TensorFlow, Keras and PyTorch

    Talos radically changes the ordinary Keras, TensorFlow (tf.keras), and PyTorch workflow by fully automating hyperparameter tuning and model evaluation. Talos exposes Keras, TensorFlow (tf.keras), and PyTorch functionality entirely, and there is no new syntax or templates to learn. Talos is made for data scientists and data engineers who want to remain in complete control of their TensorFlow (tf.keras) and PyTorch models, but are tired of mindless parameter hopping and confusing optimization...
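
    A hedged sketch of a Talos scan over a tf.keras model, following the pattern in the Talos docs as best recalled; the toy data and search space are illustrative.

    import numpy as np
    import talos
    from tensorflow import keras

    x = np.random.rand(500, 8)
    y = (x.sum(axis=1) > 4).astype(int)

    def build_fn(x_train, y_train, x_val, y_val, params):
        model = keras.Sequential([
            keras.layers.Input(shape=(8,)),
            keras.layers.Dense(params["units"], activation="relu"),
            keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer=keras.optimizers.Adam(params["lr"]),
                      loss="binary_crossentropy", metrics=["accuracy"])
        history = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                            epochs=params["epochs"], verbose=0)
        return history, model          # Talos expects (history, model)

    search_space = {"units": [16, 32, 64], "lr": [1e-2, 1e-3], "epochs": [10]}
    scan = talos.Scan(x=x, y=y, params=search_space, model=build_fn, experiment_name="demo")
    print(scan.data.head())            # one row of results per parameter combination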
  • 22
    SAHI

    A lightweight vision library for performing large-scale object detection

    Detecting small objects and objects far away in the scene is a major challenge in surveillance applications. Such objects are represented by a small number of pixels in the image and lack sufficient detail, making them difficult to detect using conventional detectors. In this work, an open-source framework called Slicing Aided Hyper Inference (SAHI) is proposed that provides a generic slicing-aided inference and fine-tuning pipeline for small object detection.
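
    A hedged sketch of sliced inference with SAHI on top of an Ultralytics YOLOv8 checkpoint; the model type, checkpoint path, image path, and slice sizes are illustrative placeholders.

    from sahi import AutoDetectionModel
    from sahi.predict import get_sliced_prediction

    detection_model = AutoDetectionModel.from_pretrained(
        model_type="yolov8",
        model_path="yolov8n.pt",          # placeholder checkpoint path
        confidence_threshold=0.3,
        device="cpu",
    )

    # The image is cut into overlapping slices, each slice is run through the
    # detector, and the per-slice predictions are merged back together.
    result = get_sliced_prediction(
        "demo.jpg",                        # placeholder image path
        detection_model,
        slice_height=512,
        slice_width=512,
        overlap_height_ratio=0.2,
        overlap_width_ratio=0.2,
    )
    print(len(result.object_prediction_list), "objects detected")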
  • 23
    ClearML

    Streamline your ML workflow

    Automate pipelines and other workflows with ClearML's powerful and versatile set of classes and methods. The ClearML Server stores experiment, model, and workflow data, and supports the Web UI experiment manager and MLOps automation for reproducibility and tuning; it is available as a hosted service and as open source for you to deploy your own ClearML Server. The ClearML Agent handles MLOps orchestration, experiment and workflow reproducibility, and scalability.
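
    A minimal sketch of ClearML experiment tracking; it assumes a configured clearml.conf or environment credentials pointing at a ClearML Server, and the project/task names and metrics are illustrative.

    from clearml import Task

    task = Task.init(project_name="demo", task_name="tuning-run")

    params = {"lr": 0.01, "batch_size": 32}
    params = task.connect(params)        # hyperparameters become visible and editable in the Web UI

    for epoch in range(3):
        loss = 1.0 / (epoch + 1)         # stand-in for a real training loop
        task.get_logger().report_scalar("loss", "train", value=loss, iteration=epoch)

    task.close()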
  • 24
    AutoGluon

    AutoGluon: AutoML for Image, Text, and Tabular Data

    AutoGluon enables easy-to-use and easy-to-extend AutoML with a focus on automated stack ensembling, deep learning, and real-world applications spanning image, text, and tabular data. Intended for both ML beginners and experts, AutoGluon enables you to quickly prototype deep learning and classical ML solutions for your raw data with a few lines of code. Automatically utilize state-of-the-art techniques (where appropriate) without expert knowledge. Leverage automatic hyperparameter tuning, model...
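
    A short sketch of AutoGluon's tabular workflow; the CSV paths are placeholders for any tabular files containing a "class" column to predict, and the time limit is illustrative.

    from autogluon.tabular import TabularDataset, TabularPredictor

    train_data = TabularDataset("train.csv")   # placeholder path
    test_data = TabularDataset("test.csv")     # placeholder path

    # Model selection, hyperparameter tuning, and stack ensembling happen inside fit().
    predictor = TabularPredictor(label="class").fit(train_data, time_limit=120)
    predictions = predictor.predict(test_data)
    print(predictor.leaderboard(test_data))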
  • 25
    Recommenders

    Best practices on recommendation systems

    The Recommenders repository provides examples and best practices for building recommendation systems, provided as Jupyter notebooks. The module reco_utils contains functions to simplify common tasks used when developing and evaluating recommender systems. Several utilities are provided in reco_utils to support common tasks such as loading datasets in the format expected by different algorithms, evaluating model outputs, and splitting training/test data. Implementations of several...
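
    A hedged sketch of the dataset-loading and splitting utilities described above; newer releases ship them in the recommenders package (the successor to the reco_utils module named in the description), and the column names and split ratio are illustrative.

    from recommenders.datasets import movielens
    from recommenders.datasets.python_splitters import python_random_split

    # Download MovieLens 100k into a pandas DataFrame in the format most
    # algorithms in the repo expect.
    data = movielens.load_pandas_df(
        size="100k",
        header=["userID", "itemID", "rating", "timestamp"],
    )

    train, test = python_random_split(data, ratio=0.75)
    print(len(train), len(test))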