Showing 7 open source projects for "free linux ecu tuning"

  • 1
    SetFit

    Efficient few-shot learning with Sentence Transformers

    SetFit is an efficient and prompt-free framework for few-shot fine-tuning of Sentence Transformers. It achieves high accuracy with little labeled data; for instance, with only 8 labeled examples per class on the Customer Reviews sentiment dataset, SetFit is competitive with fine-tuning RoBERTa Large on the full training set of 3k examples. A minimal training sketch appears after the project list.
    Downloads: 1 This Week
    See Project
  • 2
    Guidance

    A guidance language for controlling large language models

    Guidance is an efficient programming paradigm for steering language models. With Guidance, you can control how output is structured and get high-quality output for your use case while reducing latency and cost compared with conventional prompting or fine-tuning. It lets users constrain generation (e.g. with regexes and CFGs) and interleave control flow (conditionals, loops, tool use) with generation seamlessly. A short usage sketch appears after the project list.
    Downloads: 0 This Week
    See Project
  • 3
    DeepSeek-V3

    Powerful AI language model (MoE) optimized for efficiency/performance

    DeepSeek-V3 is a robust Mixture-of-Experts (MoE) language model developed by DeepSeek, featuring a total of 671 billion parameters, with 37 billion activated per token. It employs Multi-head Latent Attention (MLA) and the DeepSeekMoE architecture to enhance computational efficiency. The model introduces an auxiliary-loss-free load balancing strategy and a multi-token prediction training objective to boost performance. Trained on 14.8 trillion diverse, high-quality tokens, DeepSeek-V3 underwent...
    Downloads: 25 This Week
    See Project
  • 4
    gpt-2-simple

    Python package to easily retrain OpenAI's GPT-2 text-generating model

    A simple Python package that wraps existing model fine-tuning and generation scripts for OpenAI's GPT-2 text-generating model (specifically the "small" 124M and "medium" 355M parameter versions). The package also makes text generation easier: it can generate to a file for easy curation and accepts prefixes to force the text to start with a given phrase; a usage sketch appears after the project list. For fine-tuning, it is strongly recommended to use a GPU, although you can generate using a CPU (albeit much more slowly...
    Downloads: 5 This Week
    See Project
  • Deliver secure remote access with OpenVPN. Icon
    Deliver secure remote access with OpenVPN.

    Trusted by nearly 20,000 customers worldwide, and all major cloud providers.

    OpenVPN's products provide scalable, secure remote access — giving complete freedom to your employees to work outside the office while securely accessing SaaS, the internet, and company resources.
    Get started — no credit card required.
  • 5
    Neuro-comma

    Production-ready punctuation restoration model for Russian

    This library was developed to help us create punctuation restoration models and to keep track of trained parameters, data, training visualizations, etc. The library doesn't use any high-level frameworks, such as PyTorch Lightning or Keras, in order to lower the entry threshold. Feel free to fork this repo and edit the model or dataset classes for your purposes. Our team always uses the latest versions and features of Python. We started with Python 3.9, but realized that there is no FastAPI...
    Downloads: 0 This Week
    See Project
  • 6
    pyPIDTuneMethods

    PID controller design and tuning

    pyPIDTuneMethods is a free Python-based tool for PID controller design and tuning. It is built on scipy (numpy, scipy) <http://www.scipy.org/>, guiqwt <http://code.google.com/p/guiqwt/>, python-control <http://python-control.org/>, and PyQt5 <http://www.riverbankcomputing.co.uk>; a closed-loop PID sketch using python-control appears after the project list. If you are not familiar with Python, you can use WinPython <http://winpython.sourceforge.net/>. You can also download it from PyPI <https://pypi.org/project/pyPIDTuneMethods/>. You may also just...
    Downloads: 0 This Week
    See Project
  • 7
    Collective Mind Technology

    Plugin-based framework for systematic and reproducible experimentation

    Note: the new version has moved to http://github.com/ctuning/ck. The Collective Mind framework (cM) is an open-source, plugin-based, schema-free repository and infrastructure for collaborative, systematic and reproducible research and experimentation. This third version (started in 2006) helps to implement, preserve, share and reproduce the whole experimental setup as connected modules and data. cM uses crowdsourcing to leverage the knowledge and computational resources of multiple users. For example, it includes...
    Downloads: 0 This Week
    See Project
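
A minimal SetFit training sketch, referenced from the SetFit entry above. It follows the SetFitTrainer pattern from the project's README; the dataset, checkpoint name, and hyperparameters here are illustrative, and newer SetFit releases expose the same workflow through a Trainer/TrainingArguments pair instead.

    from datasets import load_dataset
    from sentence_transformers.losses import CosineSimilarityLoss
    from setfit import SetFitModel, SetFitTrainer, sample_dataset

    # Simulate the few-shot setting: keep only 8 labeled examples per class.
    dataset = load_dataset("sst2")
    train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)
    eval_dataset = dataset["validation"]

    # Start from a pretrained Sentence Transformers checkpoint.
    model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

    trainer = SetFitTrainer(
        model=model,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
        loss_class=CosineSimilarityLoss,  # contrastive fine-tuning of the embedding body
        metric="accuracy",
        batch_size=16,
        num_iterations=20,                # number of text pairs generated per example
        column_mapping={"sentence": "text", "label": "label"},
    )
    trainer.train()
    print(trainer.evaluate())

    # The trained model is callable directly on raw strings.
    print(model(["i loved this movie!", "the plot made no sense"]))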
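
A short Guidance sketch, referenced from the Guidance entry above. It assumes the post-0.1 API (guidance.models, gen, select) and a small local Transformers checkpoint; the model name, prompt, and regex are placeholders.

    from guidance import models, gen, select

    # Load a small local model; any Transformers checkpoint can stand in here.
    lm = models.Transformers("gpt2")

    # Interleave plain text, a constrained choice, and regex-constrained generation.
    lm += "Rate this review as positive or negative.\nReview: great battery life\nAnswer: "
    lm += select(["positive", "negative"], name="sentiment")
    lm += "\nConfidence (0-100): "
    lm += gen(name="confidence", regex=r"\d{1,3}", max_tokens=3)

    # Named captures are available directly on the model state.
    print(lm["sentiment"], lm["confidence"])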
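
A usage sketch for gpt-2-simple, referenced from its entry above. It mirrors the package's documented download/finetune/generate workflow; the corpus file name, step count, and prefix are placeholders, and the fine-tuning step is best run on a GPU.

    import gpt_2_simple as gpt2

    # Download the "small" 124M checkpoint (once) and open a TensorFlow session.
    gpt2.download_gpt2(model_name="124M")
    sess = gpt2.start_tf_sess()

    # Fine-tune on a plain-text corpus (placeholder file name).
    gpt2.finetune(sess, "corpus.txt", model_name="124M", steps=1000)

    # Generate text that starts with a given prefix, then dump samples to a file for curation.
    gpt2.generate(sess, prefix="Once upon a time", length=200, temperature=0.7)
    gpt2.generate_to_file(sess, destination_path="samples.txt", nsamples=5)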
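
A closed-loop PID sketch, referenced from the pyPIDTuneMethods entry above. It does not use pyPIDTuneMethods' own API; it only exercises the python-control library the project builds on, with an illustrative plant and hand-picked gains, to show the kind of loop such a tool designs and tunes.

    import control

    # Illustrative second-order plant G(s) = 1 / (s^2 + 3s + 2).
    G = control.tf([1.0], [1.0, 3.0, 2.0])

    # Ideal PID controller C(s) = (Kd*s^2 + Kp*s + Ki) / s, with hand-picked gains.
    Kp, Ki, Kd = 6.0, 4.0, 1.0
    C = control.tf([Kd, Kp, Ki], [1.0, 0.0])

    # Unity-feedback closed loop and its unit step response.
    T = control.feedback(C * G, 1)
    t, y = control.step_response(T)
    print(f"output settles near {y[-1]:.3f} (setpoint 1.0)")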