Showing 164 open source projects for "transformers"

View related business solutions
  • Outgrown Windows Task Scheduler? Icon
    Outgrown Windows Task Scheduler?

    Free diagnostic identifies where your workflow is breaking down—with instant analysis of your scheduling environment.

    Windows Task Scheduler wasn't built for complex, cross-platform automation. Get a free diagnostic that shows exactly where things are failing and provides remediation recommendations. Interactive HTML report delivered in minutes.
    Download Free Tool
  • Our Free Plans just got better! | Auth0 Icon
    Our Free Plans just got better! | Auth0

    With up to 25k MAUs and unlimited Okta connections, our Free Plan lets you focus on what you do best—building great apps.

    You asked, we delivered! Auth0 is excited to expand our Free and Paid plans to include more options so you can focus on building, deploying, and scaling applications without having to worry about your security. Auth0 now, thank yourself later.
    Try free now
  • 1
    Transformers

    State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX

    Transformers provides APIs and tools to easily download and train state-of-the-art pre-trained models. Using pre-trained models can reduce your compute costs and carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities, including text, for tasks like text classification, information extraction, question answering, summarization, translation, and text generation in over 100 languages.
    Downloads: 5 This Week
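    As a quick illustration of the Transformers API described in the entry above, here is a minimal sketch using the pipeline helper; it assumes the transformers package and a backend such as PyTorch are installed, and the default checkpoint it downloads is chosen by the library and may change between releases.

```python
# Minimal sketch of the Transformers pipeline API (assumes `pip install transformers`
# plus PyTorch or TensorFlow); the default sentiment model is downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Pre-trained models save a lot of training compute."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```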
  • 2
    x-transformers

    A simple but complete full-attention transformer

    ...Proposes adding learned memory key/values prior to attending. They were able to remove feedforwards altogether and attain similar performance to the original transformers. I have found that keeping the feedforwards and adding the memory key/values leads to even better performance. Also proposes adding learned tokens, akin to CLS tokens, named memory tokens, that are passed through the attention layers alongside the input tokens. You can also use the L2-normalized embeddings proposed as part of fixnorm. ...
    Downloads: 2 This Week
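    For reference, a hedged sketch of how x-transformers is typically instantiated, following the pattern from its README; the vocabulary size, layer dimensions, and the num_memory_tokens setting below are illustrative assumptions, not recommended values.

```python
# Illustrative sketch of the x-transformers API (assumes `pip install x-transformers` and torch);
# all sizes here are arbitrary example values.
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens=20000,           # vocabulary size
    max_seq_len=1024,
    num_memory_tokens=16,       # learned memory tokens, as described in the entry above
    attn_layers=Decoder(dim=512, depth=6, heads=8),
)

tokens = torch.randint(0, 20000, (1, 1024))
logits = model(tokens)          # shape (1, 1024, 20000)
```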
  • 3
    Curated Transformers

    PyTorch library of curated Transformer models and their components

    State-of-the-art transformers, brick by brick. Curated Transformers is a transformer library for PyTorch. It provides state-of-the-art models that are composed of a set of reusable components. Supports state-of-the-art transformer models, including LLMs such as Falcon, Llama, and Dolly v2. Implementing a feature or bugfix benefits all models. For example, all models support 4/8-bit inference through the bitsandbytes library and each model can use the PyTorch meta device to avoid unnecessary allocations and initialization.
    Downloads: 0 This Week
  • 4
    spacy-transformers

    Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

    ...These techniques can be used to import knowledge from raw text into your pipeline, so that your models are able to generalize better from your annotated examples. You can convert word vectors from popular tools like FastText and Gensim, or you can load in any pretrained transformer model if you install spacy-transformers. You can also do your own language model pretraining via the spacy pretrain command. You can even share your transformer or another contextual embedding model across multiple components, which can make long pipelines several times more efficient. To use transfer learning, you’ll need at least a few annotated examples for what you’re trying to predict.
    Downloads: 0 This Week
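    A minimal sketch of using a transformer-backed spaCy pipeline as described above; it assumes spacy and spacy-transformers are installed and that the en_core_web_trf model has already been downloaded (for example via `python -m spacy download en_core_web_trf`).

```python
# Sketch: transformer-backed spaCy pipeline (requires spacy, spacy-transformers,
# and the en_core_web_trf package to be installed beforehand).
import spacy

nlp = spacy.load("en_core_web_trf")   # pipeline whose token-to-vector layer is a transformer
doc = nlp("Apple is looking at buying a U.K. startup.")
print([(ent.text, ent.label_) for ent in doc.ents])
```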
  • 5
    Transformers.jl

    Julia Implementation of Transformer models

    Transformers.jl is a Julia library that implements Transformer models for natural language processing tasks. Inspired by architectures like BERT, GPT, and T5, the library offers a modular and flexible interface for building, training, and using transformer-based deep learning models. It supports training from scratch and fine-tuning pretrained models, and integrates with Flux.jl for automatic differentiation and optimization.
    Downloads: 0 This Week
  • 6
    Intel Extension for Transformers

    Build your chatbot within minutes on your favorite device

    Intel Extension for Transformers is an innovative toolkit designed to accelerate Transformer-based models on Intel platforms, including CPUs and GPUs. It offers state-of-the-art compression techniques for Large Language Models (LLMs) and provides tools to build chatbots within minutes on various devices. The extension aims to optimize the performance of Transformer-based models, making them more efficient and accessible.
    Downloads: 0 This Week
  • 7
    DeiT (Data-efficient Image Transformers)
    DeiT (Data-efficient Image Transformers) shows that Vision Transformers can be trained competitively on ImageNet-1k without external data by using strong training recipes and knowledge distillation. Its key idea is a specialized distillation strategy—including a learnable “distillation token”—that lets a transformer learn effectively from a CNN or transformer teacher on modest-scale datasets.
    Downloads: 0 This Week
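    The DeiT repository exposes its pretrained models through torch.hub; below is a hedged sketch of loading one for inference. The repository and model names are taken from the upstream README and should be treated as assumptions, and timm is required as a dependency.

```python
# Sketch: loading a pretrained DeiT backbone via torch.hub (assumes torch and timm installed);
# repo/model names follow the upstream README and may change.
import torch

model = torch.hub.load("facebookresearch/deit:main",
                       "deit_base_patch16_224", pretrained=True)
model.eval()

dummy = torch.randn(1, 3, 224, 224)      # one 224x224 RGB image
with torch.no_grad():
    logits = model(dummy)                # ImageNet-1k class scores, shape (1, 1000)
```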
  • 8
    SageMaker Hugging Face Inference Toolkit

    Library for serving Transformers models on Amazon SageMaker

    SageMaker Hugging Face Inference Toolkit is an open-source library for serving Transformers models on Amazon SageMaker. This library provides default pre-processing, prediction, and post-processing for certain Transformers models and tasks. It utilizes the SageMaker Inference Toolkit for starting up the model server, which is responsible for handling inference requests. For the Dockerfiles used for building SageMaker Hugging Face Containers, see AWS Deep Learning Containers. ...
    Downloads: 7 This Week
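    To illustrate how this toolkit is typically driven, here is a hedged sketch of deploying a Hub model with the SageMaker Python SDK; the HF_MODEL_ID/HF_TASK environment variables follow the documented pattern, while the model id, IAM role, instance type, and framework versions are placeholder assumptions to verify against currently supported containers.

```python
# Sketch: serving a Hub model on SageMaker through the Hugging Face inference containers
# (assumes the `sagemaker` SDK, valid AWS credentials, and a real execution role).
from sagemaker.huggingface import HuggingFaceModel

hub_env = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # example model id
    "HF_TASK": "text-classification",
}

model = HuggingFaceModel(
    env=hub_env,
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder role ARN
    transformers_version="4.26",   # assumed; use a container version you know is supported
    pytorch_version="1.13",
    py_version="py39",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "I love using SageMaker for inference."}))
```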
  • 9
    SetFit

    Efficient few-shot learning with Sentence Transformers

    SetFit is an efficient and prompt-free framework for few-shot fine-tuning of Sentence Transformers. It achieves high accuracy with little labeled data - for instance, with only 8 labeled examples per class on the Customer Reviews sentiment dataset, SetFit is competitive with fine-tuning RoBERTa Large on the full training set of 3k examples.
    Downloads: 0 This Week
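    As a hedged sketch of the SetFit API, the snippet below loads an already fine-tuned SetFit checkpoint from the Hub and runs prediction; the checkpoint name is an illustrative example, and fine-tuning your own model would additionally use SetFit's trainer classes.

```python
# Sketch: inference with a fine-tuned SetFit model (assumes `pip install setfit`);
# the checkpoint name is an illustrative example only.
from setfit import SetFitModel

model = SetFitModel.from_pretrained("lewtun/my-awesome-setfit-model")
preds = model.predict([
    "i loved the spiderman movie!",
    "pineapple on pizza is the worst",
])
print(preds)
```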
  • 10
    Adapters

    A Unified Library for Parameter-Efficient Learning

    Adapters is an add-on library to HuggingFace's Transformers, integrating 10+ adapter methods into 20+ state-of-the-art Transformer models with minimal coding overhead for training and inference. Adapters provide a unified interface for efficient fine-tuning and modular transfer learning, supporting a myriad of features like full-precision or quantized training (e.g. Q-LoRA, Q-Bottleneck Adapters, or Q-PrefixTuning), adapter merging via task arithmetics or the composition of multiple adapters via composition blocks, allowing advanced research in parameter-efficient transfer learning for NLP tasks.
    Downloads: 0 This Week
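    A minimal sketch of the adapter workflow described above, assuming the adapters package alongside Transformers; the base checkpoint, adapter name, and LoRA configuration are illustrative choices rather than recommendations.

```python
# Sketch: adding and training a single adapter on top of a frozen Transformer
# (assumes `pip install adapters transformers`); names and config are examples.
from adapters import AutoAdapterModel, LoRAConfig

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("sst2_lora", config=LoRAConfig())      # parameter-efficient module
model.add_classification_head("sst2_lora", num_labels=2)
model.train_adapter("sst2_lora")                         # freeze base weights, train adapter only
```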
  • 11
    bert4torch

    An elegant PyTorch implementation of transformers

    An elegant PyTorch implementation of transformers.
    Downloads: 0 This Week
  • 12
    GeneralAI

    Large-scale Self-supervised Pre-training Across Tasks, Languages, etc.

    Fundamental research to develop new architectures for foundation models and AI, focusing on modeling generality and capability, as well as training stability and efficiency.
    Downloads: 0 This Week
  • 13
    BERTopic

    Leveraging BERT and c-TF-IDF to create easily interpretable topics

    ...Instead, we can visualize the topics that were generated in a way very similar to LDAvis. By default, the main steps for topic modeling with BERTopic are sentence-transformers, UMAP, HDBSCAN, and c-TF-IDF run in sequence.
    Downloads: 1 This Week
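    A minimal sketch of the default BERTopic workflow described above (sentence-transformers, then UMAP, HDBSCAN, and c-TF-IDF), following the library's quickstart; the 20 Newsgroups corpus is used purely as example data.

```python
# Sketch: fitting BERTopic with its default embedding/clustering stack
# (assumes `pip install bertopic` and scikit-learn for the example corpus).
from bertopic import BERTopic
from sklearn.datasets import fetch_20newsgroups

docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]

topic_model = BERTopic()
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info().head())   # one row per discovered topic
```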
  • 14
    ktrain

    ktrain is a Python library that makes deep learning AI more accessible

    ...Inspired by ML framework extensions like fastai and ludwig, ktrain is designed to make deep learning and AI more accessible and easier to apply for both newcomers and experienced practitioners. With only a few lines of code, ktrain allows you to easily and quickly build, train, and apply models. ktrain purposely pins to a lower version of transformers to include support for older versions of TensorFlow. If you need a newer version of transformers, it is usually safe for you to upgrade transformers, as long as you do it after installing ktrain. As of v0.30.x, TensorFlow installation is optional and only required if training neural networks.
    Downloads: 1 This Week
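    A hedged sketch of ktrain's text-classification workflow built on Transformers, following the pattern from its README; the model name, sequence length, tiny placeholder dataset, and learning-rate schedule are illustrative assumptions.

```python
# Sketch: fine-tuning a Transformer text classifier with ktrain
# (assumes `pip install ktrain` and TensorFlow); x_train/y_train are placeholders
# standing in for your own texts and labels.
import ktrain
from ktrain import text

x_train = ["great movie", "terrible plot"]
y_train = ["pos", "neg"]

t = text.Transformer("distilbert-base-uncased", maxlen=128, class_names=["neg", "pos"])
trn = t.preprocess_train(x_train, y_train)
model = t.get_classifier()
learner = ktrain.get_learner(model, train_data=trn, batch_size=2)
learner.fit_onecycle(5e-5, 1)   # one epoch with a 1cycle learning-rate policy
```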
  • 15
    DINOv3

    Reference PyTorch implementation and models for DINOv3

    ...DINOv3 removes the need for complex augmentations or momentum encoders, streamlining the pipeline while maintaining or improving feature quality. The model supports multiple backbone architectures, including Vision Transformers (ViT), and can handle larger image resolutions with improved stability during training. The learned embeddings generalize robustly across tasks like classification, retrieval, and segmentation without fine-tuning, showing state-of-the-art transfer performance among self-supervised models.
    Downloads: 8 This Week
  • 16
    DFlash

    Block Diffusion for Ultra-Fast Speculative Decoding

    ...The project includes support for multiple draft models, example integration code, and scripts to benchmark performance, and it is structured to work with popular model serving stacks like SGLang and the Hugging Face Transformers ecosystem.
    Downloads: 6 This Week
  • 17
    Spark NLP

    State of the Art Natural Language Processing

    ...The most widely used NLP library in the enterprise. Spark ML provides a set of machine learning applications that can be built using two main components, estimators and transformers. An estimator has a fit method that trains on a piece of data to produce such an application. The transformer is generally the result of a fitting process and applies changes to the target dataset. These components have been embedded to be applicable to Spark NLP. Pipelines are a mechanism for combining multiple estimators and transformers in a single workflow. ...
    Downloads: 2 This Week
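    To make the estimator/transformer pipeline idea above concrete, here is a hedged sketch using one of Spark NLP's pretrained pipelines; the pipeline name is an example from the documentation, downloading it needs network access, and a working Java installation is assumed.

```python
# Sketch: running a pretrained Spark NLP pipeline (assumes `pip install spark-nlp pyspark`
# plus Java); the pipeline name is an illustrative example.
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()                              # starts a Spark session with Spark NLP
pipeline = PretrainedPipeline("explain_document_dl", lang="en")
result = pipeline.annotate("Spark NLP combines estimators and transformers in one workflow.")
print(result["entities"])                             # other keys include tokens, lemmas, POS tags
```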
  • 18
    The SpeechBrain Toolkit

    A PyTorch-based Speech Toolkit

    ...Competitive or state-of-the-art performance is obtained in various domains. SpeechBrain supports state-of-the-art methods for end-to-end speech recognition, including models based on CTC, CTC+attention, transducers, transformers, and neural language models relying on recurrent neural networks and transformers. Speaker recognition is already deployed in a wide variety of realistic applications. SpeechBrain provides different models for speaker recognition, including X-vector, ECAPA-TDNN, PLDA, and contrastive learning. Spectral masking, spectral mapping, and time-domain enhancement are different methods already available within SpeechBrain. ...
    Downloads: 2 This Week
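    As a hedged sketch of SpeechBrain's inference interface for speech recognition, the snippet below loads a pretrained encoder-decoder ASR model; the Hub source name, save directory, and audio file path are examples, and newer releases expose the same classes under speechbrain.inference instead of speechbrain.pretrained.

```python
# Sketch: transcribing a file with a pretrained SpeechBrain ASR model
# (assumes `pip install speechbrain`); source name and file path are examples.
from speechbrain.pretrained import EncoderDecoderASR   # speechbrain.inference in newer versions

asr = EncoderDecoderASR.from_hparams(
    source="speechbrain/asr-crdnn-rnnlm-librispeech",
    savedir="pretrained_models/asr-crdnn-rnnlm-librispeech",
)
print(asr.transcribe_file("example.wav"))
```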
  • 19
    nanoGPT

    The simplest, fastest repository for training/finetuning models

    ...While simple, it can still train non-trivial models on modern GPUs and generate coherent text. The project has become widely used in tutorials, courses, and experiments for people learning how transformers work under the hood.
    Downloads: 0 This Week
  • 20
    Ling

    Ling is a MoE LLM provided and open-sourced by InclusionAI

    ...As more developers and researchers engage with the platform, we can expect rapid advancements and improvements, leading to even more sophisticated applications. The project also provides model inference and API code (e.g. integration with Transformers). This collaborative approach accelerates development and ensures that the models remain at the forefront of technology, addressing emerging challenges in various fields.
    Downloads: 0 This Week
  • 21
    InstantCharacter

    Personalize Any Characters with a Scalable Diffusion Transformer

    ...It works by adapting a base image generation model with a lightweight adapter so that you can produce character-preserving generations in various downstream tasks (e.g. changing pose, clothing, scene) without needing full model fine-tuning. It integrates with the Hugging Face Transformers and Diffusers ecosystems.
    Downloads: 0 This Week
  • 22
    Babel

    The compiler for writing next generation JavaScript

    Babel is a toolchain that helps you write code in the latest version of JavaScript. It converts ECMAScript 2015+ code into a backwards compatible version of JavaScript that can be run by older JavaScript engines. With Babel you can transform syntax, polyfill features that are missing in your target environment, transform source code and more!
    Downloads: 12 This Week
  • 23
    PydanticAI

    Agent Framework / shim to use Pydantic with LLMs

    ...PydanticAI is a Python Agent Framework designed to make it less painful to build production-grade applications with Generative AI. Built by the team behind Pydantic (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor, and many more).
    Downloads: 0 This Week
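    A minimal hedged sketch of PydanticAI's agent interface; the model identifier assumes an OpenAI API key in the environment, the prompts are illustrative, and the attribute holding the reply (output here, data in older releases) depends on the installed version.

```python
# Sketch: a minimal PydanticAI agent (assumes `pip install pydantic-ai` and an
# OPENAI_API_KEY in the environment); model id and prompts are illustrative.
from pydantic_ai import Agent

agent = Agent("openai:gpt-4o", system_prompt="Answer in one short sentence.")
result = agent.run_sync("What does an adapter layer add to a transformer?")
print(result.output)   # older releases expose the reply as `result.data`
```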
  • 24
    LLaMA-Factory

    Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)

    LLaMA-Factory is a fine-tuning and training framework for Meta's LLaMA language models. It enables researchers and developers to train and customize LLaMA models efficiently using advanced optimization techniques.
    Downloads: 5 This Week
  • 25
    OuteTTS

    Interface for OuteTTS models

    ...It provides a high-level Interface API that wraps model configuration, speaker handling, and audio generation so you can focus on integrating speech into your application rather than wiring up low-level engines. The project supports multiple backends including llama.cpp (Python bindings and server), Hugging Face Transformers, ExLlamaV2, vLLM, and a JavaScript interface via Transformers.js, allowing it to run on CPUs, NVIDIA CUDA GPUs, AMD ROCm, Vulkan-capable GPUs, and Apple Metal. It also includes a notion of speaker profiles: you can create a speaker from a short audio sample, save it as JSON, and reuse it for consistent voice identity across generations and sessions. ...
    Downloads: 1 This Week