Showing 83 open source projects for "transformers"

  • 1
    Transformers

    State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX

    Transformers provides APIs and tools to easily download and train state-of-the-art pre-trained models. Using pre-trained models can reduce your compute costs and carbon footprint, and save you the time and resources required to train a model from scratch. These models support common tasks in different modalities. Text, for tasks like text classification, information extraction, question answering, summarization, translation, and text generation, in over 100 languages. Images, for tasks like image...
    Downloads: 0 This Week
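    A minimal sketch of the library's high-level pipeline API (the default sentiment-analysis checkpoint is chosen and downloaded by the library on first use):

    ```python
    # Minimal sketch of the Transformers pipeline API.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")  # default model downloaded on first use
    print(classifier("Pre-trained models save compute and training time."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
    ```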
  • 2
    x-transformers

    A simple but complete full-attention transformer

    A simple but complete full-attention transformer with a set of promising experimental features from various papers. Proposes adding learned memory key/values prior to attending. They were able to remove feedforwards altogether and attain similar performance to the original transformers. I have found that keeping the feedforwards and adding the memory key/values leads to even better performance. Proposes adding learned tokens, akin to CLS tokens, named memory tokens, that are passed through...
    Downloads: 1 This Week
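    A hedged sketch following the README's wrapper/decoder pattern; the num_memory_tokens and attn_num_mem_kv flags correspond to the memory tokens and memory key/values mentioned above (hyperparameter values are illustrative):

    ```python
    import torch
    from x_transformers import TransformerWrapper, Decoder

    model = TransformerWrapper(
        num_tokens=20000,
        max_seq_len=1024,
        num_memory_tokens=20,        # learned memory tokens, akin to CLS tokens
        attn_layers=Decoder(
            dim=512, depth=6, heads=8,
            attn_num_mem_kv=16,      # learned memory key/values prepended in attention
        ),
    )
    logits = model(torch.randint(0, 20000, (1, 1024)))  # shape: (1, 1024, 20000)
    ```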
  • 3
    Curated Transformers

    PyTorch library of curated Transformer models and their components

    State-of-the-art transformers, brick by brick. Curated Transformers is a transformer library for PyTorch. It provides state-of-the-art models that are composed of a set of reusable components. Supports state-of-the-art transformer models, including LLMs such as Falcon, Llama, and Dolly v2. Implementing a feature or bugfix benefits all models. For example, all models support 4/8-bit inference through the bitsandbytes library and each model can use the PyTorch meta device to avoid unnecessary...
    Downloads: 0 This Week
  • 4
    spacy-transformers

    Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

    ... trained transformer model if you install spacy-transformers. You can also do your own language model pretraining via the spacy pretrain command. You can even share your transformer or another contextual embedding model across multiple components, which can make long pipelines several times more efficient. To use transfer learning, you’ll need at least a few annotated examples for what you’re trying to predict.
    Downloads: 0 This Week
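    A small sketch assuming the transformer-backed en_core_web_trf pipeline has been installed (python -m spacy download en_core_web_trf):

    ```python
    import spacy

    nlp = spacy.load("en_core_web_trf")  # spaCy pipeline backed by a transformer
    doc = nlp("spaCy can run BERT-style models such as RoBERTa under the hood.")
    print([(ent.text, ent.label_) for ent in doc.ents])
    ```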
  • 5
    TensorFlow Transformers

    State-of-the-art faster Transformer with TensorFlow 2.0

    Imagine auto-regressive generation being 90x faster. tf-transformers (TensorFlow Transformers) is designed to harness the full power of TensorFlow 2 and is built specifically for Transformer-based architectures. These models can be applied to text, for tasks like text classification, information extraction, question answering, summarization, translation, and text generation, in over 100 languages. Images, for tasks like image classification, object detection, and segmentation. Audio, for tasks like...
    Downloads: 0 This Week
  • 6
    Babel

    The compiler for writing next generation JavaScript

    Babel is a toolchain that helps you write code in the latest version of JavaScript. It converts ECMAScript 2015+ code into a backwards compatible version of JavaScript that can be run by older JavaScript engines. With Babel you can transform syntax, polyfill features that are missing in your target environment, transform source code and more!
    Downloads: 9 This Week
  • 7
    GeneralAI

    Large-scale Self-supervised Pre-training Across Tasks, Languages, etc.

    Fundamental research to develop new architectures for foundation models and AI, focusing on modeling generality and capability, as well as training stability and efficiency.
    Downloads: 1 This Week
  • 8
    GPT-NeoX

    Implementation of model parallel autoregressive transformers on GPUs

    ... recommend Mesh Transformer JAX. If you are not looking to train models with billions of parameters from scratch, this is likely the wrong library to use. For generic inference needs, we recommend you use the Hugging Face transformers library instead which supports GPT-NeoX models.
    Downloads: 2 This Week
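    Since the entry itself points to the Hugging Face transformers library for inference, a minimal sketch loading the released 20B checkpoint (note it requires tens of GB of memory; a smaller model id can be substituted):

    ```python
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "EleutherAI/gpt-neox-20b"   # released GPT-NeoX checkpoint on the Hub
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)

    inputs = tokenizer("GPT-NeoX is", return_tensors="pt")
    print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
    ```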
  • 9
    The SpeechBrain Toolkit

    A PyTorch-based Speech Toolkit

    SpeechBrain is an open-source and all-in-one conversational AI toolkit. It is designed to be simple, extremely flexible, and user-friendly. Competitive or state-of-the-art performance is obtained in various domains. SpeechBrain supports state-of-the-art methods for end-to-end speech recognition, including models based on CTC, CTC+attention, transducers, transformers, and neural language models relying on recurrent neural networks and transformers. Speaker recognition is already deployed...
    Downloads: 2 This Week
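    A hedged sketch of SpeechBrain's pretrained-model interface; the checkpoint id is one of the speechbrain org's published LibriSpeech ASR models, and example.wav is a placeholder path:

    ```python
    from speechbrain.pretrained import EncoderDecoderASR

    asr = EncoderDecoderASR.from_hparams(
        source="speechbrain/asr-crdnn-rnnlm-librispeech",  # pretrained ASR checkpoint
        savedir="pretrained_asr",                          # local cache directory
    )
    print(asr.transcribe_file("example.wav"))              # placeholder audio file
    ```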
  • 10
    bert4torch

    An elegant PyTorch implementation of transformers

    An elegant PyTorch implementation of transformers.
    Downloads: 0 This Week
  • 11
    PrimeQA

    State-of-the-art Multilingual Question Answering research

    PrimeQA is a public open source repository that enables researchers and developers to train state-of-the-art models for question answering (QA). By using PrimeQA, a researcher can replicate the experiments outlined in a paper published at a recent NLP conference while also enjoying the capability to download pre-trained models (from an online repository) and run them on their own custom data. PrimeQA is built on top of the Transformers toolkit and uses datasets and models that are directly...
    Downloads: 2 This Week
  • 12
    pmdarima

    Statistical library designed to fill the void in Python's time series

    A statistical library designed to fill the void in Python's time series analysis capabilities, including the equivalent of R's auto.arima function.
    Downloads: 1 This Week
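    A minimal sketch of the auto.arima equivalent mentioned above, using a dataset bundled with the library:

    ```python
    import pmdarima as pm

    y = pm.datasets.load_wineind()      # monthly wine sales, a bundled example series
    model = pm.auto_arima(y, seasonal=True, m=12, suppress_warnings=True)
    print(model.summary())
    print(model.predict(n_periods=12))  # forecast the next 12 months
    ```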
  • 13
    SageMaker Hugging Face Inference Toolkit

    Library for serving Transformers models on Amazon SageMaker

    SageMaker Hugging Face Inference Toolkit is an open-source library for serving Transformers models on Amazon SageMaker. This library provides default pre-processing, prediction, and post-processing for certain Transformers models and tasks. It utilizes the SageMaker Inference Toolkit for starting up the model server, which is responsible for handling inference requests. For the Dockerfiles used for building SageMaker Hugging Face Containers, see AWS Deep Learning Containers. The SageMaker Hugging...
    Downloads: 0 This Week
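    A hedged deployment sketch using the sagemaker Python SDK; the model id, IAM role, framework versions, and instance type are illustrative placeholders:

    ```python
    from sagemaker.huggingface import HuggingFaceModel

    model = HuggingFaceModel(
        env={
            "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # placeholder
            "HF_TASK": "text-classification",
        },
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
        transformers_version="4.26",  # assumed-supported container versions
        pytorch_version="1.13",
        py_version="py39",
    )
    predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
    print(predictor.predict({"inputs": "The toolkit handles pre- and post-processing."}))
    ```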
  • 14
    Flower

    Flower: A Friendly Federated Learning Framework

    ... Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, Hugging Face Transformers, PyTorch Lightning, scikit-learn, JAX, TFLite, MONAI, fastai, MLX, XGBoost, Pandas for federated analytics, or even raw NumPy for users who enjoy computing gradients by hand.
    Downloads: 1 This Week
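    A hedged sketch of Flower's framework-agnostic NumPyClient interface; the toy weight vector stands in for whatever model the chosen ML framework trains locally:

    ```python
    import flwr as fl
    import numpy as np

    weights = [np.zeros(10)]  # toy stand-in for real model parameters

    class FlowerClient(fl.client.NumPyClient):
        def get_parameters(self, config):
            return weights
        def fit(self, parameters, config):
            # replace with real local training on this client's data
            return parameters, 1, {}
        def evaluate(self, parameters, config):
            # replace with real evaluation: (loss, num_examples, metrics)
            return 0.0, 1, {"accuracy": 0.0}

    fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=FlowerClient())
    ```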
  • 15
    BERTopic

    Leveraging BERT and c-TF-IDF to create easily interpretable topics

    BERTopic is a topic modeling technique that leverages transformers and c-TF-IDF to create dense clusters allowing for easily interpretable topics whilst keeping important words in the topic descriptions. BERTopic supports guided, supervised, semi-supervised, manual, long-document, hierarchical, class-based, dynamic, and online topic modeling. It even supports visualizations similar to LDAvis! For a more detailed overview, you can read...
    Downloads: 0 This Week
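    A minimal sketch of the fit/transform API on a standard corpus:

    ```python
    from bertopic import BERTopic
    from sklearn.datasets import fetch_20newsgroups

    docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes")).data
    topic_model = BERTopic()
    topics, probs = topic_model.fit_transform(docs)
    print(topic_model.get_topic_info().head())  # one row per discovered topic
    ```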
  • 16
    ktrain

    ktrain is a Python library that makes deep learning and AI more accessible

    ... of code, ktrain allows you to easily and quickly build, train, and apply models. ktrain purposely pins to a lower version of transformers to include support for older versions of TensorFlow. If you need a newer version of transformers, it is usually safe for you to upgrade transformers, as long as you do it after installing ktrain. As of v0.30.x, TensorFlow installation is optional and only required if training neural networks.
    Downloads: 0 This Week
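    A hedged sketch of the typical ktrain text-classification workflow; the two-example dataset is purely illustrative:

    ```python
    import ktrain
    from ktrain import text

    train_texts = ["great movie", "terrible plot"]  # toy data for illustration
    train_labels = [1, 0]

    (x_train, y_train), (x_val, y_val), preproc = text.texts_from_array(
        x_train=train_texts, y_train=train_labels, val_pct=0.5,
        class_names=["neg", "pos"], preprocess_mode="bert")
    model = text.text_classifier("bert", train_data=(x_train, y_train), preproc=preproc)
    learner = ktrain.get_learner(model, train_data=(x_train, y_train), batch_size=2)
    learner.fit_onecycle(2e-5, 1)  # one epoch with the 1cycle learning-rate policy
    ```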
  • 17
    Nixtla Neural Forecast

    Scalable and user-friendly neural forecasting algorithms

    NeuralForecast offers a large collection of neural forecasting models focusing on their performance, usability, and robustness. The models range from classic networks like RNNs to the latest transformers: MLP, LSTM, GRU, RNN, TCN, TimesNet, BiTCN, DeepAR, NBEATS, NBEATSx, NHITS, TiDE, DeepNPTS, TSMixer, TSMixerx, MLPMultivariate, DLinear, NLinear, TFT, Informer, AutoFormer, FedFormer, PatchTST, iTransformer, StemGNN, and TimeLLM. There is a shared belief in neural forecasting methods' capacity...
    Downloads: 1 This Week
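    A hedged sketch of the NeuralForecast API with one of the models listed above; AirPassengersDF is a small example frame shipped with the library:

    ```python
    from neuralforecast import NeuralForecast
    from neuralforecast.models import NHITS
    from neuralforecast.utils import AirPassengersDF

    nf = NeuralForecast(models=[NHITS(input_size=24, h=12, max_steps=100)], freq="M")
    nf.fit(df=AirPassengersDF)  # long-format frame with columns: unique_id, ds, y
    print(nf.predict().head())  # 12-step-ahead forecasts
    ```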
  • 18
    Hyperformer

    Hypergraph Transformer for Skeleton-based Action Recognition

    ... is identified, i.e., the topology is fixed after training. To relax such a restriction, the Self-Attention (SA) mechanism has been adopted to make the topology of GCNs adaptive to the input, resulting in state-of-the-art hybrid models. Concurrently, attempts with plain Transformers have also been made, but they still lag behind state-of-the-art GCN-based methods due to the lack of a structural prior.
    Downloads: 1 This Week
  • 19
    Tokenizers

    Fast State-of-the-Art Tokenizers optimized for Research and Production

    Fast State-of-the-art tokenizers, optimized for both research and production. Tokenizers provides an implementation of today’s most used tokenizers, with a focus on performance and versatility. These tokenizers are also used in Transformers. Train new vocabularies and tokenize, using today’s most used tokenizers. Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server’s CPU. Easy to use, but also...
    Downloads: 1 This Week
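    A minimal sketch loading a pretrained tokenizer from the Hugging Face Hub and encoding a sentence:

    ```python
    from tokenizers import Tokenizer

    tokenizer = Tokenizer.from_pretrained("bert-base-uncased")
    encoding = tokenizer.encode("Fast tokenizers, written in Rust.")
    print(encoding.tokens)  # WordPiece tokens
    print(encoding.ids)     # vocabulary ids
    ```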
  • 20
    Mergo

    Mergo: merging Go structs and maps since 2013

    ... that in 0.3.2, Mergo changed the Merge() and Map() signatures to support transformers. I added an optional/variadic argument so that it won't break existing code. You can only merge same-type structs with exported fields initialized as the zero value of their type, and same-type maps. Mergo won't merge unexported (private) fields but will recursively merge any exported ones. It won't merge empty struct values, as they are zero values too.
    Downloads: 1 This Week
  • 21
    Primus

    An abstraction layer for real-time to prevent module lock-in

    Primus, the creator god of the Transformers, but now also known as a universal wrapper for real-time frameworks. There are a lot of real-time frameworks available for Node.js, and they all have different opinions on how real-time should be done. Primus provides a common low-level interface to communicate in real-time using various real-time frameworks. Effortless switching between real-time frameworks by changing one single line of code. No more API rewrites needed when your project requirements change...
    Downloads: 1 This Week
  • 22
    LLaMA Efficient Tuning

    Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, ChatGLM2)

    Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, ChatGLM2)
    Downloads: 0 This Week
  • 23
    Spark NLP

    State of the Art Natural Language Processing

    ... components, estimators and transformers. An estimator provides a fit method that trains on a piece of data to produce such an application. A transformer is generally the result of a fitting process and applies changes to the target dataset. These components have been embedded to be applicable to Spark NLP. Pipelines are a mechanism for combining multiple estimators and transformers in a single workflow. They allow multiple chained transformations along a machine-learning task.
    Downloads: 0 This Week
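    A hedged sketch running one of Spark NLP's pretrained pipelines (the pipeline is downloaded on first run):

    ```python
    import sparknlp
    from sparknlp.pretrained import PretrainedPipeline

    spark = sparknlp.start()  # starts a Spark session with Spark NLP loaded
    pipeline = PretrainedPipeline("explain_document_dl", lang="en")
    result = pipeline.annotate("Spark NLP chains estimators and transformers in one pipeline.")
    print(result["entities"])
    ```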
  • 24
    Backtrack Sampler

    An easy-to-understand framework for LLM samplers

    Backtrack Sampler is a framework designed for experimenting with custom sampling strategies for large language models (LLMs), enabling the ability to rewind and revise generated tokens. It allows developers to create and test their own token-generation strategies by providing a base structure for manipulating logits and probabilities, making it a flexible tool for those interested in fine-tuning the behavior of LLMs.
    Downloads: 0 This Week
  • 25
    Vitest

    Next generation testing framework powered by Vite

    Next-generation testing framework powered by Vite. Reuse Vite's config and plugins - consistent across your app and tests. But Vite is not required to use Vitest. Expect, snapshot, coverage, and more - migrating from Jest is straightforward. Out-of-the-box ESM, TypeScript and JSX support powered by esbuild.
    Downloads: 0 This Week