State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX
A simple but complete full-attention transformer
PyTorch library of curated Transformer models and their components
Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
State-of-the-art faster Transformer with TensorFlow 2.0
The compiler for writing next generation JavaScript
Large-scale Self-supervised Pre-training Across Tasks, Languages, etc.
Implementation of model parallel autoregressive transformers on GPUs
A PyTorch-based Speech Toolkit
An elegant PyTorch implementation of transformers
State-of-the-art Multilingual Question Answering research
Statistical library designed to fill the void in Python's time series analysis
Library for serving Transformers models on Amazon SageMaker
Flower: A Friendly Federated Learning Framework
Leveraging BERT and c-TF-IDF to create easily interpretable topics
ktrain is a Python library that makes deep learning and AI more accessible
Scalable and user-friendly neural forecasting algorithms
Hypergraph Transformer for Skeleton-based Action Recognition
Fast State-of-the-Art Tokenizers optimized for Research and Production
Mergo: merging Go structs and maps since 2013
An abstraction layer for real-time to prevent module lock-in
Easy-to-use LLM fine-tuning framework (LLaMA-2, BLOOM, Falcon)
State of the Art Natural Language Processing
An easy-to-understand framework for LLM samplers
Next generation testing framework powered by Vite