Run local LLMs on any device; open source
Single-cell analysis in Python
DoWhy is a Python library for causal inference (minimal usage sketch after this list)
Standardized Serverless ML Inference Platform on Kubernetes
Python package for ML-based heterogeneous treatment effect estimation (usage sketch after this list)
OpenAI-style API for open large language models
Large Language Model Text Generation Inference
Tensor search for humans
Powering Amazon's custom machine learning chips
Scripts for fine-tuning Meta Llama 3 with composable FSDP & PEFT methods
Open platform for training, serving, and evaluating language models
Uplift modeling and causal inference with machine learning algorithms
Superduper: Integrate AI models and machine learning workflows
The unofficial Python package that returns responses from Google Bard
Trainable models and NN optimization tools
Create HTML profiling reports from pandas DataFrame objects (usage sketch after this list)
Multi-modal neural networks for semantic search, based on mid-fusion
High quality, fast, modular reference implementation of SSD in PyTorch
Implementation of model parallel autoregressive transformers on GPUs
Sequence-to-sequence framework focused on neural machine translation
Training and implementation of chatbots leveraging a GPT-like architecture
CPU/GPU inference server for Hugging Face transformer models
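
For the DoWhy entry above, a minimal sketch of the typical identify-then-estimate workflow; the toy data and column names (w, t, y) are hypothetical:

```python
import numpy as np
import pandas as pd
from dowhy import CausalModel

# Toy dataset: confounder w drives both treatment t and outcome y (hypothetical columns).
rng = np.random.default_rng(0)
w = rng.normal(size=1_000)
t = (w + rng.normal(size=1_000) > 0).astype(int)
y = 2.0 * t + w + rng.normal(size=1_000)
df = pd.DataFrame({"w": w, "t": t, "y": y})

# State the causal question, identify the estimand, then estimate it.
model = CausalModel(data=df, treatment="t", outcome="y", common_causes=["w"])
estimand = model.identify_effect(proceed_when_unidentifiable=True)
estimate = model.estimate_effect(estimand, method_name="backdoor.linear_regression")
print(estimate.value)  # should be close to the true effect of 2
```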
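
The heterogeneous-treatment-effects entry reads like EconML's description; assuming that is the package meant, a minimal sketch with its LinearDML estimator on simulated data (all variable names here are illustrative):

```python
import numpy as np
from econml.dml import LinearDML

# Simulated data: the treatment effect varies with the first feature of X.
rng = np.random.default_rng(0)
n = 2_000
X = rng.normal(size=(n, 3))           # effect-modifying features
W = rng.normal(size=(n, 2))           # additional controls
T = rng.binomial(1, 0.5, size=n)      # binary treatment
Y = (1.0 + X[:, 0]) * T + W[:, 0] + rng.normal(size=n)

est = LinearDML(discrete_treatment=True, random_state=0)
est.fit(Y, T, X=X, W=W)
print(est.effect(X[:5]))  # per-row estimates of the conditional treatment effect
```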
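
The HTML-profiling entry matches the pandas-profiling project (now ydata-profiling); assuming that is the package meant, report generation is a two-step call (the DataFrame contents and output file name are arbitrary):

```python
import pandas as pd
from ydata_profiling import ProfileReport  # formerly pandas_profiling

df = pd.DataFrame({"a": [1, 2, 3, 4], "b": ["x", "y", "x", "z"]})
profile = ProfileReport(df, title="Example profiling report")
profile.to_file("report.html")  # writes a standalone HTML report
```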