Training and deploying machine learning models on Amazon SageMaker
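A minimal sketch of what this looks like with the SageMaker Python SDK; the container image URI, IAM role ARN, and S3 paths below are placeholders, not values from this document.

```python
# Launch a training job with the SageMaker Python SDK (placeholder URIs/ARNs).
from sagemaker.estimator import Estimator

estimator = Estimator(
    image_uri="<training-image-uri>",                      # placeholder training container
    role="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder execution role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/output",                   # placeholder output location
)

# Start training; the channel name and S3 prefix are placeholders.
estimator.fit({"train": "s3://my-bucket/train"})

# Deploy the trained model behind a real-time endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```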
Run Local LLMs on Any Device. Open-source
Single-cell analysis in Python
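Assuming this entry refers to Scanpy, a minimal clustering sketch using its bundled PBMC 3k tutorial dataset might look like:

```python
# Basic single-cell workflow with Scanpy: preprocess, cluster, embed.
import scanpy as sc

adata = sc.datasets.pbmc3k()                    # AnnData object: cells x genes

# Normalize library sizes, log-transform, select highly variable genes.
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)

# Dimensionality reduction, neighbourhood graph, clustering, UMAP embedding.
sc.pp.pca(adata, n_comps=50)
sc.pp.neighbors(adata)
sc.tl.leiden(adata)
sc.tl.umap(adata)
sc.pl.umap(adata, color="leiden")
```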
Port of Facebook's LLaMA model in C/C++
AI interface for tinkerers (Ollama, Haystack RAG, Python)
A high-throughput and memory-efficient inference and serving engine
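This description matches vLLM; assuming that is the engine meant, a minimal offline-inference sketch (the model name is only an example) could be:

```python
# Offline batch generation with vLLM; any Hugging Face causal LM id works here.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")                      # small model for illustration
params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["The capital of France is"], params)
for out in outputs:
    print(out.outputs[0].text)
```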
A Pythonic framework to simplify AI service building
Python Package for ML-Based Heterogeneous Treatment Effects Estimation
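This description matches EconML; assuming that is the package meant, a minimal double machine learning sketch on synthetic data might be:

```python
# Estimate heterogeneous treatment effects with EconML's LinearDML on synthetic data.
import numpy as np
from econml.dml import LinearDML

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                       # effect modifiers
W = rng.normal(size=(n, 2))                       # confounders
T = rng.binomial(1, 0.5, size=n)                  # binary treatment
Y = (1.0 + X[:, 0]) * T + W[:, 0] + rng.normal(scale=0.1, size=n)

est = LinearDML(discrete_treatment=True, random_state=0)
est.fit(Y, T, X=X, W=W)

# Conditional average treatment effects for the first few units.
print(est.effect(X[:5]))
```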
The official Python client for the Hugging Face Hub
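A minimal sketch with the huggingface_hub package; the repository id and filename are just examples:

```python
# Download individual files or whole repos from the Hugging Face Hub (cached locally).
from huggingface_hub import hf_hub_download, snapshot_download

config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(config_path)

# Or fetch an entire repository snapshot.
local_dir = snapshot_download(repo_id="bert-base-uncased")
print(local_dir)
```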
DoWhy is a Python library for causal inference
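A minimal DoWhy sketch on synthetic data, following its identify-then-estimate workflow:

```python
# Identify and estimate a causal effect with DoWhy on synthetic confounded data.
import numpy as np
import pandas as pd
from dowhy import CausalModel

rng = np.random.default_rng(0)
n = 2000
w = rng.normal(size=n)                                    # confounder
t = (w + rng.normal(size=n) > 0).astype(int)              # treatment
y = 2.0 * t + w + rng.normal(scale=0.1, size=n)           # outcome (true effect = 2.0)
df = pd.DataFrame({"w": w, "t": t, "y": y})

model = CausalModel(data=df, treatment="t", outcome="y", common_causes=["w"])
estimand = model.identify_effect()
estimate = model.estimate_effect(estimand, method_name="backdoor.linear_regression")
print(estimate.value)                                     # should be close to 2.0
```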
Uplift modeling and causal inference with machine learning algorithms
Everything you need to build state-of-the-art foundation models
A high-performance ML model serving framework that offers dynamic batching
Operating LLMs in production
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4(V)
Gaussian processes in TensorFlow
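This description matches GPflow; assuming that is the library meant, a minimal GP regression sketch could be:

```python
# Exact GP regression with GPflow on a 1-D toy problem (GPflow expects float64 data).
import numpy as np
import gpflow

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(50, 1))
Y = np.sin(10 * X) + 0.1 * rng.normal(size=(50, 1))

model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

X_new = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
mean, var = model.predict_f(X_new)                        # posterior mean and variance
```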
Efficient few-shot learning with Sentence Transformers
The unofficial Python package that returns responses from Google Bard
Integrate, train and manage any AI models and APIs with your database
Large Language Model Text Generation Inference
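A minimal client sketch for a Text Generation Inference server, assuming one is already running and reachable at http://localhost:8080 (the host and port are assumptions, not values from this document):

```python
# Query a running TGI server's /generate endpoint.
import requests

resp = requests.post(
    "http://localhost:8080/generate",
    json={
        "inputs": "What is deep learning?",
        "parameters": {"max_new_tokens": 64, "temperature": 0.7},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["generated_text"])
```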
An easy-to-use LLM quantization package with user-friendly APIs
PyTorch domain library for recommendation systems
Official inference library for Mistral models
Neural Network Compression Framework for enhanced OpenVINO inference
Superduper: Integrate AI models and machine learning workflows