A no-code framework built on PyTorch
DoWhy is a Python library for causal inference
JAX-based neural network library
Model Context Protocol server that integrates AgentQL's data extraction capabilities
Recognition and resolution of numbers, units, date/time, etc.
A Python package for extending the official PyTorch library
Gen-AI Chat for Teams
A Model Context Protocol server that provides network asset info
Query MCP enables end-to-end management of Supabase via a chat interface
Connect any Open Data to any LLM with Model Context Protocol
K8s-mcp-server is a Model Context Protocol (MCP) server for Kubernetes
FlashInfer: Kernel Library for LLM Serving
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
World of apps for benchmarking interactive coding agents
Neural Network Compression Framework for enhanced OpenVINO inference
A Unified Library for Parameter-Efficient Learning
ChatArena (or Chat Arena) is a collection of multi-agent language game environments
A very simple framework for state-of-the-art NLP
Investment Research for Everyone, Everywhere
PyTorch library of curated Transformer models and their components
Run any Llama 2 locally with a Gradio UI on GPU or CPU from anywhere
The unofficial Python package that returns responses from Google Bard
Interpretable prompting and models for NLP
Neural Search
Data loaders and abstractions for text and NLP