Jittor is a high-performance deep learning framework
.NET Standard bindings for Google's TensorFlow for developing, training, and deploying machine learning models
Open-source federated learning framework for vehicular network simulation
Open-source framework for conversational voice AI agents
The Triton Inference Server provides an optimized cloud and edge inferencing solution
Core ML tools contain supporting tools for Core ML model conversion
A game-theoretic approach to explain the output of machine learning models
MiniSom is a minimalistic implementation of Self-Organizing Maps
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4(V)
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
ONNX-TensorRT: TensorRT backend for ONNX
Ongoing research training transformer models at scale
Open deep learning compiler stack for CPU, GPU, etc.
High-performance library for gradient boosting on decision trees
mlpack: a scalable C++ machine learning library
Enabling PyTorch on Google TPU
C++ DataFrame for statistical, financial, and ML analysis
Stable Diffusion with Core ML on Apple Silicon
Stanford NLP Python library for many human languages
Python bindings for the Transformer models implemented in C/C++
Convert TensorFlow, Keras, TensorFlow.js, and TFLite models to ONNX
ChatGPT extension for scientific research work
Data loaders and abstractions for text and NLP
C++-based high-performance parallel environment execution engine
Recognition and resolution of numbers, units, date/time, etc.