Run local LLMs on any device; open-source
Gaussian processes in TensorFlow
Phi-3.5 for Mac: Locally-run Vision and Language Models
Trainable models and NN optimization tools
Powering Amazon's custom machine learning chips
Simplifies the local serving of AI models from any source
Unified Model Serving Framework
Images to inference with no labeling
Database system for building simpler and faster AI-powered applications
Run any Llama 2 model locally with a Gradio UI, on GPU or CPU, from anywhere
Sequence-to-sequence framework, focused on Neural Machine Translation
Deploy an ML inference service on a budget in 10 lines of code