Rust LLM Inference Tools

Browse free open source Rust LLM Inference Tools and projects below. Use the toggles on the left to filter open source Rust LLM Inference Tools by OS, license, language, programming language, and project status.

  • 1
    Arch Gateway

    The AI-native (edge and LLM) proxy for agents

    Arch is an AI-native proxy designed to simplify the development of agentic applications. It handles tasks such as clarifying user input, routing requests to agents, and integrating prompts with tools for common operations, and it provides unified access to and observability of Large Language Models (LLMs) so developers can build applications more efficiently. Arch supports both edge and LLM-side deployments, offering flexibility across environments; a minimal client sketch follows this entry.
    Downloads: 1 This Week
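    Because Arch sits in front of your models as a proxy, a typical client simply points its requests at the gateway instead of the upstream provider. The sketch below assumes a locally running Arch instance exposing an OpenAI-style chat completions route; the port, path, and model name are placeholders, not values taken from the Arch documentation.

        import requests

        # Placeholder endpoint and model name: use the listener address and
        # model configured for your own Arch deployment.
        ARCH_URL = "http://localhost:12000/v1/chat/completions"

        payload = {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": "Summarize today's open incidents."}],
        }

        resp = requests.post(ARCH_URL, json=payload, timeout=30)
        resp.raise_for_status()
        print(resp.json()["choices"][0]["message"]["content"])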
  • 2
    Artificial Intelligence Controller

    AICI: Prompts as (Wasm) Programs

    AICI is a framework for building controllers that constrain and direct the output of Large Language Models (LLMs). By treating prompts as WebAssembly (Wasm) programs, AICI enables more precise and controlled interactions with LLMs, making AI-driven systems more reliable and predictable across a range of applications; a conceptual sketch of the controller idea follows this entry.
    Downloads: 0 This Week
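    AICI controllers are Wasm programs that run alongside token generation and can veto or force tokens. The Python sketch below only illustrates that idea in the simplest possible form; it is not the AICI API, and the class and method names are hypothetical.

        import math
        import re

        # Illustration only: real AICI controllers are Wasm modules (e.g.
        # compiled from Rust) invoked by the runtime during decoding.
        class DigitsOnlyController:
            """Constrain generation so the model can only emit digit tokens."""

            _pattern = re.compile(r"^[0-9]+$")

            def mask_logits(self, vocab: list[str], logits: list[float]) -> list[float]:
                # Tokens that would break the constraint get -inf, so sampling
                # can never pick them; everything else passes through unchanged.
                return [
                    logit if self._pattern.match(token) else -math.inf
                    for token, logit in zip(vocab, logits)
                ]

        # Toy usage: only the digit tokens survive the mask.
        ctrl = DigitsOnlyController()
        print(ctrl.mask_logits(["4", "cat", "2"], [0.1, 0.9, 0.5]))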
  • 3
    pipeless

    A computer vision framework to create and deploy apps in minutes

    Pipeless is an open-source computer vision framework for creating and deploying applications without the complexity of building and maintaining multimedia pipelines, shipping everything needed to get efficient, real-time applications running in minutes. Inspired by modern serverless technologies, it brings the serverless development experience to computer vision: you provide functions that are executed for each new video frame, and Pipeless takes care of everything else (a hypothetical hook is sketched after this entry). You can use industry-standard models such as YOLO or load a custom model into one of the supported inference runtimes; Pipeless ships popular runtimes such as ONNX Runtime, enabling high-performance inference on CPU or GPU out of the box. A single command deploys your application to edge and IoT devices or to the cloud.
    Downloads: 0 This Week
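    Pipeless's "you provide the functions, it runs them on every frame" model can be pictured with a hook like the one below. The file name, hook signature, and frame format are assumptions made for illustration; check the Pipeless documentation for the exact contract of a processing stage.

        # process.py -- hypothetical per-frame hook; the signature and the
        # assumption that the frame arrives as a NumPy BGR image are not taken
        # from the Pipeless docs.
        import cv2

        def hook(frame, context):
            # Draw a simple overlay on each incoming frame; in a real stage this
            # is where you would run a YOLO/ONNX model and act on its detections.
            cv2.putText(frame, "pipeless demo", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
            # Returning the (possibly modified) frame lets the framework forward
            # it to the next stage or output stream.
            return frame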