Showing 125 open source projects for "linux server"

  • 1
    MCP Server OpenDAL

    Model Context Protocol Server for Apache OpenDAL™

    Model Context Protocol Server for Apache OpenDAL™ is an MCP server implementation that provides access to various storage services via Apache OpenDAL. It enables seamless interaction with multiple storage backends through a unified interface. A hedged client sketch follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
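Most of the MCP servers in this list are launched as a local subprocess and spoken to over stdio. The sketch below is a minimal client example using the official `mcp` Python SDK; the launch command, package name, and environment variable are placeholders for illustration, not values taken from the OpenDAL project's documentation.

```python
# Minimal MCP stdio client sketch (assumes the `mcp` Python SDK is installed).
# The command, args, and env below are hypothetical placeholders for an MCP
# server such as the OpenDAL one; check the project's README for the real
# invocation and configuration.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",                      # hypothetical launcher
    args=["mcp-server-opendal"],        # hypothetical package name
    env={"OPENDAL_SCHEME": "s3"},       # hypothetical backend configuration
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```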
  • 2
    MCP Server Qdrant

    An official Qdrant Model Context Protocol (MCP) server implementation

    The Qdrant MCP Server is an official Model Context Protocol server that integrates with the Qdrant vector search engine. It acts as a semantic memory layer, storing and retrieving vector-based data for AI applications that need semantic search.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 3
    MCP Shell Server

    Shell command execution server implementing the Model Context Protocol

    A secure shell command execution server implementing the Model Context Protocol (MCP), allowing remote execution of whitelisted shell commands with support for standard input. A hedged tool-call sketch follows this entry.
    Downloads: 1 This Week
    Last Update:
    See Project
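Once a session to an MCP server is open, invoking a specific capability is a single call_tool request. The sketch below shows that shape against a shell-execution server; the package name, whitelist environment variable, tool name, and argument fields are assumptions for illustration, so the server's own README defines the real schema.

```python
# Sketch: invoking a whitelisted command via an MCP shell server over stdio.
# The launcher, package name, "execute" tool name, and argument names are
# hypothetical; the server's documentation defines the real whitelist
# mechanism and tool schema.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",                          # hypothetical launcher
    args=["mcp-shell-server"],              # hypothetical package name
    env={"ALLOW_COMMANDS": "ls,cat,echo"},  # hypothetical whitelist setting
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "execute",                                   # hypothetical tool name
                arguments={"command": ["echo", "hello"], "stdin": ""},
            )
            # Print any text content blocks returned by the tool.
            for block in result.content:
                text = getattr(block, "text", None)
                if text is not None:
                    print(text)

asyncio.run(main())
```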
  • 4
    VikingDB MCP Server

    An MCP server for VikingDB storage and search

    An MCP server that interfaces with VikingDB, a high-performance vector database developed by ByteDance, enabling efficient vector storage and search.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 5
    Elasticsearch MCP Server

    A Model Context Protocol (MCP) server implementation

    This MCP server lets clients interact with Elasticsearch and OpenSearch, exposing tools for document search, index analysis, and cluster management.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 6
    MCP Snowflake Server

    A Model Context Protocol (MCP) server implementation

    An MCP server implementation that facilitates database interactions with Snowflake, allowing execution of SQL queries and presentation of data insights as resources.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 7
    Supabase MCP Server

    Query MCP enables end-to-end management of Supabase via a chat interface

    An open-source MCP server that enables comprehensive management of Supabase projects through natural language interactions, providing capabilities such as SQL execution, schema management, and API integration.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8
    MCP Teams Server

    An MCP (Model Context Protocol) server implementation

    An MCP server implementation for Microsoft Teams integration, providing capabilities to read messages, create messages, reply to messages, and mention members, facilitating AI-driven interactions within Teams.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 9
    Xiyan MCP Server

    A Model Context Protocol (MCP) server

    The XiYan MCP Server is a Model Context Protocol (MCP) server that enables natural language queries to databases, powered by XiYan-SQL, a state-of-the-art text-to-SQL model. It allows users to interact with databases using conversational language, simplifying data retrieval processes.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 10
    ArXiv MCP Server

    A Model Context Protocol server for searching and analyzing arXiv

    arxiv-mcp-server bridges AI assistants and the arXiv repository through a clean MCP interface, enabling search, metadata retrieval, and content access without bespoke scraping. With simple tools like “search” and “fetch,” an agent can find papers, pull abstracts, and download PDFs for downstream summarization or analysis. The project includes packaging and CI to publish to PyPI, plus tests and linting for reliability. Issue threads show feature requests such as extracting embedded LaTeX and...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 11
    MCP Server Home Assistant

    A Model Context Protocol Server for Home Assistant

    The Home Assistant MCP Server is an MCP server that integrates with Home Assistant, enabling AI assistants to interact with smart home devices and systems. It exposes Home Assistant voice intents through the Model Context Protocol for enhanced home control.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 12
    Browser Use MCP Server

    Browse the web, directly from Cursor etc.

    A browser automation server implementing the Model Context Protocol, designed to allow AI assistants to browse the web directly from applications like Cursor. It supports natural language commands for web navigation and interaction.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 13
    web-eval-agent MCP Server

    An MCP server that autonomously evaluates web applications

    web-eval-agent is a Model Context Protocol (MCP) server that spins up a browser-use–capable debugging agent to autonomously run and evaluate web apps straight from your editor. It’s positioned as a “let the coding agent debug itself” companion: the agent launches the app, navigates flows, captures evidence, and iterates on failures without manual copy-pasting of logs. The repository focuses on developer ergonomics, exposing typed MCP tools so clients like Claude Desktop can start sessions,...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 14
    Logfire MCP

    The Logfire MCP Server is here

    The Logfire MCP Server is a Model Context Protocol server that allows AI applications to access OpenTelemetry traces and metrics sent to Logfire. It enables retrieval and analysis of telemetry data, enhancing debugging and observability workflows.
    Downloads: 2 This Week
    Last Update:
    See Project
  • 15
    IDA Pro MCP

    MCP Server for IDA Pro

    The IDA Pro MCP Server is a Model Context Protocol (MCP) server designed to integrate with IDA Pro, a popular disassembler and debugger. It enables AI assistants to interact with IDA Pro, facilitating tasks such as code analysis and reverse engineering.
    Downloads: 3 This Week
    Last Update:
    See Project
  • 16
    MCP ZoomEye

    A Model Context Protocol server that provides network asset info

    The ZoomEye MCP Server is a Model Context Protocol server that provides network asset information based on query conditions, allowing Large Language Models to obtain data by querying ZoomEye using dorks and other search parameters.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 17
    WhisperLive

    A nearly-live implementation of OpenAI's Whisper

    WhisperLive is a “nearly live” implementation of OpenAI’s Whisper model focused on real-time transcription. It runs as a server–client system in which the server hosts a Whisper backend and clients stream audio to be transcribed with very low delay. The project supports multiple inference backends, including Faster-Whisper, NVIDIA TensorRT, and OpenVINO, allowing you to target GPUs and different CPU architectures efficiently. It can handle microphone input, pre-recorded audio files, and... A hedged client sketch follows this entry.
    Downloads: 6 This Week
    Last Update:
    See Project
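WhisperLive follows the server-client split described above: the server hosts the Whisper backend and clients stream audio to it. The sketch below reflects my reading of the project's Python client; the class name, host, port, and keyword arguments are assumptions to be checked against the current README.

```python
# Sketch of a WhisperLive client sending a local audio file for transcription.
# Assumes a WhisperLive server is already running on localhost:9090 and that
# the `whisper_live` package exposes TranscriptionClient with these kwargs;
# verify both against the project's documentation before relying on this.
from whisper_live.client import TranscriptionClient

client = TranscriptionClient(
    "localhost",      # host where the WhisperLive server runs
    9090,             # assumed default port
    lang="en",        # transcription language
    translate=False,  # transcribe rather than translate to English
    model="small",    # assumed Whisper model size
)

# Passing a path transcribes a pre-recorded file; calling with no arguments
# is described as streaming from the microphone instead.
client("audio.wav")   # placeholder path to a local audio file
```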
  • 18
    HexStrike AI MCP Agents

    HexStrike AI MCP Agents is an advanced MCP server

    HexStrike AI is an MCP server that lets LLM agents autonomously operate a large catalog of offensive-security tools. Its goal is to bridge “language models” and practical pentest workflows—enumeration, exploitation, vulnerability discovery, and bug bounty reconnaissance—under safe, auditable controls. The server exposes typed tools and guardrails so agent prompts translate to concrete, parameterized actions rather than brittle shell strings. It ships with curated tool adapters, task...
    Downloads: 7 This Week
    Last Update:
    See Project
  • 19
    MCP Atlassian

    MCP server that integrates Confluence and Jira

    The MCP Atlassian server integrates Atlassian products like Confluence and Jira with the Model Context Protocol. It supports both Cloud and Server/Data Center deployments, enabling AI models to interact with these platforms securely.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 20
    LazyLLM

    Easiest and laziest way for building multi-agent LLM applications

    LazyLLM is an optimized, lightweight LLM server designed for easy and fast deployment of large language models. It is fully compatible with the OpenAI API specification, enabling developers to integrate their own models into applications that normally rely on OpenAI’s endpoints. LazyLLM emphasizes low resource usage and fast inference while supporting multiple models. A hedged usage sketch follows this entry.
    Downloads: 6 This Week
    Last Update:
    See Project
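Because the entry above describes an OpenAI-compatible API, any OpenAI-style client should be able to target it by overriding the base URL. The sketch below assumes a LazyLLM server is already listening locally; the port, model name, and API-key handling are placeholders, not values taken from the project's documentation.

```python
# Sketch: talking to an OpenAI-compatible local server with the `openai` SDK.
# Base URL, port, and model name are hypothetical placeholders for a locally
# running LazyLLM instance.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local endpoint
    api_key="not-needed",                 # many local servers ignore the key
)

response = client.chat.completions.create(
    model="my-local-model",               # whatever model the server exposes
    messages=[{"role": "user", "content": "Summarize MCP in one sentence."}],
)
print(response.choices[0].message.content)
```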
  • 21
    MCP Everything Search

    An MCP server that provides fast file searching capabilities

    Everything Search MCP Server is an MCP server that provides fast file searching capabilities across Windows, macOS, and Linux. On Windows, it utilizes the Everything SDK; on macOS, it leverages the built-in mdfind command; and on Linux, it uses the locate or plocate command.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 22
    LoRAX

    Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs

    LoRAX is a multi-LoRA (Low-Rank Adaptation) inference server that scales to thousands of fine-tuned Large Language Models (LLMs). It enables efficient deployment and management of numerous fine-tuned models, facilitating scalable AI applications. LoRAX is designed to handle high concurrency and provides a robust infrastructure for serving multiple LLMs simultaneously. A hedged request sketch follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
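LoRAX's distinguishing feature is per-request adapter selection, so one base model can serve many fine-tunes. The sketch below shows the general shape of such a request over HTTP; the endpoint path, field names, and adapter ID reflect how TGI-style servers are commonly exposed and are assumptions, so LoRAX's own API reference is authoritative.

```python
# Sketch of per-request LoRA adapter selection against a locally running
# LoRAX server. The /generate route and the "adapter_id" parameter are
# assumptions based on TGI-style APIs, not verified fields.
import requests

payload = {
    "inputs": "Write a haiku about vector databases.",
    "parameters": {
        "max_new_tokens": 64,
        "adapter_id": "my-org/my-finetune",  # hypothetical adapter identifier
    },
}

resp = requests.post("http://localhost:8080/generate", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json().get("generated_text"))
```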
  • 23
    Lemonade

    Lemonade helps users run local LLMs with the highest performance

    Lemonade is a local LLM runtime that aims to deliver the highest possible performance on your own hardware by auto-configuring state-of-the-art inference engines for both NPUs and GPUs. The project positions itself as a “local LLM server” you can run on laptops and workstations, abstracting away backend differences while giving you a single place to serve and manage models. Its README emphasizes real-world adoption across startups, research groups, and large companies, signaling a focus on...
    Downloads: 4 This Week
    Last Update:
    See Project
  • 24
    mcpo

    A simple, secure MCP-to-OpenAPI proxy server

    mcpo is a minimal bridge that exposes any MCP tool as an OpenAPI-compatible HTTP server. Instead of writing glue code, you point mcpo at an MCP server command and it generates REST endpoints and an OpenAPI spec that other systems (or LLM agent frameworks) can call immediately. This design lets you reuse a growing library of MCP servers with platforms that only understand HTTP+OpenAPI, unifying tool access across ecosystems. The project emphasizes “dead-simple” setup and pairs with Open WebUI... A hedged request sketch follows this entry.
    Downloads: 0 This Week
    Last Update:
    See Project
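Once mcpo has wrapped an MCP server, each tool is reachable as a plain HTTP endpoint described by the generated OpenAPI spec. The sketch below assumes mcpo is already running locally; the port, route name, and request body are illustrative guesses, and the generated documentation is the place to confirm the real paths and schemas.

```python
# Sketch: calling a tool through an mcpo-generated REST endpoint.
# The port and the "/get_current_time" route are hypothetical; mcpo derives
# actual routes from the wrapped MCP server's tool names.
import requests

BASE = "http://localhost:8000"

# The OpenAPI spec for the wrapped tools is typically served alongside them
# (mcpo is described as generating one); listing its paths shows the routes.
spec = requests.get(f"{BASE}/openapi.json", timeout=10).json()
print(sorted(spec.get("paths", {}).keys()))

# Invoke one tool endpoint with a JSON body matching its schema.
resp = requests.post(f"{BASE}/get_current_time", json={"timezone": "UTC"}, timeout=10)
resp.raise_for_status()
print(resp.json())
```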
  • 25
    FastKoko

    Dockerized FastAPI wrapper for Kokoro-82M text-to-speech model

    FastKoko is a self-hosted text-to-speech server built around the Kokoro-82M model and exposed through a FastAPI backend. It is designed to be easy to deploy via Docker, with separate CPU and GPU images so that users can choose between pure CPU inference and NVIDIA GPU acceleration. The project exposes an OpenAI-compatible speech endpoint, which means existing code that talks to the OpenAI audio API can often be pointed at a Kokoro-FastAPI instance with minimal changes. It supports multiple... A hedged request sketch follows this entry.
    Downloads: 3 This Week
    Last Update:
    See Project
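Since the entry above advertises an OpenAI-compatible speech endpoint, a plain HTTP request in the OpenAI audio-API shape should be enough to get audio back. The host, port, model, and voice names below are placeholders, not values confirmed from the FastKoko documentation.

```python
# Sketch: requesting speech from an OpenAI-compatible TTS endpoint.
# Host, port, model, and voice are hypothetical placeholders for a local
# FastKoko (Kokoro-FastAPI) instance.
import requests

payload = {
    "model": "kokoro",          # assumed model identifier
    "voice": "af_bella",        # assumed voice name
    "input": "Hello from a self-hosted text-to-speech server.",
    "response_format": "mp3",
}

resp = requests.post(
    "http://localhost:8880/v1/audio/speech",  # assumed local endpoint and port
    json=payload,
    timeout=120,
)
resp.raise_for_status()
with open("speech.mp3", "wb") as f:
    f.write(resp.content)
```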