Showing 7 open source projects for "queue"

  • 1
    memphis

    Next-Generation Event Processing Platform

    Memphis enables building modern queue-based applications that handle large volumes of streamed and enriched data, with modern protocols, zero ops, up to 9x faster development, up to 46x lower costs, and significantly less development time for data-oriented developers and data engineers. Queues and brokers are a mission-critical component of modern application architecture and should be as highly available and stable as possible.
    Downloads: 0 This Week
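    The entry above describes queue-based applications in general terms. As a language-agnostic sketch of that pattern (using Python's in-process queue, not the Memphis broker or its SDK), a minimal producer/consumer pair might look like this:

        # Generic producer/consumer sketch of a queue-based application.
        # This uses Python's in-process queue.Queue purely for illustration;
        # with Memphis, the queue would instead be a station on the broker.
        import queue
        import threading

        events = queue.Queue()

        def producer():
            # Publish a small stream of (enriched) events, then a sentinel.
            for i in range(5):
                events.put({"id": i, "payload": f"event-{i}"})
            events.put(None)

        def consumer():
            # Pull events until the sentinel arrives.
            while True:
                event = events.get()
                if event is None:
                    break
                print("processed", event["id"], event["payload"])

        t_prod = threading.Thread(target=producer)
        t_cons = threading.Thread(target=consumer)
        t_prod.start(); t_cons.start()
        t_prod.join(); t_cons.join()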
  • 2
    Apache RocketMQ

    Distributed messaging and streaming platform with low latency

    ...Built-in message tracing capability, with OpenTracing support as well. Versatile big-data and streaming ecosystem integration. Message retroactivity by time or offset. Reliable FIFO and strictly ordered messaging in the same queue. Efficient pull and push consumption models. Million-level message accumulation capacity in a single queue. Multiple messaging protocols, such as JMS and OpenMessaging. Flexible distributed scale-out deployment architecture. Lightning-fast batch message exchange system.
    Downloads: 1 This Week
  • 3
    Logstash

    Centralize, transform and stash your data

    Logstash is a server-side data processing pipeline that dynamically ingests data from numerous sources, transforms it, and ships it to your favorite “stash” regardless of format or complexity. It supports and ingests data of all shapes, sizes and sources, dynamically transforms and prepares this data, and transports it to the output of your choice. Logstash is extensible, with over 200 plugins available to let you create and configure your pipeline how you choose.
    Downloads: 1 This Week
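    As a concrete illustration of the pipeline described above, a minimal Logstash configuration wires one input, one filter, and two outputs. The log path, index name, and Elasticsearch address below are placeholder assumptions, not values from the project listing:

        input {
          file {
            path => "/var/log/app/*.log"        # hypothetical log location
            start_position => "beginning"
          }
        }

        filter {
          grok {
            # Parse standard Apache/Nginx combined access-log lines.
            match => { "message" => "%{COMBINEDAPACHELOG}" }
          }
        }

        output {
          elasticsearch {
            hosts => ["http://localhost:9200"]  # assumed local Elasticsearch
            index => "app-logs-%{+YYYY.MM.dd}"
          }
          stdout { codec => rubydebug }         # also print parsed events for debugging
        }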
  • 4

    PDP-OmniSim

    Simulating parallel and distributed processing systems

    PDP-OmniSim is an advanced computational framework for simulating parallel and distributed processing systems, with cutting-edge applications in computational neuroscience, distributed computing, and complex systems modeling. The framework provides researchers with robust tools for large-scale simulations of networked systems and their emergent behaviors. Its interdisciplinary research domains include computational...
    Downloads: 0 This Week
  • 5
    BDS

    Blockchain data parsing and persisting results

    ...Splitter is the key module of Blockchain Data Service (BDS) and provides its data analysis capability. Splitter is responsible for consuming blockchain data from a message queue (Kafka) and inserting it into persistent data storage services (relational databases, data warehouses, etc.) for further processing. Before compiling and running BDS, you must install the Go toolchain locally.
    Downloads: 0 This Week
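    The Splitter workflow described above (consume from Kafka, persist to a relational store) is sketched below in Python for illustration only; BDS itself is written in Go, and the topic name, message fields, and table schema here are assumptions:

        # Conceptual consume-and-persist sketch, not BDS code (BDS is in Go).
        # Topic name, message shape, and table schema are assumptions.
        import json
        import sqlite3

        from kafka import KafkaConsumer  # pip install kafka-python

        consumer = KafkaConsumer(
            "blocks",                                   # assumed topic name
            bootstrap_servers="localhost:9092",
            value_deserializer=lambda b: json.loads(b.decode("utf-8")),
            auto_offset_reset="earliest",
        )

        # SQLite stands in for the relational database / data warehouse
        # mentioned in the description.
        db = sqlite3.connect("blocks.db")
        db.execute(
            "CREATE TABLE IF NOT EXISTS blocks ("
            "height INTEGER PRIMARY KEY, hash TEXT, tx_count INTEGER)"
        )

        for record in consumer:
            block = record.value
            db.execute(
                "INSERT OR REPLACE INTO blocks (height, hash, tx_count) VALUES (?, ?, ?)",
                (block["height"], block["hash"], len(block.get("txs", []))),
            )
            db.commit()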
  • 6

    PerformNet

    Queueing network modeling tool

    This is the project completed by the Expresso team during the Winter 2014 semester. The team consisted of Philippe Olivier, François Lemonnier-Lalonde, Dominique Tremblay, and François Moreau. The project was carried out for the Object-Oriented Software Engineering course taught by Jonathan Gaudreault at Université Laval. It consisted of developing software for modeling queueing networks, also known as Jackson networks. Each node of the network represents a station...
    Downloads: 0 This Week
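    Since the description above is cut off, the following is only a generic illustration of the kind of model PerformNet deals with: an open Jackson network whose stations are treated as M/M/1 queues. It is not PerformNet code, and the arrival rates, service rates, and routing matrix are made-up example values:

        # Generic open Jackson network example (not PerformNet code).
        # Solve the traffic equations lam = gamma + P^T lam, then treat each
        # station as an M/M/1 queue. All numbers are illustrative only.
        import numpy as np

        gamma = np.array([2.0, 1.0, 0.0])   # external arrival rate per station
        mu = np.array([5.0, 4.0, 6.0])      # service rate per station
        P = np.array([                      # routing probabilities P[i, j]: i -> j
            [0.0, 0.5, 0.3],
            [0.0, 0.0, 0.8],
            [0.1, 0.0, 0.0],
        ])

        # Effective arrival rates from the traffic equations.
        lam = np.linalg.solve(np.eye(3) - P.T, gamma)

        rho = lam / mu                      # utilization (must stay below 1)
        L = rho / (1.0 - rho)               # mean number of jobs at each station
        W = L / lam                         # mean sojourn time (Little's law)

        for i in range(3):
            print(f"station {i}: lambda={lam[i]:.3f}  rho={rho[i]:.3f}  "
                  f"L={L[i]:.3f}  W={W[i]:.3f}")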
  • 7
    q pipeline manager

    q: integrated platform for pipeline configuration and management

    ...It expands the value of your existing job scheduler - either Grid Engine or TORQUE PBS - through numerous functions that help you organize, submit, monitor, manage, and share your informatics work. Data processing pipelines require high-level organization and parallelization of work to optimize resource utilization and decrease the time to results. q (from queue) allows complex job sequences to be efficiently assembled and managed, including dependency tracking, parallelization, pipeline-level monitoring, error recovery, and data protection. Pipelines are constructed from modular script files, with job definitions and results stored in easily retrieved job files. A web interface facilitates job submission and monitoring, and the complete pipeline can be exported for full transparency.
    Downloads: 0 This Week
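    To make the dependency-tracking idea concrete, here is a small sketch of chaining two jobs on the schedulers q targets, using Grid Engine's -hold_jid option (on TORQUE PBS the equivalent is qsub -W depend=afterok:<jobid>). This is not q's own interface; the script names and the job-ID parsing are assumptions for illustration:

        # Dependency-tracked submission sketch for Grid Engine (not q's code).
        # Script names are placeholders; TORQUE PBS would use
        # "qsub -W depend=afterok:<jobid> summarize.sh" instead of -hold_jid.
        import re
        import subprocess

        def qsub(script, hold_jid=None):
            """Submit a script with qsub, optionally waiting on a prior job."""
            cmd = ["qsub"]
            if hold_jid is not None:
                cmd += ["-hold_jid", hold_jid]   # run only after that job finishes
            cmd.append(script)
            out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
            # Grid Engine replies with something like
            # 'Your job 12345 ("align.sh") has been submitted'.
            return re.search(r"\d+", out).group(0)

        align_id = qsub("align.sh")              # step 1: alignment
        qsub("summarize.sh", hold_jid=align_id)  # step 2: waits for step 1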