Best Data Management Software for Apache Kafka - Page 6

Compare the Top Data Management Software that integrates with Apache Kafka as of September 2025 - Page 6

This is a list of Data Management software that integrates with Apache Kafka.

  • 1
    Yandex Managed Service for Apache Kafka
    Focus on developing data stream processing applications instead of maintaining the infrastructure. Managed Service for Apache Kafka is responsible for managing ZooKeeper hosts and broker clusters, configuring them, and updating their versions. Distribute your cluster's brokers across different availability zones and set the replication factor to ensure the desired level of fault tolerance. The service analyzes the metrics and status of the cluster and automatically replaces a node if it fails. For each topic, you can set the replication factor, log cleanup policy, compression type, and maximum message size to make better use of computing, network, and disk resources. You can add brokers to your cluster with just a click of a button to improve its performance, or change the class of high-availability hosts without stopping them or losing any data.
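As an illustration of the fault-tolerance trade-off the replication settings above control (a sketch of standard Kafka semantics, not part of the Yandex service API): a topic with replication factor N whose producers use acks=all and require min.insync.replicas = M keeps accepting writes while at most N - M brokers holding its replicas are down.

```python
# Sketch of how Kafka-style replication settings translate into fault
# tolerance. The helper is illustrative; Kafka itself enforces these
# semantics via the replication.factor and min.insync.replicas settings.

def tolerable_broker_failures(replication_factor: int, min_insync_replicas: int) -> int:
    """Brokers that can fail while the topic still accepts acks=all writes."""
    if min_insync_replicas > replication_factor:
        raise ValueError("min.insync.replicas cannot exceed the replication factor")
    return replication_factor - min_insync_replicas

# A common production choice: 3 replicas, 2 required in sync.
print(tolerable_broker_failures(3, 2))  # 1 broker may fail without blocking writes
```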
  • 2
    Arroyo
    Scale from zero to millions of events per second. Arroyo ships as a single, compact binary. Run it locally on macOS or Linux for development, and deploy to production with Docker or Kubernetes. Arroyo is a new kind of stream processing engine, built from the ground up to make real-time easier than batch. It was designed from the start so that anyone with SQL experience can build reliable, efficient, and correct streaming pipelines. Data scientists and engineers can build end-to-end real-time applications, models, and dashboards without a separate team of streaming experts. Transform, filter, aggregate, and join data streams by writing SQL, with sub-second results. Your streaming pipelines shouldn't page someone just because Kubernetes decided to reschedule your pods. Arroyo is built to run in modern, elastic cloud environments, from simple container runtimes like Fargate to large, distributed deployments on Kubernetes.
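A rough Python sketch of what a tumbling-window aggregation, the kind of query streaming SQL engines like Arroyo evaluate continuously, computes over a stream; the timestamps and window size here are toy values invented for the example:

```python
from collections import defaultdict

def tumbling_count(event_timestamps, window_secs):
    """Count events per tumbling window, keyed by window start time."""
    counts = defaultdict(int)
    for ts in event_timestamps:
        window_start = (ts // window_secs) * window_secs
        counts[window_start] += 1
    return dict(counts)

# Toy event times in seconds; a streaming engine would emit each
# window's count incrementally as events arrive, with sub-second latency.
events = [1, 4, 9, 12, 13, 25]
print(tumbling_count(events, 10))  # {0: 3, 10: 2, 20: 1}
```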
  • 3
    InformationGrid
    Most businesses are aware that data can help them understand their customers better, improve processes, and solve problems. We have developed InformationGrid, a safe solution for sharing and aggregating data, to deliver those insights; for example, it allows hospitals and general practitioners to share data in order to prescribe better medications. Whether you have a clear idea of your data strategy or need some guidance defining it, we're here to help you put your data strategy into action. Our SaaS platform enables you to share and aggregate data in a way that's secure and cost-efficient. To keep time to value as short as possible, we can build an application that handles large amounts of data, and we can build it fast by using cloud-native techniques. This enables you to reap the short-term benefits of your data and get ready to start using AI to advance your business.
  • 4
    Stratio
    A unified, secure business data layer providing instant answers for business and data teams. The Stratio generative AI data fabric covers the whole lifecycle of data management, from data discovery and governance to use and disposal. Your organization has data all over the place; different divisions use different apps to do different things. Stratio uses AI to find all your data, whether it's on-prem or in the cloud, so you can be sure you're treating data appropriately. If you can't see your data as soon as it's generated, you'll never move as fast as your customers. With most data infrastructure, it can take hours to process customer data. Stratio accesses 100% of your data in real time without moving it, so you can act quickly without losing the all-important context. Only by unifying the operational and the informational in a collaborative platform will companies be able to move to instant extended AI.
  • 5
    Axual
    Axual is Kafka-as-a-Service for DevOps teams. Empower your team to unlock insights and drive decisions with our intuitive Kafka platform. Axual offers the ultimate solution for enterprises looking to seamlessly integrate data streaming into their core IT infrastructure. Our all-in-one Kafka platform is designed to eliminate the need for extensive technical knowledge or skills, providing a ready-made solution that delivers all the benefits of event streaming without the hassle. The Axual Platform is an all-in-one solution designed to help you simplify and enhance the deployment, management, and utilization of real-time data streaming with Apache Kafka. By providing an array of features that cater to the diverse needs of modern enterprises, the Axual Platform enables organizations to harness the full potential of data streaming while minimizing complexity and operational overhead.
  • 6
    Gable
    Data contracts facilitate communication between data teams and developers. Don't just detect problematic changes, prevent them at the application level. Detect every change, from every data source, using AI-based asset registration. Drive the adoption of data initiatives with upstream visibility and impact analysis. Shift left both data ownership and management through data governance as code and data contracts. Build data trust through the timely communication of data quality expectations and changes. Eliminate data issues at the source by seamlessly integrating our AI-driven technology. Everything you need to make your data initiative a success. Gable is a B2B data infrastructure SaaS that provides a collaboration platform to author and enforce data contracts. 'Data contracts' here refers to API-based agreements between the software engineers who own upstream data sources and the data engineers and analysts who consume that data to build machine learning models and analytics.
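A minimal sketch of what enforcing a data contract can look like in practice, validating produced records against an agreed schema before consumers see them; the contract format and checker below are invented for illustration and are not Gable's API:

```python
# Hypothetical contract: field names mapped to the types the upstream
# producer has agreed to emit.
CONTRACT = {"user_id": int, "email": str}

def violates_contract(record, contract=CONTRACT):
    """Return the field names whose values are missing or mistyped."""
    return [field for field, ftype in contract.items()
            if not isinstance(record.get(field), ftype)]

print(violates_contract({"user_id": 42, "email": "a@b.co"}))  # []
print(violates_contract({"user_id": "42"}))  # ['user_id', 'email']
```

In a real deployment such a check would run in CI or at the application boundary, blocking the breaking change rather than merely detecting it downstream.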
  • 7
    Salesforce Data Cloud
    Salesforce Data Cloud is a real-time data platform designed to unify and manage customer data from multiple sources across an organization, enabling a single, comprehensive view of each customer. It allows businesses to collect, harmonize, and analyze data in real time, creating a 360-degree customer profile that can be leveraged across Salesforce’s various applications, such as Marketing Cloud, Sales Cloud, and Service Cloud. This platform enables faster, more personalized customer interactions by integrating data from online and offline channels, including CRM data, transactional data, and third-party data sources. Salesforce Data Cloud also offers advanced AI agents and analytics capabilities, helping organizations gain deeper insights into customer behavior and predict future needs. By centralizing and refining data for actionable use, Salesforce Data Cloud supports enhanced customer experiences, targeted marketing, and efficient, data-driven decision-making across departments.
  • 8
    Unity Catalog (Databricks)
    Databricks Unity Catalog is the industry’s only unified and open governance solution for data and AI, built into the Databricks Data Intelligence Platform. With Unity Catalog, organizations can seamlessly govern and classify both structured and unstructured data in any format, as well as machine learning models, notebooks, dashboards, and files, across any cloud or platform. Data scientists, analysts, and engineers can securely discover, access, and collaborate on trusted data and AI assets across platforms, leveraging AI to boost productivity and unlock the full potential of the lakehouse environment. This unified and open approach to governance promotes interoperability and accelerates data and AI initiatives while simplifying regulatory compliance.
  • 9
    Observo AI
    Observo AI is an AI-native data pipeline platform designed to address the challenges of managing vast amounts of telemetry data in security and DevOps operations. By leveraging machine learning and agentic AI, Observo AI automates data optimization, enabling enterprises to process AI-generated data more efficiently, securely, and cost-effectively. It reduces data processing costs by over 50% and accelerates incident response times by more than 40%. Observo AI's features include intelligent data deduplication and compression, real-time anomaly detection, and dynamic data routing to appropriate storage or analysis tools. It also enriches data streams with contextual information to enhance threat detection accuracy while minimizing false positives. Observo AI offers a searchable cloud data lake for efficient data storage and retrieval.
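A toy sketch of content-based deduplication, one of the pipeline optimizations described above; the hashing scheme and log lines are illustrative and not Observo AI's actual algorithm:

```python
import hashlib

def deduplicate(lines):
    """Drop exact-duplicate lines, keeping first occurrences in order."""
    seen, unique = set(), []
    for line in lines:
        # Hash the content so only a small fingerprint is kept per line.
        digest = hashlib.sha256(line.encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(line)
    return unique

logs = ["GET /health 200", "GET /health 200", "POST /login 500"]
print(deduplicate(logs))  # ['GET /health 200', 'POST /login 500']
```

Real pipelines typically add fuzzier techniques (field normalization, sampling, compression) on top of exact-match dedup like this.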
  • 10
    Onum
    Onum is a real-time data intelligence platform that empowers security and IT teams to derive actionable insights from data in-stream, facilitating rapid decision-making and operational efficiency. By processing data at the source, Onum enables decisions in milliseconds, not minutes, simplifying complex workflows and reducing costs. It offers data reduction capabilities, intelligently filtering and reducing data at the source to ensure only valuable information reaches analytics platforms, thereby minimizing storage requirements and associated costs. It also provides data enrichment features, transforming raw data into actionable intelligence by adding context and correlations in real time. Onum simplifies data pipeline management through efficient data routing, ensuring the right data is delivered to the appropriate destinations instantly, supporting various sources and destinations.
  • 11
    Tenzir
    Tenzir is a data pipeline engine specifically designed for security teams, facilitating the collection, transformation, enrichment, and routing of security data throughout its lifecycle. It enables users to seamlessly gather data from various sources, parse unstructured data into structured formats, and transform it as needed. It optimizes data volume, reduces costs, and supports mapping to standardized schemas like OCSF, ASIM, and ECS. Tenzir ensures compliance through data anonymization features and enriches data by adding context from threats, assets, and vulnerabilities. It supports real-time detection and stores data efficiently in Parquet format within object storage systems. Users can rapidly search and materialize necessary data and reactivate at-rest data back into motion. Tenzir is built for flexibility, allowing deployment as code and integration into existing workflows, ultimately aiming to reduce SIEM costs and provide full control.
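A toy sketch of the enrichment step described above, attaching threat context to security events as they flow through a pipeline; the field names and intel table are invented for the example, not Tenzir's actual schema or API:

```python
# Hypothetical threat-intel lookup table; a real pipeline would pull
# this context from live threat, asset, and vulnerability feeds.
THREAT_INTEL = {"203.0.113.7": "known-scanner"}

def enrich(event):
    """Attach threat context to an event when its source IP is known."""
    tag = THREAT_INTEL.get(event.get("src_ip"))
    return {**event, "threat": tag} if tag else event

events = [{"src_ip": "203.0.113.7"}, {"src_ip": "198.51.100.2"}]
print([enrich(e) for e in events])
```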
  • 12
    Borneo
    Borneo is a real-time data security and privacy observability platform designed to help organizations discover, remediate, and govern data risks while ensuring privacy and compliance. It enables users to discover where health data, financial data, and PII are stored across unstructured data, SaaS apps, and public cloud environments. Borneo's risk correlation engine identifies data that violates security frameworks and privacy regulations, prompting immediate action. It offers automatic remediation through data masking, access changes, and encryption, and continuously monitors changes across the data landscape to maintain compliance and eliminate regulatory risk. Built by security practitioners from Uber, Facebook, and Yahoo, Borneo is crafted to handle data at scale. It features a powerful connector framework to integrate across diverse data landscapes, supports flexible and modular deployment, and ensures that data never leaves the user's cloud environment.
  • 13
    TIBCO Streaming
    TIBCO Streaming is a real-time analytics platform designed to process and analyze high-velocity data streams, enabling organizations to make immediate, data-driven decisions. It offers a low-code development environment through StreamBase Studio, allowing users to build complex event processing applications with minimal coding. It supports over 150 connectors, including APIs, Apache Kafka, MQTT, RabbitMQ, and databases like MySQL and JDBC, facilitating seamless integration with various data sources. TIBCO Streaming incorporates dynamic learning operators, enabling adaptive machine learning models that provide contextual insights and automate decision-making processes. It also features real-time business intelligence capabilities, allowing users to visualize live data alongside historical information for comprehensive analysis. It is cloud-ready, supporting deployments on AWS, Azure, GCP, and on-premises environments.
  • 14
    Conduktor
    We created Conduktor, the all-in-one friendly interface for working with the Apache Kafka ecosystem. Develop and manage Apache Kafka with confidence, and save time for your entire team, with Conduktor DevTools, the all-in-one Apache Kafka desktop client. Apache Kafka is hard to learn and to use. Made by Kafka lovers, Conduktor's best-in-class user experience is loved by developers. Conduktor offers more than just an interface over Apache Kafka: thanks to our integrations with most technologies around Apache Kafka, it gives you and your teams control of your whole data pipeline, making it the most complete tool on top of Apache Kafka.
  • 15
    Conduit
    Sync data between your production systems using an extensible, event-first experience with minimal dependencies that fits within your existing workflow. Eliminate the multi-step process you go through today; just download the binary and start building. Conduit pipelines listen for changes to a database, data warehouse, etc., and allow your data applications to act upon those changes in real time. Conduit connectors give you the ability to pull and push data to any production datastore you need. If a datastore is missing, the simple SDK allows you to extend Conduit where you need it. Run it in a way that works for you; use it as a standalone service or orchestrate it within your infrastructure.
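The source-to-destination pipeline shape described above can be sketched as follows; the connector classes and records are hypothetical stand-ins for illustration, not Conduit's actual SDK (which is written in Go):

```python
# A source connector emits change records; a destination connector acts
# on each one. Real connectors would talk to databases, queues, etc.

class InMemorySource:
    def __init__(self, changes):
        self.changes = changes

    def read(self):
        yield from self.changes  # stream records one at a time

class InMemoryDestination:
    def __init__(self):
        self.received = []

    def write(self, record):
        self.received.append(record)

def run_pipeline(source, destination):
    """Move every record the source emits into the destination."""
    for record in source.read():
        destination.write(record)

source = InMemorySource([{"op": "insert", "id": 1}, {"op": "update", "id": 1}])
dest = InMemoryDestination()
run_pipeline(source, dest)
print(len(dest.received))  # 2
```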
  • 16
    OctoData (SoyHuCe)
    OctoData is deployed at a lower cost in cloud hosting and includes personalized support, from the definition of your needs to the use of the solution. OctoData is based on innovative open-source technologies and adapts to open up future possibilities. Its Supervisor offers a management interface that allows you to quickly capture, store, and exploit a growing quantity and variety of data. With OctoData, prototype and industrialize your massive data collection solutions in the same environment, including in real time. By exploiting your data, you can obtain precise reports, explore new possibilities, increase your productivity, and gain profitability.
  • 17
    SSIS Integration Toolkit
    Jump right to our product page to see our full range of data integration software, including solutions for SharePoint and Active Directory. With over 300 individual data integration tools for connectivity and productivity, our data integration solutions allow developers to take advantage of the flexibility and power of the SSIS ETL engine to integrate virtually any application or data source. You don't have to write a single line of code to make data integration happen so your development can be done in a matter of minutes. We make the most flexible integration solution on the market. Our software offers intuitive user interfaces that are flexible and easy to use. With a streamlined development experience and an extremely simple licensing model, our solution offers the best value for your investment. Our software offers many specifically designed features that help you achieve the best possible performance without having to hijack your budget.
  • 18
    BigBI
    BigBI enables data specialists to build their own powerful big data pipelines interactively and efficiently, without any coding! BigBI unleashes the power of Apache Spark, enabling:
      - scalable processing of real big data (up to 100x faster),
      - integration of traditional data (SQL, batch files) with modern data sources, including semi-structured (JSON, NoSQL DBs, Elastic, Hadoop) and unstructured (text, audio, video) data,
      - integration of streaming data, cloud data, AI/ML, and graphs.
  • 19
    Precisely Connect
    Integrate data seamlessly from legacy systems into next-gen cloud and data platforms with one solution. Connect helps you take control of your data from mainframe to cloud. Integrate data through batch and real-time ingestion for advanced analytics, comprehensive machine learning, and seamless data migration. Connect leverages the expertise Precisely has built over decades as a leader in mainframe sort and IBM i data availability and security to lead the industry in accessing and integrating complex data. Support for a wide range of sources and targets for all your ELT and CDC needs ensures access to all your enterprise data for your most critical business projects.
  • 20
    Solitics
    Leverage your data in real time for marketing automation, retention, personalization, business intelligence, and support. Seamlessly integrate and utilize all your data, in real time, from one place, for any business need, enabling companies to build amazing products. Solitics centralizes your data and provides you with the best utilization tools out there. Whether it's automation, personalization, product enhancement, analytics, or execution, Solitics has got you covered. Marketing automation means real-time customer journeys based on all your data, personalized engagements across any marketing channel, automated analysis of your retention and conversion performance, and machine-based decisions. The technology spans industries: gaming, trading, eCommerce, and B2C brands. For marketing teams, it offers best-in-class attribution, analyzing campaign impact on revenues across all marketing channels using data streamed from backend sources.
  • 21
    Xeotek
    Xeotek helps companies develop and explore their data applications and streams faster with Xeotek's powerful desktop and web application. Xeotek KaDeck was designed to be used by developers, operations, and business users alike. Because business users, developers, and operations jointly gain insight into data and processes via KaDeck, the whole team benefits: fewer misunderstandings, less rework, more transparency. Xeotek KaDeck puts you in control of your data streams. Save hours of work by gaining insights at the data and application level in projects or day-to-day operations. Export, filter, transform and manage data streams in KaDeck with ease. Run JavaScript (NodeV4) code, transform & generate test data, view & change consumer offsets, manage your streams or topics, Kafka Connect instances, schema registry, and ACLs – all from one convenient user interface.