Compare the Top Big Data Platforms in Germany as of December 2025 - Page 6

  • 1
    SAP HANA
    The SAP HANA in-memory database handles transactional and analytical workloads with any data type on a single copy of the data. It breaks down the transactional and analytical silos in organizations, enabling quick decision-making on premises and in the cloud. Develop intelligent, live solutions on a single database management system, and use advanced analytics to support next-generation transactional processing. Build data solutions with cloud-native scalability, speed, and performance. With the SAP HANA Cloud database, you can gain trusted, business-ready information from a single solution while enabling security, privacy, and anonymization with proven enterprise reliability. An intelligent enterprise runs on insight from data, and more than ever, this insight must be delivered in real time.
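    The description centers on running analytics and transactions against one data copy. As a rough, hypothetical sketch of what that looks like from application code, the snippet below uses SAP's hdbcli Python driver; the host, port, credentials, and SALES_ORDERS table are placeholders, not details from this listing.

```python
# Hypothetical sketch: querying SAP HANA from Python with the hdbcli driver.
# Host, credentials, and the SALES_ORDERS table are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="myhana.example.com",  # placeholder HANA host
    port=443,                      # HANA Cloud endpoints typically use TLS on 443
    user="ANALYST",
    password="********",
)
cur = conn.cursor()

# An analytical aggregate runs directly against the transactional tables,
# i.e. on the same single copy of the data described above.
cur.execute("SELECT region, SUM(net_amount) FROM SALES_ORDERS GROUP BY region")
for region, total in cur.fetchall():
    print(region, total)

cur.close()
conn.close()
```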
  • 2
    FlowWright
    Business Process Management Software (BPMS) and BPM workflow automation tool. Companies need workflow, forms, compliance, and automation routing support. FlowWright's low-code options make creating and editing workflows simple, and its best-in-class forms capabilities make it possible to rapidly build forms, forms logic, and workflows for forms-driven processes. Companies have many existing systems in place that need to work with each other; FlowWright's business process integrations across systems are loosely coupled and intelligently integrated. When you use FlowWright to automate your business, you gain access to standard metrics as well as metrics you define yourself, and BPM analytics are a key part of any BPM workflow management solution. FlowWright can be deployed as a cloud solution or in an on-premises or self-hosted .NET environment (including AWS and Azure). It is built in C# on .NET, and all tools are fully browser-based, requiring no plug-ins.
  • 3
    Striim
    Data integration for your hybrid cloud: modern, reliable data integration across your private and public clouds, all in real time with change data capture (CDC) and data streams. Built by the executive and technical team from GoldenGate Software, Striim brings decades of experience with mission-critical enterprise workloads. Striim scales out as a distributed platform in your environment or in the cloud, and scalability is fully configurable by your team. Striim supports secure deployments with HIPAA and GDPR compliance. Built from the ground up for modern enterprise workloads in the cloud or on premises. Drag and drop to create data flows between your sources and targets, then process, enrich, and analyze your streaming data with real-time SQL queries.
  • 4
    Upsolver
    Upsolver makes it incredibly simple to build a governed data lake and to manage, integrate, and prepare streaming data for analysis. Define pipelines using only SQL on auto-generated schema-on-read, with an easy visual IDE to accelerate pipeline building. Add upserts and deletes to data lake tables and blend streaming with large-scale batch data. Automated schema evolution and reprocessing from a previous state, automatic orchestration of pipelines (no DAGs), fully managed execution at scale, strong consistency guarantees over object storage, and near-zero maintenance overhead for analytics-ready data. Built-in hygiene for data lake tables includes columnar formats, partitioning, compaction, and vacuuming. Handle 100,000 events per second (billions daily) at low cost, with continuous lock-free compaction to avoid the “small files” problem and Parquet-based tables for fast queries.
  • 5
    Adtelligence (ADTELLIGENCE GmbH)
    Use existing customer data to generate insights. Learn about your customers to forecast their behavior. Train your website to proactively identify visitor needs and respond with the best possible content. Automate your processes and enable the engine to learn from its experience, so you can activate your customers with the right offer at the right time, automated and in real time. Relieve your team of time-consuming, repetitive tasks and extend its capacity for achieving your goals. We deliver smart, future-oriented software solutions to ensure maximum effectiveness for all your sales processes.
  • 6
    Qubole
    Qubole is a simple, open, and secure data lake platform for machine learning, streaming, and ad-hoc analytics. The platform provides end-to-end services that reduce the time and effort required to run data pipelines, streaming analytics, and machine learning workloads on any cloud. No other platform offers the openness and data workload flexibility of Qubole while lowering cloud data lake costs by over 50 percent. Qubole delivers faster access to petabytes of secure, reliable, and trusted datasets of structured and unstructured data for analytics and machine learning. Users run ETL, analytics, and AI/ML workloads efficiently, end to end, across best-of-breed open-source engines and multiple formats, libraries, and languages adapted to data volume, variety, SLAs, and organizational policies.
  • 7
    AVEVA PI System
    The PI System unlocks operational insights and new possibilities. The PI System enables digital transformation through trusted, high-quality operations data. Collect, enhance, and deliver data in real time in any location. Empower engineers and operators. Accelerate the work of analysts and data scientists. Support new business opportunities. Collect real-time data from hundreds of assets, including legacy, proprietary, remote, mobile, and IIoT devices. The PI System connects you to your data, no matter the location or format. Store decades' worth of data with sub-second resolution. Get immediate access to high-fidelity historical, real-time, and predictive data to keep critical operations running and business insights coming. Make data more meaningful by adding intuitive labels and metadata. Define data hierarchies that reflect your operating and reporting environments. With context, you don't just see a data point, you see the big picture.
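    For a sense of how collected real-time and historical data can be pulled out programmatically, here is a hedged sketch against the PI Web API REST interface; the server URL, credentials, tag path, and the exact endpoint layout are assumptions made for illustration, not details from this listing.

```python
# Hypothetical sketch: reading recorded values for one PI point via the PI Web API.
# Base URL, credentials, and the tag path are placeholders; the endpoint layout is assumed.
import requests

BASE = "https://pi-server.example.com/piwebapi"
session = requests.Session()
session.auth = ("pi_user", "********")

# Resolve a PI point by its path to obtain its WebId ...
point = session.get(f"{BASE}/points", params={"path": r"\\DATASERVER\Sinusoid"}).json()
web_id = point["WebId"]

# ... then request its recorded values for the last day (PI relative-time syntax).
recorded = session.get(
    f"{BASE}/streams/{web_id}/recorded",
    params={"startTime": "*-1d", "endTime": "*"},
).json()

for item in recorded.get("Items", []):
    print(item["Timestamp"], item["Value"])
```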
  • 8
    Exasol
    With an in-memory, columnar database and MPP architecture, you can query billions of rows in seconds. Queries are distributed across all nodes in a cluster, providing linear scalability for more users and advanced analytics. MPP, in-memory, and columnar storage add up to the fastest database built for data analytics. With SaaS, cloud, on-premises, and hybrid deployment options, you can analyze data wherever it lives. Automatic query tuning reduces maintenance and overhead, and seamless integrations and performance efficiency get you more power at a fraction of normal infrastructure costs. In one customer example, smart in-memory query processing let a social networking company boost performance while processing 10B data sets a year; in another, a single data repository and speed engine accelerated critical analytics, improving patient outcomes and the bottom line.
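    As an illustration of issuing an analytical query against such an MPP cluster from Python, the sketch below uses the pyexasol client; the DSN, credentials, schema, and the sales table are placeholders rather than anything from the listing.

```python
# Hypothetical sketch: running a distributed aggregate on Exasol via pyexasol.
# DSN, credentials, schema, and the sales table are placeholders.
import pyexasol

conn = pyexasol.connect(
    dsn="exasol-cluster.example.com:8563",
    user="sys",
    password="********",
    schema="RETAIL",
)

# The query is parallelized across all cluster nodes; the result streams
# straight into a pandas DataFrame.
df = conn.export_to_pandas(
    "SELECT product_id, SUM(amount) AS revenue FROM sales GROUP BY product_id"
)
print(df.head())

conn.close()
```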
  • 9
    Elutions
    Optimizing performance across the value chain and delivering competitive advantage through artificial intelligence. Elutions is a global, US-based artificial intelligence company that empowers corporations to dynamically optimize activity across the value chain, transform performance and profitability, and gain competitive advantage through its enterprise-level AI solution, Maestro. By autonomously identifying and implementing directives that enhance end-to-end efficiency, reliability, uptime, quality, and yield, Maestro delivered more than $1B in client value in 2019 across verticals such as oil & gas, manufacturing, and utilities.
  • 10
    AtScale
    AtScale helps accelerate and simplify business intelligence, resulting in faster time to insight, better business decisions, and more ROI on your cloud analytics investment. Eliminate repetitive data engineering tasks like curating, maintaining, and delivering data for analysis. Define business definitions in one location to ensure consistent KPI reporting across BI tools. Accelerate time to insight while efficiently managing cloud compute costs, and leverage existing data security policies for analytics no matter where the data resides. AtScale's Insights workbooks and models let you perform cloud OLAP multidimensional analysis on data sets from multiple providers, with no data prep or data engineering required. Built-in, easy-to-use dimensions and measures help you quickly derive insights you can use for business decisions.
  • 11
    Oracle Fusion Cloud EPM
    Gain the agility and insights you need to outperform in any market condition. Oracle Fusion Cloud Enterprise Performance Management (EPM) helps you model and plan across finance, HR, supply chain, and sales, streamline the financial close process, and drive better decisions. Comprehensively address your needs with functional breadth and depth across financial and operational planning, consolidation and close, master data management, and more. Seamlessly connect finance with all other lines of business for enterprise-wide agility and alignment. Drive better decisions with scenario modeling and built-in, advanced analytics. Oracle EPM consistently tops analyst rankings; thousands of customers gain more value from running their EPM processes with Oracle in the cloud. Drive agile, connected plans—from scenario modeling and long-range planning to budgeting and line of business planning—that are built on best practices and advanced technologies.
  • 12
    Micropole
    Micropole is a consulting, engineering, and training company with bases in Europe and Asia, specializing in the creation of added value. Micropole partners with its customers in the Performance Management, Digital Transformation, and Data Governance fields, developing and integrating decision-support solutions. At Micropole Group, we are convinced that optimizing companies' data assets is the key to their performance. Every day our Innovative People detect trends and explore new territories; their mission is to make companies data intelligent and help them transform themselves to prepare for the future. A privileged partner of major international software vendors, our ambition is to boost the distinctiveness of your corporation through efficient business solutions and innovative, cutting-edge technologies.
  • 13
    Oracle MDM
    To fully understand master data management (MDM), we must first define and explain master data and differentiate between it and master data management. Master data is the critical business information that supports and classifies the transactional and analytical data of an enterprise. It may also be referred to as “enterprise data” or “metadata,” and often includes application-specific metadata, alternative business perspectives, corporate dimensions, reference data, and master data assets. Examples of enterprise data include chart of accounts, organization or cost-center structures, market segments, product categories, and more. Master data management (MDM) does exactly what the name implies—manages master data. MDM is the combination of applications and technologies that consolidates, cleanses, and augments this master data and synchronizes it with applications, business processes, and analytical tools.
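    To make the distinction above concrete, here is a small, purely illustrative Python example of a master record classifying transactional records; the field names and values are invented for the example and are not an Oracle schema.

```python
# Purely illustrative: master data classifies and enriches transactional data.
# Field names and values are invented for the example, not an Oracle schema.
product_master = {
    "product_id": "P-1001",
    "name": "Espresso Machine X200",
    "category": "Kitchen Appliances",   # corporate dimension / reference data
    "gl_account": "4000-Sales",         # chart-of-accounts classification
}

transactions = [
    {"order_id": "O-1", "product_id": "P-1001", "qty": 2, "amount": 598.00},
    {"order_id": "O-2", "product_id": "P-1001", "qty": 1, "amount": 299.00},
]

# Analytical tools roll transactions up along master-data dimensions;
# MDM keeps that master record consistent across every consuming system.
revenue_by_category = {}
for t in transactions:
    category = product_master["category"]
    revenue_by_category[category] = revenue_by_category.get(category, 0) + t["amount"]

print(revenue_by_category)  # {'Kitchen Appliances': 897.0}
```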
  • 14
    Datalytics
    Our team has the creativity, wit, and implementation capacity to help organizations analyze and understand data and turn it into intelligent decisions. Datalytics has over 10 years of experience in data integration, visualization, and mining, big data, predictive analytics, and data science. Our main goal is to become organizations' strategic ally, helping them recognize the real possibilities for analyzing their data. We offer our wit and analytical creativity so organizations can make intelligent decisions and create new business opportunities. Today, information is the most valuable asset, and these are the services we offer to understand, analyze, and transform data: data architecture, training, and technical assessment are some of the practices we use to deliver our big data services.
  • 15
    NextGen Population Health (NextGen Healthcare)
    Meet the challenges of value-based care, no matter your current EHR. Get a clear view into your patient population with aggregated multi-source data and an easy-to-navigate visual display. Use data-driven insights to better manage chronic conditions and care transitions, prevent illness, lower costs, and implement care management. Facilitate care coordination with tools that encourage a proactive approach, including a pre-visit dashboard, risk stratification, and automated tracking of admission, discharge, and transfer events. Put care management into operation and extend physician reach, fostering critical interactions with patients and valuable follow-up between appointments. Identify patients at the greatest risk of high-cost utilization using the Johns Hopkins ACG system for risk stratification, and accurately assign resources where intervention is needed most. Improve performance on quality measures, participate successfully in value-based payment programs, and optimize reimbursement.
  • 16
    Arundo Enterprise
    Arundo Enterprise is a modular, flexible software suite to create data products for people. We connect live data to machine learning and other analytical models, and model outputs to business decisions. Arundo Edge Agent enables industrial connectivity and analytics in rugged, remote, or disconnected environments. Arundo Composer allows data scientists to quickly and easily deploy desktop-based analytical models into the Arundo Fabric cloud environment with a single command. Composer also enables companies to create and manage live data streams and integrate such streams with deployed data models. Arundo Fabric is the cloud-based hub for deployed machine learning models, data streams, edge agent management, and quick navigation to extended applications. Arundo offers a portfolio of high ROI SaaS products. Each of these solutions comes with a core out-of-the-box functional capability that leverages the core strengths of Arundo Enterprise.
  • 17
    Peak DSP (by Edge 226)
    Edge 226 is a global provider of data-driven tech solutions, focused on providing its clients with smart tools for quality and transparent user acquisition. Edge's leading product is Peak DSP, a performance-driven DSP that enables programmatic buying for quality user acquisition and re-engagement. Peak DSP offers:
    • An AI-driven algorithm that optimizes and predicts install and post-install events: registrations, subscriptions, purchases, or any other action
    • Data-based targeting with Lookalike Audiences, External User Data, and Audience Match
    • Direct integrations: owned & operated and direct apps, mobile device manufacturers and carrier-based supply, and over 35 of the world's top SSPs
    • All verticals and environments: gaming, shopping, utilities, sports, and other campaigns across in-app, mobile web, and desktop
    • Multiple creative types: rewarded video, playable ads, banners, native ads & text ads, HTML/rich media, and JavaScript tags
  • 18
    MX (MX Technologies)
    MX helps financial institutions and fintechs utilize their data more effectively to outperform the competition in a rapidly evolving industry. Our solutions enable clients to quickly and easily collect, enhance, analyze, present, and act on their financial data. MX puts a user’s data on center stage, molding it into a cohesive, intelligible, and interactive visualization. As a result, users engage more often and more deeply with your digital banking products. The Helios cross-platform framework gives MX clients the ability to offer mobile banking across a range of platforms and device types — all built from a single C++ codebase. This dramatically lowers maintenance costs and powers agile development.
  • 19
    Sigma (Sigma Computing)
    Sigma is a modern business intelligence (BI) and analytics application built for the cloud. Trusted by data-first companies, Sigma provides live access to cloud data warehouses using an intuitive spreadsheet interface empowering business experts to ask more of their data without writing a single line of code. With the full power of SQL, the cloud, and a familiar interface, business users have the freedom to analyze data in real time without limits. Sigma is self-service analytics as it was meant to be.
  • 20
    Pickaxe (Pickaxe Foundry)
    Give your business the power of hundreds of data scientists and analysts. AI-powered data analytics that anyone can use and understand. Stop spending all your time pulling data to explain what has happened, and instead focus on building a persuasive story of what you should do next. Pickaxe does it all for you, in real time, with AI-powered dashboards and deep human insights. Your data platform can tell you ‘what’ is happening, but can it also tell you ‘so what’ and ‘now what’?
  • 21
    SafeGraph
    Unlock innovation with the most accurate points-of-interest (POI) data, business listings, and store visitor insights for commercial places in the U.S. Business listing and building footprint data cover every place people spend money in the U.S. (~5MM POIs), including major retail chains, shopping malls, convenience stores, airports, and more. Store visitor analytics, foot-traffic counts, and demographic insights are available per POI. The data can answer questions such as: how often do people visit stores, where did they come from, and where else do they shop? Seamlessly integrate your existing POI data with SafeGraph's enriched Places data. Business category, open hours, visit counts, popular times, and more are associated with each place. The top 5,000+ brands are mapped to over 1MM POIs. Noisy locations (ATMs, Redbox kiosks, etc.) are removed, closed stores are filtered out, and irrelevant businesses (like home LLCs with no employees) are kept out.
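    As a hypothetical sketch of the kind of question mentioned above ("how often do people visit stores"), the snippet below joins a places extract to a foot-traffic extract with pandas; the file names and column names are invented for illustration and do not reflect SafeGraph's actual schema.

```python
# Hypothetical sketch: ranking brands by foot traffic from POI + visit extracts.
# File names and columns are invented; they are not SafeGraph's real schema.
import pandas as pd

poi = pd.read_csv("poi_sample.csv")        # e.g. columns: placekey, brand, category
visits = pd.read_csv("visits_sample.csv")  # e.g. columns: placekey, week, visit_count

# Join visit counts to their places, then total weekly visits per brand.
weekly = (
    visits.merge(poi, on="placekey")
          .groupby(["brand", "week"], as_index=False)["visit_count"].sum()
          .sort_values("visit_count", ascending=False)
)
print(weekly.head())
```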
  • 22
    BryteFlow
    BryteFlow builds highly efficient automated environments for analytics. It turns Amazon S3 into a powerful analytics platform by leveraging the AWS ecosystem intelligently to deliver data at high speed, complementing AWS Lake Formation and automating the modern data architecture for performance and productivity. You can completely automate data ingestion with BryteFlow Ingest's simple point-and-click interface, while BryteFlow XL Ingest handles the initial full ingest of very large datasets; no coding is needed. With BryteFlow Blend you can merge data from varied sources such as Oracle, SQL Server, Salesforce, and SAP and transform it to make it ready for analytics and machine learning. BryteFlow TruData reconciles the data at the destination with the source, continually or at a frequency you select; if data is missing or incomplete, you get an alert so you can fix the issue easily.
  • 23
    Edge Intelligence
    Start benefiting your business within minutes of installation. Learn how our system works. It's the fastest, easiest way to analyze vast amounts of geographically distributed data. A new approach to analytics. Overcome the architectural constraints associated with traditional big data warehouses, database design and edge computing architectures. Understand details within the platform that allow for centralized command & control, automated software installation & orchestration and geographically distributed data input & storage.
  • 24
    Intelligent Artifacts
    A new category of AI. Most current AI solutions are engineered through a statistical and purely mathematical lens; we took a different approach. Building on discoveries in information theory, the team at Intelligent Artifacts has created a new category of AI: a true AGI that eliminates the shortcomings of current machine intelligence. Our framework keeps the data and application layers separate from the intelligence layer, allowing it to learn in real time and to explain predictions down to root cause. A true AGI demands a truly integrated platform. With Intelligent Artifacts, you model information, not data: predictions and decisions are real-time and transparent and can be deployed across various domains without rewriting code. And by combining specialized AI consultants with our dynamic platform, you get a customized solution that rapidly offers deep insights and better outcomes from your data.
  • 25
    HEAVY.AI
    HEAVY.AI is the pioneer in accelerated analytics. The HEAVY.AI platform is used in business and government to find insights in data beyond the limits of mainstream analytics tools. Harnessing the massive parallelism of modern CPU and GPU hardware, the platform is available in the cloud and on-premises. HEAVY.AI originated from research at Harvard and the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Expand beyond the limitations of traditional BI and GIS by leveraging the full power of modern GPU and CPU hardware so you can extract decision-quality information from massive datasets without lag. Unify and explore your largest geospatial and time-series datasets to get the complete picture of the what, when, and where. Combine interactive visual analytics, hardware-accelerated SQL, and an advanced analytics and data science framework to find the opportunity and risk hidden in your enterprise when you need it most.
  • 26
    Hadoop (Apache Software Foundation)
    The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than relying on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, delivering a highly available service on top of a cluster of computers, each of which may be prone to failures. A wide variety of companies and organizations use Hadoop for both research and production, and users are encouraged to add themselves to the Hadoop PoweredBy wiki page. Apache Hadoop 3.3.4 incorporates a number of significant enhancements over the previous major release line (hadoop-3.2).
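    To ground the "simple programming models" claim, here is a minimal Hadoop Streaming word count written in Python; the input/output paths and the exact streaming-jar location are placeholders and will differ per installation.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming word count (illustrative; paths are placeholders).
# Submit with something like:
#   hadoop jar hadoop-streaming-*.jar -files wordcount.py \
#       -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#       -input /data/in -output /data/out
import sys


def mapper():
    # Emit one "<word>\t1" record per token; Hadoop sorts by key between phases.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")


def reducer():
    # Input arrives grouped by key, so a running sum per word is enough.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current and current is not None:
            print(f"{current}\t{total}")
            total = 0
        current = word
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")


if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```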
  • 27
    Apache Spark (Apache Software Foundation)
    Apache Spark™ is a unified analytics engine for large-scale data processing. Apache Spark achieves high performance for both batch and streaming data, using a state-of-the-art DAG scheduler, a query optimizer, and a physical execution engine. Spark offers over 80 high-level operators that make it easy to build parallel apps. And you can use it interactively from the Scala, Python, R, and SQL shells. Spark powers a stack of libraries including SQL and DataFrames, MLlib for machine learning, GraphX, and Spark Streaming. You can combine these libraries seamlessly in the same application. Spark runs on Hadoop, Apache Mesos, Kubernetes, standalone, or in the cloud. It can access diverse data sources. You can run Spark using its standalone cluster mode, on EC2, on Hadoop YARN, on Mesos, or on Kubernetes. Access data in HDFS, Alluxio, Apache Cassandra, Apache HBase, Apache Hive, and hundreds of other data sources.
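    As a small illustration of mixing DataFrame operators with SQL in one application, here is a PySpark sketch; the input path and the column names (timestamp, event_type) are placeholders.

```python
# Illustrative PySpark job: DataFrame operators and Spark SQL in one application.
# The input path and column names (timestamp, event_type) are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-demo").getOrCreate()

events = spark.read.json("s3://my-bucket/events/")  # schema inferred on read

# High-level DataFrame operators: derive a day column, then count per day/type.
daily = (
    events.withColumn("day", F.to_date("timestamp"))
          .groupBy("day", "event_type")
          .count()
)

# The same data queried interactively through Spark SQL.
events.createOrReplaceTempView("events")
top = spark.sql(
    "SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type ORDER BY n DESC"
)

daily.show()
top.show()
spark.stop()
```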
  • 28
    Incorta
    Direct is the shortest path from data to insight. Incorta empowers everyone in your business with a true self-service data experience and breakthrough performance for better decisions and incredible results. What if you could bypass fragile ETL and expensive data warehouses, and deliver data projects in days, instead of weeks or months? Our direct approach to analytics delivers true self-service in the cloud or on-premises with agility and performance. Incorta is used by the world’s largest brands to succeed where other analytics solutions fail. Across multiple industries and lines of business, we boast connectors and pre-built solutions for your enterprise applications and technologies. Game-changing innovation and customer success happen through Incorta’s partners including Microsoft, AWS, eCapital, and Wipro. Explore or join our thriving partner ecosystem.
  • 29
    Amazon EMR
    Amazon EMR is the industry-leading cloud big data platform for processing vast amounts of data using open-source tools such as Apache Spark, Apache Hive, Apache HBase, Apache Flink, Apache Hudi, and Presto. With EMR, you can run petabyte-scale analysis at less than half the cost of traditional on-premises solutions and over 3x faster than standard Apache Spark. For short-running jobs, you can spin clusters up and down and pay per second for the instances used. For long-running workloads, you can create highly available clusters that automatically scale to meet demand. If you have existing on-premises deployments of open-source tools such as Apache Spark and Apache Hive, you can also run EMR clusters on AWS Outposts. Analyze data using open-source ML frameworks such as Apache Spark MLlib, TensorFlow, and Apache MXNet. Connect to Amazon SageMaker Studio for large-scale model training, analysis, and reporting.
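    The spin-up/spin-down pattern for short-running jobs can be scripted with the AWS SDK; below is a hedged boto3 sketch that launches a transient cluster, runs one Spark step, and lets the cluster terminate itself. The region, EMR release label, instance types, and S3 script path are placeholders.

```python
# Hypothetical sketch: a transient EMR cluster that runs one Spark step and
# terminates itself. Region, release label, instance types, and the S3 script
# path are placeholders.
import boto3

emr = boto3.client("emr", region_name="eu-central-1")

response = emr.run_job_flow(
    Name="nightly-spark-job",
    ReleaseLabel="emr-6.10.0",                 # pick a current EMR release
    Applications=[{"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": False,  # auto-terminate after the step
    },
    Steps=[{
        "Name": "spark-etl",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/jobs/etl.py"],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster:", response["JobFlowId"])
```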
  • 30
    Kraken (Big Squid)
    Kraken is for everyone from analysts to data scientists. Built to be the easiest-to-use, no-code automated machine learning platform. The Kraken no-code automated machine learning (AutoML) platform simplifies and automates data science tasks like data prep, data cleaning, algorithm selection, model training, and model deployment. Kraken was built with analysts and engineers in mind. If you've done data analysis before, you're ready! Kraken's no-code, easy-to-use interface and integrated SONAR© training make it easy to become a citizen data scientist. Advanced features allow data scientists to work faster and more efficiently. Whether you use Excel or flat files for day-to-day reporting or just ad-hoc analysis and exports, drag-and-drop CSV upload and the Amazon S3 connector in Kraken make it easy to start building models with a few clicks. Data Connectors in Kraken allow you to connect to your favorite data warehouse, business intelligence tools, and cloud storage.
    Starting Price: $100 per month