Alternatives to Nixtla

Compare Nixtla alternatives for your business or organization using the curated list below. SourceForge ranks the best alternatives to Nixtla in 2025. Compare features, ratings, user reviews, pricing, and more from Nixtla competitors and alternatives in order to make an informed decision for your business.

  • 1
    Clari

    A Revenue Operations Platform that accelerates revenue results. Automated CRM updates? Check. Time series analysis? Check. But Clari is much more than innovative features. By combining revenue intelligence with forecasting and execution insights, Clari solves your real problem—efficiently and predictably hitting your targets, quarter after quarter, year after year. Purpose-built to drive more predictable revenue, Clari’s Revenue Operations Platform takes previously untapped data—from email, CRM, call logs and beyond—and turns it into execution insights for your entire revenue team. Clari backs up human intuition with AI insights, so your team can forecast with newfound accuracy and foresight—using a consistent, automated process that flexes to manage every business in your company. Harvest valuable activity data from reps, prospects and customers so you always know what’s going on in your deals, your teams, and in your business.
  • 2
    Google Cloud Timeseries Insights API
    Anomaly detection in time series data is essential for the day-to-day operation of many companies. With Timeseries Insights API Preview, you can gather insights in real-time from your time-series datasets. Get everything you need to understand your API query results, such as anomaly events, forecasted range of values, and slices of events that were examined. Stream data in real-time, making it possible to detect anomalies while they are happening. Rely on Google Cloud's end-to-end infrastructure and defense-in-depth approach to security, refined for over 15 years through consumer apps like Gmail and Search. At its core, Timeseries Insights API is fully integrated with other Google Cloud Storage services, providing you with a consistent method of access across storage products. Detect trends and anomalies with multiple event dimensions. Handle datasets consisting of tens of billions of events. Run thousands of queries per second.
  • 3
    Google Cloud Inference API
    Time-series analysis is essential for the day-to-day operation of many companies. Most popular use cases include analyzing foot traffic and conversion for retailers, detecting data anomalies, identifying correlations in real-time over sensor data, or generating high-quality recommendations. With Cloud Inference API Alpha, you can gather insights in real-time from your typed time-series datasets. Get everything you need to understand your API query results, such as groups of events that were examined, the number of groups of events, and the background probability of each returned event. Stream data in real-time, making it possible to compute correlations for real-time events. Rely on Google Cloud's end-to-end infrastructure and defense-in-depth approach to security, refined for over 15 years through consumer apps. At its core, Cloud Inference API is fully integrated with other Google Cloud Storage services.
  • 4
    Azure AI Anomaly Detector
    Foresee problems before they occur with an Azure AI anomaly detection service. Easily embed time-series anomaly detection capabilities into your apps to help users identify problems quickly. AI Anomaly Detector ingests time-series data of all types and selects the best anomaly detection algorithm for your data to ensure high accuracy. Detect spikes, dips, deviations from cyclic patterns, and trend changes through both univariate and multivariate APIs. Customize the service to detect any level of anomaly. Deploy the anomaly detection service where you need it, in the cloud or at the intelligent edge. A powerful inference engine assesses your time-series dataset and automatically selects the right anomaly detection algorithm to maximize accuracy for your scenario. Automatic detection eliminates the need for labeled training data to help you save time and stay focused on fixing problems as soon as they surface.
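    The univariate batch API can be exercised with a single REST call. Below is a minimal sketch using Python's requests; the endpoint host, key, and the v1.1 path are placeholders/assumptions and should be checked against your Anomaly Detector resource and the current REST reference.

    ```python
    import requests
    from datetime import datetime, timedelta, timezone

    # Placeholders; the v1.1 path below is an assumption based on the public REST reference.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    API_KEY = "<your-key>"

    # Build a small daily series with one obvious spike.
    start = datetime(2024, 1, 1, tzinfo=timezone.utc)
    series = [
        {"timestamp": (start + timedelta(days=i)).isoformat(), "value": 30.0 + (i % 7)}
        for i in range(28)
    ]
    series[20]["value"] = 95.0  # injected anomaly

    resp = requests.post(
        f"{ENDPOINT}/anomalydetector/v1.1/timeseries/entire/detect",
        headers={"Ocp-Apim-Subscription-Key": API_KEY},
        json={"series": series, "granularity": "daily"},
    )
    resp.raise_for_status()
    print(resp.json().get("isAnomaly"))  # one boolean per input point
    ```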
  • 5
    Amazon Forecast
    Amazon Forecast is a fully managed service that uses machine learning to deliver highly accurate forecasts. Companies today use everything from simple spreadsheets to complex financial planning software to attempt to accurately forecast future business outcomes such as product demand, resource needs, or financial performance. These tools build forecasts by looking at a historical series of data, which is called time series data. For example, such tools may try to predict the future sales of a raincoat by looking only at its previous sales data with the underlying assumption that the future is determined by the past. This approach can struggle to produce accurate forecasts for large sets of data that have irregular trends. Also, it fails to easily combine data series that change over time (such as price, discounts, web traffic, and number of employees) with relevant independent variables like product features and store locations.
  • 6
    Warp 10
    Warp 10 is a modular open source platform that collects, stores, and analyzes data from sensors. Shaped for the IoT with a flexible data model, Warp 10 provides a unique and powerful framework to simplify your processes from data collection to analysis and visualization, with support for geolocated data in its core model (called Geo Time Series). Warp 10 is both a time series database and a powerful analytics environment, allowing you to perform statistics, feature extraction for model training, data filtering and cleaning, pattern and anomaly detection, synchronization, and even forecasting. The analysis environment can be implemented within a large ecosystem of software components such as Spark, Kafka Streams, Hadoop, Jupyter, Zeppelin, and many more. It can also access data stored in many existing solutions: relational or NoSQL databases, search engines, and S3-type object storage systems.
  • 7
    Shapelets

    Powerful computing at your fingertips. Parallel computing, groundbreaking algorithms, so what are you waiting for? Designed to empower data scientists in business. Get the fastest computing in an all-inclusive time-series platform. Shapelets provides you with analytical features such as causality, discord and motif discovery, forecasting, clustering, etc. Run, extend, and integrate your own algorithms into the Shapelets platform to make the most of Big Data analysis. Shapelets integrates seamlessly with any data collection and storage solution. It also integrates with MS Office and any other visualization tool to simplify and share insights without requiring deep technical expertise. Our UI works with the server to bring you interactive visualizations. You can make the most of your metadata and represent it in the many different visual graphs provided by our modern interface. Shapelets enables users from the oil, gas, and energy industry to perform real-time analysis of operational data.
  • 8
    evoML

    TurinTech AI

    evoML accelerates the creation of production-quality machine learning models by streamlining and automating the end-to-end data science workflow, transforming raw data into actionable insights in days instead of weeks. It automates crucial steps: automatic data transformation that detects anomalies and handles imbalances, feature engineering via genetic algorithms, parallel model evaluation across thousands of candidates, multi-objective optimization on custom metrics, and GenAI-based synthetic data generation for rapid prototyping under data-privacy constraints. Users fully own and can customize the generated model code for seamless deployment as APIs, databases, or local libraries, avoiding vendor lock-in and ensuring transparent, auditable workflows. evoML empowers teams with intuitive visualizations, interactive dashboards, and charts to identify patterns, outliers, and anomalies for use cases such as anomaly detection, time-series forecasting, and fraud prevention.
  • 9
    VictoriaMetrics Anomaly Detection
    VictoriaMetrics Anomaly Detection is a service that continuously scans time series stored in VictoriaMetrics and detects unexpected changes within data patterns in real time. It does so by utilizing user-configurable machine learning models. In the dynamic and complex world of system monitoring, VictoriaMetrics Anomaly Detection, a part of our Enterprise offering, is a pivotal tool for achieving advanced observability. It empowers SREs and DevOps teams by automating the intricate task of identifying abnormal behavior in time-series data. It goes beyond traditional threshold-based alerting, utilizing machine learning techniques to detect anomalies and minimize false positives, thus reducing alert fatigue. Providing simplified alerting mechanisms atop unified anomaly scores enables teams to spot and address potential issues faster, ensuring system reliability and operational efficiency.
  • 10
    RemoteAware GenAI Analytics Platform

    New Boundary Technologies

    RemoteAware™ GenAI Analytics Platform for IoT transforms complex streams of sensor and device data into clear, actionable insights using advanced generative AI models. It ingests and normalizes high‑volume, heterogeneous IoT data, whether from edge gateways, cloud APIs, or remote assets, and applies scalable AI pipelines to detect anomalies, forecast equipment failures, and generate prescriptive recommendations in plain‑language narratives. Through a unified, web‑based dashboard, users gain real‑time visibility into key performance indicators, customizable alerts and threshold‑based notifications, and dynamic drill‑down capabilities for time‑series analysis. The platform’s generative summary reports condense vast datasets into concise operational briefs, while its root‑cause analysis and what‑if simulations guide preventive maintenance and resource allocation.
  • 11
    PipelineDB

    PipelineDB is a PostgreSQL extension for high-performance time-series aggregation, designed to power realtime reporting and analytics applications. PipelineDB allows you to define continuous SQL queries that perpetually aggregate time-series data and store only the aggregate output in regular, queryable tables. You can think of this concept as extremely high-throughput, incrementally updated materialized views that never need to be manually refreshed. Raw time-series data is never written to disk, making PipelineDB extremely efficient for aggregation workloads. Continuous queries produce their own output streams, and thus can be chained together into arbitrary networks of continuous SQL.
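    A minimal sketch of the continuous-aggregation workflow, driven from Python with psycopg2. The stream and continuous-view DDL below follows the PipelineDB 1.x extension documentation as best recalled; treat the exact syntax, names, and connection string as assumptions.

    ```python
    import psycopg2

    # Placeholder connection string for a Postgres instance with the pipelinedb extension.
    conn = psycopg2.connect("dbname=pipeline user=postgres")
    conn.autocommit = True
    cur = conn.cursor()

    # A stream to write raw events into (raw rows are never persisted), and a
    # continuous view that incrementally maintains a per-minute aggregate.
    cur.execute("CREATE FOREIGN TABLE readings_stream "
                "(sensor text, value float8, ts timestamptz) SERVER pipelinedb;")
    cur.execute("""
        CREATE VIEW readings_1m WITH (action=materialize) AS
        SELECT sensor,
               date_trunc('minute', ts) AS minute,
               avg(value) AS avg_value,
               count(*)   AS n
        FROM readings_stream
        GROUP BY sensor, minute;
    """)

    # Writing to the stream updates the aggregate; only the aggregate output is stored.
    cur.execute("INSERT INTO readings_stream VALUES ('s1', 21.4, now());")
    cur.execute("SELECT * FROM readings_1m ORDER BY minute DESC LIMIT 5;")
    print(cur.fetchall())
    ```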
  • 12
    Odyx yHat

    Odyssey Analytics

    Odyx yHat is a Time Series Forecasting tool designed to simplify the intricate field of data science, making it accessible and user-friendly for individuals without any background in data science.
    Starting Price: $300/month
  • 13
    Dominate

    BigBear.ai

    Gain decision advantage and maintain a competitive edge over adversaries with an advanced, automated capability to make sense of multiple streams of data and derive meaningful insights. Our time series forecasting engine, Dominate, analyzes economic indicators, global indices, media data, and more to enable supply chain planning and forecast probable future outcomes. Our cutting-edge approach to data preparation has been proven in some of the world's most complex environments. It uses AI/ML to leverage the relationships between more complete data elements to influence your outcomes. Multi-step, multi-factor, multi-target, autoregressive models can flexibly forecast multiple values accurately and update them on demand. Dominate provides certainty in shaping conditions to reveal unexpected insights and develop innovative courses of action. Tensor completion addresses error-prone and gap-riddled data, and the engine provides time-series forecasting, alerting, and impact analysis.
  • 14
    Azure Time Series Insights
    Azure Time Series Insights Gen2 is an open and scalable end-to-end IoT analytics service featuring best-in-class user experiences and rich APIs to integrate its powerful capabilities into your existing workflow or application. You can use it to collect, process, store, query, and visualize data at Internet of Things (IoT) scale: data that's highly contextualized and optimized for time series. Azure Time Series Insights Gen2 is designed for ad hoc data exploration and operational analysis, allowing you to uncover hidden trends, spot anomalies, and conduct root-cause analysis. It's an open and flexible offering that meets the broad needs of industrial IoT deployments.
    Starting Price: $36.208 per unit per month
  • 15
    TimescaleDB

    Tiger Data

    TimescaleDB is the leading time-series database built on PostgreSQL, designed to handle massive volumes of real-time data efficiently. It enables organizations to store, analyze, and query time-series data — such as IoT sensor data, financial transactions, or event logs — using standard SQL. With hypertables, TimescaleDB automatically partitions data by time and ID for fast ingestion and predictable query performance. Its compression engine reduces storage costs by up to 95%, while continuous aggregates make real-time dashboards instantly responsive. Fully compatible with PostgreSQL, it integrates seamlessly with existing tools and applications. TimescaleDB combines the simplicity of Postgres with the scalability and speed of a specialized analytical system.
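    A minimal sketch of the hypertable workflow from Python, assuming a local TimescaleDB instance reachable via psycopg2; the table, column names, and connection string are illustrative placeholders.

    ```python
    import psycopg2

    # Placeholder connection string for a database with the timescaledb extension installed.
    conn = psycopg2.connect("dbname=metrics user=postgres")
    conn.autocommit = True
    cur = conn.cursor()

    # A plain Postgres table, then converted into a hypertable partitioned by time.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS conditions (
            time        TIMESTAMPTZ NOT NULL,
            device_id   TEXT        NOT NULL,
            temperature DOUBLE PRECISION
        );
    """)
    cur.execute("SELECT create_hypertable('conditions', 'time', if_not_exists => TRUE);")

    # Standard SQL works as usual; time_bucket() gives per-interval aggregates,
    # the same building block continuous aggregates materialize behind the scenes.
    cur.execute("INSERT INTO conditions VALUES (now(), 'dev-1', 21.7);")
    cur.execute("""
        SELECT time_bucket('15 minutes', time) AS bucket,
               device_id,
               avg(temperature)
        FROM conditions
        GROUP BY bucket, device_id
        ORDER BY bucket DESC
        LIMIT 10;
    """)
    print(cur.fetchall())
    ```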
  • 16
    Robyn

    Meta

    Robyn is an open source, experimental Marketing Mix Modeling (MMM) package developed by Meta’s Marketing Science team. It’s designed to help advertisers and analysts build rigorous, data-driven models that quantify how different marketing channels contribute to business outcomes (like sales, conversions, or other KPIs) in a privacy-safe, aggregated way. Rather than relying on user-level tracking, Robyn analyzes historical time-series data, combining marketing spend or reach data (ads, promotions, organic efforts, etc.) with outcome metrics, to estimate incremental impact, saturation effects, and carry-over (adstock) dynamics. Under the hood, Robyn blends classical statistical methods with modern machine learning and optimization; it uses ridge regression (to regularize against multicollinearity in many-channel models), time-series decomposition to isolate trend and seasonality, and a multi-objective evolutionary algorithm.
    Starting Price: Free
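    Robyn itself is an R package, but the carry-over (adstock) idea it models can be illustrated language-agnostically. The sketch below shows geometric adstock in Python purely as a conceptual illustration; it is not Robyn's API, and the decay value is arbitrary.

    ```python
    # Conceptual illustration of geometric adstock (carry-over): each period's
    # effective media pressure is this period's spend plus a decayed fraction of
    # the previous period's adstocked value. Not Robyn's implementation or API.
    def geometric_adstock(spend, decay=0.6):
        adstocked, carry = [], 0.0
        for x in spend:
            carry = x + decay * carry
            adstocked.append(carry)
        return adstocked

    weekly_spend = [100, 0, 0, 50, 0, 0]
    print(geometric_adstock(weekly_spend))
    # A one-off burst of spend keeps contributing, at a decaying rate, in later weeks.
    ```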
  • 17
    DataGen

    DataGen is a leading AI platform specializing in synthetic data generation and custom generative AI models for machine learning projects. Their flagship product, SynthEngyne, supports multi-format data generation including text, images, tabular, and time-series data, ensuring privacy-compliant, high-quality training datasets. The platform offers scalable, real-time processing and advanced quality controls like deduplication to maintain dataset fidelity. DataGen also provides professional AI development services such as model deployment, fine-tuning, synthetic data consulting, and intelligent automation systems. With flexible pricing plans ranging from free tiers for individuals to custom enterprise solutions, DataGen caters to a wide range of users. Their solutions serve diverse industries including healthcare, finance, automotive, and retail.
  • 18
    Tangent Works

    Drive business value from predictive analytics. Make informed decisions and improve processes. Create predictive models in seconds for faster and better forecasting & anomaly detection. TIM InstantML is a hyper-automated, augmented machine learning solution for time series data for better, faster, and more accurate forecasting, anomaly detection, and classification. TIM helps you to discover the business value of your data and enables you to leverage the power of predictive analytics. High-quality automatic feature engineering while simultaneously adapting the model structure and model parameters. TIM offers flexible deployment options. Easy integration with some of your favorite platforms. TIM offers a wide array of interfaces. Users looking for a streamlined graphical interface can find this in TIM Studio. Become truly data-driven with powerful, automated predictive analytics. Discover the predictive value in your data faster and easier.
    Starting Price: €3.20 per month
  • 19
    BigObject

    At the heart of our innovation is in-data computing, a technology designed to process large amounts of data efficiently. Our flagship product, BigObject, embodies this core technology: a time series database developed for high-speed storage and analysis of massive data, able to handle non-stop data streams quickly and continuously. It boasts excellent performance and powerful complex query capabilities. Extending the relational data structure to a time-series model, it uses in-data computing to optimize database performance. Our core technology is an abstract model in which all data is kept in an infinite and persistent memory space for both storage and computing.
  • 20
    Tiger Data

    Tiger Data is the creator of TimescaleDB, the world’s leading PostgreSQL-based time-series and analytics database. It provides a modern data platform purpose-built for developers, devices, and AI agents. Designed to extend PostgreSQL beyond traditional limits, Tiger Data offers built-in primitives for time-series data, search, materialization, and scale. With features like auto-partitioning, hybrid storage, and compression, it helps teams query billions of rows in milliseconds while cutting infrastructure costs. Tiger Cloud delivers these capabilities as a fully managed, elastic environment with enterprise-grade security and compliance. Trusted by innovators like Cloudflare, Toyota, Polymarket, and Hugging Face, Tiger Data powers real-time analytics, observability, and intelligent automation across industries.
    Starting Price: $30 per month
  • 21
    Autobox

    Automatic Forecasting Systems

    Autobox is simply the easiest way to forecast. Designed with both the novice and expert forecaster in mind, you can load your data and forecast like a pro. No matter what method you currently use to forecast, Autobox will improve your ability to forecast accurately. Autobox was named the best dedicated forecasting program in the Principles of Forecasting textbook (now a website). AFS's unique approach doesn't try to shoehorn the data into a model or a limited number of models; it allows Autobox to combine history and causal data in an optimal way, incorporating level shifts, local time trends, pulses, and seasonal pulses when needed. Autobox discovers new causal variables by gleaning patterns from historical forecast errors and outliers identified by the Autobox engine. Many cases reveal causal variables you may not have even known existed, e.g., promotions, holidays, day-of-the-week effects, and many others.
  • 22
    Aquatic Informatics

    Aquatic Informatics provides software solutions that address critical water data management, analytics, and compliance challenges for the rapidly growing water industry. From raindrop to sewer discharge, our interconnected data management platforms drive the efficient management of water information to protect human health and reduce environmental impact. AQUARIUS is analytics software for the natural environment for acquiring, processing, modelling, and publishing water information in real time to enable agencies with efficient, accurate, and defensible water data. The AQUARIUS suite includes:
    - AQUARIUS Time-Series for efficient, accurate, and defensible water data
    - AQUARIUS Samples for easy sample management
    - AQUARIUS WebPortal for real-time data publishing online
    - AQUARIUS Forecast for simpler workflows with complex models
    - AQUARIUS EnviroSCADA for real-time water data acquisition
    - AQUARIUS Cloud for all the power of AQUARIUS in a SaaS environment
  • 23
    Dewesoft Historian
    Historian is a database software service for long-term and permanent monitoring. It provides storage in an InfluxDB time-series database for long-term and permanent monitoring applications. Monitor your vibration, temperature, inclination, strain, pressure, and other data with a self-hosted or fully cloud-managed service. The standard OPC UA protocol is supported for data access and integration into our DewesoftX data acquisition software or into SCADAs, ERPs, or any other OPC UA clients. Data is stored in a state-of-the-art open-source InfluxDB database. InfluxDB is an open-source time-series database developed by InfluxData. It is written in Go and optimized for fast, high-availability storage and retrieval of time series data in fields such as operations monitoring, application metrics, Internet of Things sensor data, and real-time analytics. The Historian service can be installed locally on the measurement unit or on your local intranet, or we can provide a fully cloud-managed service.
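    Since data access goes through standard OPC UA, any OPC UA client library can read values. A minimal sketch using the python-opcua package; the endpoint URL and node identifier are hypothetical and would need to be discovered by browsing the Historian's actual OPC UA address space.

    ```python
    from opcua import Client  # python-opcua package

    # Placeholder endpoint and node id; browse the server to find real identifiers.
    client = Client("opc.tcp://historian.local:4840")
    client.connect()
    try:
        node = client.get_node("ns=2;s=Channels.Vibration1")  # hypothetical node
        print(node.get_value())  # current value of the monitored channel
    finally:
        client.disconnect()
    ```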
  • 24
    kdb Insights
    kdb Insights is a cloud-native, high-performance analytics platform designed for real-time analysis of both streaming and historical data. It enables intelligent decision-making regardless of data volume or velocity, offering unmatched price and performance, and delivering analytics up to 100 times faster at 10% of the cost compared to other solutions. The platform supports interactive data visualization through real-time dashboards, facilitating instantaneous insights and decision-making. It also integrates machine learning models to predict, cluster, detect patterns, and score structured data, enhancing AI capabilities on time-series datasets. With supreme scalability, kdb Insights handles extensive real-time and historical data, proven at volumes of up to 110 terabytes per day. Its quick setup and simple data intake accelerate time-to-value, and it offers native support for q, SQL, and Python, along with compatibility with other languages via RESTful APIs.
  • 25
    EDAMS Environment & Government

    Hydro-Comp Enterprises

    The EDAMS Government Environmental Management system is ideal for Agriculture, Natural Resources, and Environment Ministries. It manages geospatial information, time-series measurements, licenses, permits, and applications, as well as quality data for water, ground, and air. Integration is enabled at the database, business process, and transaction levels, avoiding data duplication and enabling demand management. EDAMS products have their own embedded GIS but also provide direct links to ESRI ArcGIS, Quantum GIS (QGIS), and SuperMap GIS. The EDAMS system is modular, allowing scalable implementation to suit the growth and capacity enhancement of the organisation.
  • 26
    dataPARC Historian
    Get better performance from your time-series data with the dataPARC Historian, an enterprise-grade solution designed to elevate industrial data operations to new heights. The dataPARC Historian simplifies complex data sharing, ensuring seamless collaboration across departments with enhanced security and performance. Its compatibility with external AI, ML, and cloud applications via an open architecture increases adaptability and strategic insights. Experience rapid data access, enriched manufacturing intelligence, and a platform that grows with your business needs. dataPARC historian is the smart choice for enterprises aiming for the forefront of operational excellence. The dataPARC historian is more than a data historian; it's a gateway to unlocking greater flexibility and efficiency in leveraging your time-series data. It's crafted for those who demand reliability, speed, and intuitive use, making every data interaction impactful.
  • 27
    HEAVY.AI

    HEAVY.AI is the pioneer in accelerated analytics. The HEAVY.AI platform is used in business and government to find insights in data beyond the limits of mainstream analytics tools. Harnessing the massive parallelism of modern CPU and GPU hardware, the platform is available in the cloud and on-premise. HEAVY.AI originated from research at Harvard and the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Expand beyond the limitations of traditional BI and GIS by leveraging the full power of modern GPU and CPU hardware so you can extract decision-quality information from your massive datasets without lag. Unify and explore your largest geospatial and time-series datasets to get the complete picture of the what, when, and where. Combine interactive visual analytics, hardware-accelerated SQL, and an advanced analytics & data science framework to find opportunity and risk hidden in your enterprise when you need it most.
  • 28
    Yottamine

    Our highly innovative machine learning technology is designed specifically to accurately predict financial time series where only a small number of training data points are available. Advanced AI is computationally demanding. YottamineAI leverages the cloud to eliminate the need to invest time and money in managing hardware, significantly shortening the time to benefit and improving ROI. Strong encryption and protection of keys ensure trade secrets stay safe. We follow AWS best practices and utilize strong encryption to secure your data. We evaluate how your existing or future data can generate predictive analytics to help you make information-based decisions. If you need predictive analytics on a project basis, Yottamine Consulting Services provides project-based consulting to accommodate your data-mining needs.
  • 29
    Azure AI Metrics Advisor
    Embed AI-powered monitoring features to stay one step ahead of incidents; no machine learning expertise is required. Monitor the performance of your organization's growth engines, including sales revenue and manufacturing operations, with Azure AI Metrics Advisor, built on AI Anomaly Detector and part of Azure AI Services. Quickly identify and fix problems through a powerful combination of near-real-time monitoring, models that adapt to your scenario, and granular analysis with diagnostics and alerting. End-to-end management of data monitoring is easily done through the AI Metrics Advisor interface, which plugs into popular time-series databases and provides stream monitoring support. All dimension combinations are analyzed to pinpoint affected areas for root-cause analysis and diagnosis, and alerts are sent. A guided autotuning experience helps you customize the service to your unique needs.
    Starting Price: $0.75 per 1,000 time series
  • 30
    Amazon Timestream
    Amazon Timestream is a fast, scalable, and serverless time series database service for IoT and operational applications that makes it easy to store and analyze trillions of events per day up to 1,000 times faster and at as little as 1/10th the cost of relational databases. Amazon Timestream saves you time and cost in managing the lifecycle of time series data by keeping recent data in memory and moving historical data to a cost optimized storage tier based upon user defined policies. Amazon Timestream’s purpose-built query engine lets you access and analyze recent and historical data together, without needing to specify explicitly in the query whether the data resides in the in-memory or cost-optimized tier. Amazon Timestream has built-in time series analytics functions, helping you identify trends and patterns in your data in near real-time.
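    A minimal sketch of writing and querying a few records with boto3, assuming a Timestream database and table already exist; the database, table, region, and measure names below are placeholders.

    ```python
    import time
    import boto3

    # Placeholder names; both the database and table must already exist in Timestream.
    DB, TABLE = "iot_metrics", "device_temperature"

    write = boto3.client("timestream-write", region_name="us-east-1")
    query = boto3.client("timestream-query", region_name="us-east-1")

    # Write one record; Time defaults to milliseconds since epoch.
    write.write_records(
        DatabaseName=DB,
        TableName=TABLE,
        Records=[{
            "Dimensions": [{"Name": "device_id", "Value": "dev-1"}],
            "MeasureName": "temperature",
            "MeasureValue": "21.7",
            "MeasureValueType": "DOUBLE",
            "Time": str(int(time.time() * 1000)),
        }],
    )

    # The query engine spans the in-memory and magnetic tiers transparently.
    result = query.query(
        QueryString=f'SELECT device_id, avg(measure_value::double) AS avg_temp '
                    f'FROM "{DB}"."{TABLE}" WHERE time > ago(1h) GROUP BY device_id'
    )
    print(result["Rows"])
    ```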
  • 31
    Seeq

    Seeq Corporation

    Seeq is the first application dedicated to process data analytics. Search your data, add context, cleanse, model, find patterns, establish boundaries, monitor assets, collaborate in real time, and interact with time series data like never before. Whatever your process historian or operational data system of record – the OSIsoft® PI System®, Honeywell's Uniformance® PHD, Emerson DeltaV and Ovation, Inductive Automation's Ignition, AspenTech IP.21, Wonderware, GE Proficy, or any other – Seeq can connect and get you working in minutes. In the current hype around predictive analytics, machine learning, and data science, what's missing are solutions to the real challenges of becoming an analytics-driven organization: tapping the expertise of your current employees, supporting collaboration and knowledge capture to foster sharing and reuse of analytics efforts, and the ability to rapidly distribute insights to the people who need them to quickly improve outcomes.
    Starting Price: $1000.00/year/user
  • 32
    Bloomfilter

    The first step to improving a process is to understand it. Bloomfilter uses process mining to analyze time-series data from your entire stack and build a model of your team's unique software development process – and then helps you optimize it. Visualize cost breakdown between build, run, and maintain. While software development is both an art and a science, data-driven teams ship more often and innovate faster. Translate sprints and points into dollars and cents. Understand where your process breaks down, and get best practices on how to improve it. Increase predictability when scoping work and estimating when new features will ship. Get objective analysis on your development process in a language everyone understands.
  • 33
    Prescient AI

    Uncover under-performing campaigns so you can quickly shift ad spend based on your simulation results and maintain your overall budget. We use a probabilistic and windowless measurement that leverages time-series data from your ecommerce platform and marketing channels to get to the heart of what moved the needle. Using your brand's historical data, forecast the revenue and ROAS of your campaigns at any adjusted spend amount before you change anything. Connect all of your channels in just 10 minutes with our easy, one-time onboarding. You'll get actionable insights 48 hours later. Marketers can quantify and benchmark the impact of cross-channel awareness on organic search, direct traffic, and other campaigns.
  • 34
    Uptimon

    Uptimon is a monitoring SaaS that tracks website and API availability 24/7, detecting outages within seconds and sending instant alerts. It performs HTTP(S) pings at customizable intervals (30 seconds to 60 minutes), checking response time, status codes, keyword presence, and SSL certificate validity. When a site goes down, it instantly notifies via email, Discord, Telegram, webhook, or SMS. The dashboard displays real-time status, last-30-day uptime percentage, average response times, and performance graphs. The technical stack includes a Node.js/Go backend, a Redis queue, horizontal scaling support, and time-series databases like TimescaleDB. The target audience ranges from freelancers to enterprises. Key differentiators include simple setup (3 clicks), fast detection time, and affordable pricing. It uses a freemium model with paid tiers based on monitor count and check frequency. Growth features include shareable status badges, public status pages, and partner integrations with Vercel/Netlify.
    Starting Price: $9/month
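    To make the checks concrete, here is a conceptual Python sketch of the kind of probe such a monitor performs (status code, response time, keyword presence, SSL expiry). This is an illustration only, not Uptimon's implementation; the URL and keyword are placeholders.

    ```python
    import socket
    import ssl
    import time
    import requests

    # Placeholders for the monitored target and the keyword expected in the body.
    URL, KEYWORD = "https://example.com", "Example Domain"

    # HTTP check: status code, response time, and keyword presence.
    start = time.monotonic()
    resp = requests.get(URL, timeout=10)
    elapsed_ms = (time.monotonic() - start) * 1000

    # SSL check: days until the certificate expires.
    host = URL.split("//", 1)[1].split("/", 1)[0]
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((host, 443), timeout=10),
                         server_hostname=host) as s:
        expires_ts = ssl.cert_time_to_seconds(s.getpeercert()["notAfter"])

    print({
        "status_ok": resp.status_code < 400,
        "keyword_ok": KEYWORD in resp.text,
        "response_ms": round(elapsed_ms),
        "ssl_days_left": int((expires_ts - time.time()) / 86400),
    })
    ```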
  • 35
    OpenTSDB

    OpenTSDB consists of a Time Series Daemon (TSD) as well as a set of command-line utilities. Interaction with OpenTSDB is primarily achieved by running one or more of the independent TSDs. There is no master and no shared state, so you can run as many TSDs as required to handle any load you throw at them. Each TSD uses the open source database HBase or the hosted Google Bigtable service to store and retrieve time-series data. The data schema is highly optimized for fast aggregations of similar time series to minimize storage space. Users of the TSD never need to access the underlying store directly. You can communicate with the TSD via a simple telnet-style protocol, an HTTP API, or a simple built-in GUI. The first step in using OpenTSDB is to send time series data to the TSDs. A number of tools exist to pull data from various sources into OpenTSDB.
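    A minimal sketch of pushing a data point over the HTTP API with Python's requests, assuming a TSD listening on the default port 4242; the metric and tag names are illustrative.

    ```python
    import time
    import requests

    # A TSD's HTTP API accepts JSON datapoints at /api/put.
    TSD = "http://localhost:4242"

    datapoint = {
        "metric": "sys.cpu.user",
        "timestamp": int(time.time()),          # seconds (or milliseconds) since epoch
        "value": 42.5,
        "tags": {"host": "web01", "dc": "lga"}, # at least one tag is required
    }

    resp = requests.post(f"{TSD}/api/put?details", json=datapoint)
    resp.raise_for_status()
    print(resp.status_code)  # 200 with a summary body when ?details is passed, else 204
    ```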
  • 36
    FactoryTalk Historian

    Rockwell Automation

    It is time to retire the worn-out clipboards and tedious transcription of critical plant performance data. FactoryTalk® Historian software captures operational process data from multiple sources at lightning speed. You gain a supreme level of supervisory control, performance monitoring, and quality assurance with robust FactoryTalk Historian software that can scale from machine to enterprise. Reliably recording time-series records at this pace would be impossible, even for your most caffeinated plant-floor record keeper. FactoryTalk Historian dashboards take that work off your hands. Plus, the added confidence in forecasting trends with reliable data will generate a new level of productivity. With FactoryTalk Historian Site Edition (SE), data across your plant and enterprise can't hide. Redundancy and high availability ensure continuous access to key plant data.
  • 37
    Kibana

    Elastic

    Kibana is a free and open user interface that lets you visualize your Elasticsearch data and navigate the Elastic Stack. Do anything from tracking query load to understanding the way requests flow through your apps. Kibana gives you the freedom to select the way you give shape to your data. With its interactive visualizations, start with one question and see where it leads you. Kibana core ships with the classics: histograms, line graphs, pie charts, sunbursts, and more. And, of course, you can search across all of your documents. Leverage Elastic Maps to explore location data, or get creative and visualize custom layers and vector shapes. Perform advanced time series analysis on your Elasticsearch data with our curated time series UIs. Describe queries, transformations, and visualizations with powerful, easy-to-learn expressions.
  • 38
    KDB.AI

    KX Systems

    KDB.AI is a powerful knowledge-based vector database and search engine that allows developers to build scalable, reliable and real-time applications by providing advanced search, recommendation and personalization for AI applications. Vector databases are a new wave of data management designed for generative AI, IoT and time-series applications. Here's why they matter, what makes them different, how they work, the new use cases they're designed for, and how to get started.
  • 39
    AI CERTs

    AI CERTs offers role-based, industry-relevant certification programs in artificial intelligence and blockchain, designed to make AI education accessible to both technical and non-technical individuals. It provides a broad catalog of learning paths and credentials. Among its offerings is the “AI+ Developer” certification, which guides learners through Python programming, data processing, machine learning, deep learning, NLP, computer vision, reinforcement learning, time-series analysis, model explainability, and cloud deployment, providing hands-on projects, labs, and an online proctored exam to validate competence. It also supports flexible modes of learning: self-paced or instructor-led, enabling learners to adapt their study to their schedule. AI CERTs aims to close the AI skills gap by offering updated curricula designed by industry and academic experts, continuously revised to match emerging industry needs.
    Starting Price: Free
  • 40
    ZeusDB

    ZeusDB is a next-generation, high-performance data platform designed to handle the demands of modern analytics, machine learning, real-time insights, and hybrid data workloads. It supports vector, structured, and time-series data in one unified engine, allowing recommendation systems, semantic search, retrieval-augmented generation pipelines, live dashboards, and ML model serving to operate from a single store. The platform delivers ultra-low latency querying and real-time analytics, eliminating the need for separate databases or caching layers. Developers and data engineers can extend functionality with Rust or Python logic, deploy on-premises, hybrid, or cloud, and operate under GitOps/CI-CD patterns with observability built in. With built-in vector indexing (e.g., HNSW), metadata filtering, and powerful query semantics, ZeusDB enables similarity search, hybrid retrieval, filtering, and rapid application iteration.
  • 41
    Altair Panopticon
    Altair Panopticon Streaming Analytics lets business users and engineers — the people closest to the action — build, modify, and deploy sophisticated event processing and data visualization applications with a drag-and-drop interface. They can connect to virtually any data source, including real-time streaming feeds and time-series databases, develop complex stream processing programs, and design visual user interfaces that give them the perspectives they need to make insightful, fully-informed decisions based on massive amounts of fast-changing data.
    Starting Price: $1000.00/one-time/user
  • 42
    Oxla

    Purpose-built for compute, memory, and storage efficiency, Oxla is a self-hosted data warehouse optimized for large-scale, low-latency analytics with robust time-series support. Cloud data warehouses aren’t for everyone. At scale, long-term cloud compute costs outweigh short-term infrastructure savings, and regulated industries require full control over data beyond VPC and BYOC deployments. Oxla outperforms both legacy and cloud warehouses through efficiency, enabling scale for growing datasets with predictable costs, on-prem or in any cloud. Easily deploy, run, and maintain Oxla with Docker and YAML to power diverse workloads in a single, self-hosted data warehouse.
    Starting Price: $50 per CPU core / monthly
  • 43
    CUBOT

    Vizualytics

    We created a business intelligence platform for business users to learn from data. CUBOT Business Intelligence is a single platform to integrate your data and to provide intelligence to those who need it. It imports ETL-processed tables to create data models. In CUBOT Designer, an analyst can join transaction tables with relevant master tables to connect information across silos. CUBOT is designed to facilitate the use of organizational data. It is easy to set up scheduling actions and configure metrics that will ultimately be used to track and improve business growth. It allows variables to be defined by an analyst. CUBOT Configurator performs data aggregations, calculates measures, bins dimension values, and more. Tell CUBOT if you have geographic or time-series data so that you can do more with it later on. View figures across data attributes and drill down to the finest level of granularity.
  • 44
    Eagle.io

    Transform your data into actionable insights with eagle.io. Designed for system integrators and consultants, eagle.io helps you turn time-series data into actionable intelligence. Acquire data in real-time from any data logger or text file, transform data automatically using processing and logic, receive alerts for critical events, and share access with your clients. That's why eagle.io is trusted by some of the world's biggest companies to help them better understand their natural assets and environmental conditions in real time.
  • 45
    Plutoshift

    Plutoshift's Operational Data Platform (ODP) delivers automated performance monitoring and predictive analysis for physical infrastructure to drive efficiency, reduce cost and waste, and help build more resilient sustainable businesses. Extracting and processing raw data for analysis is often the biggest challenge to delivering value. Plutoshift ingests, processes, and curates all your infrastructure, business, and even third-party data, creating a foundation for generating insights. We automate and maintain the entire data pipeline without the need for additional IT resources. Plutoshift provides organizations with an always-on view of their operations at the workflow, site, and enterprise level. Operators have an immediate view of trends and KPIs for every process and asset. With configurable alerts, response tracking, and contextual reports, teams can quickly identify problems, prevent downtime, and uncover the root cause of issues.
  • 46
    Waylay

    Modular IoT platform providing best-of-breed OEM technology for back-end development and operations, enabling accelerated IoT solution delivery at scale. Advanced rule logic modeling, execution and lifecycle management. Automate any data workflow, from the simple to the complex. The Waylay platform is built from the ground up to natively cope with the multiple data patterns of IoT, OT and IT. Leverage streaming and time series analytics within the same collaborative intelligence platform. Accelerate the time to market of your IoT solutions by easily delivering self-service and KPI-centric apps to non-developer teams. Find out what automation tools are best suited to your IoT use case, then test them against the benchmark. IoT application development is fundamentally different from “normal” IT development. It requires bridging the physical world of Operations Technology (OT) with sensors, actuators and gateways to the digital world of Information Technology (IT) with databases.
  • 47
    Solcast

    Solcast is a cloud-based solar and weather data software built to provide high-resolution, bankable historical, live, and forecast solar irradiance, weather, and photovoltaic (PV) power data for renewable energy applications, delivered through a scalable REST API and developer tools. It tracks real-time satellite cloud cover and blends that with advanced weather models to deliver solar irradiance forecasts from five minutes up to 14 days ahead with high spatial and temporal resolution, and also supplies historical time series from 2007 to seven days ago for performance analysis. Solcast supports multiple PV power forecast models (Rooftop PV, Advanced PV, and Premium PV) that estimate actuals and future output for solar assets of any scale, and its API returns data in JSON or CSV formats suitable for integration into energy software, analytics platforms, or custom workflows. Developers can access this data using native HTTP clients or SDKs (such as Python and C#) with code examples.
  • 48
    Proficy Historian
    Proficy Historian is a best-in-class historian software solution that collects industrial time-series and A&E data at very high speed, stores it efficiently and securely, distributes it, and allows for fast retrieval and analysis, driving greater business value. With decades of experience and thousands of successful customer installations around the world, Proficy Historian changes the way companies perform and compete by making data available for asset and process performance analysis. The most recent Proficy Historian enhances usability, configurability, and maintainability with significant architectural improvements. Take advantage of the solution's simple yet powerful features to unlock new value from your equipment, process data, and business models. Features include remote collector management through the UX and horizontal scalability that enables enterprise-wide data visibility.
  • 49
    Sider Scan

    Sider Scan is a lightning-fast duplicate code detection tool for software developers that finds and continuously monitors problems with code duplication. GitLab CI/CD, GitHub Actions, Jenkins & CircleCI® integration. Installation using a Docker image. Easy team sharing of the analysis details. Continuous and fast analysis that runs in the background. Dedicated product support via email and phone. Sider Scan enhances long-term code quality and maintenance processes with in-depth duplicate code analysis. It's designed to complement other analysis tools, helping teams to produce cleaner code and supporting continuous delivery. Sider finds duplicate blocks of code in your project and groups them. For each pair of duplicates, a diff library is created and pattern analyses are initiated to determine if there are any problems. This is referred to as the 'pattern' method of analysis. Time-series analysis is only possible when the scan is consistently run at regular intervals.
  • 50
    Narrator

    Narrator is a fundamentally new approach to data, answering any question without having to build a new model or update SQL. Other data companies help you build and manage models and tables. Good for them, but we eliminate that need altogether. Narrator models each business concept once from any source in your warehouse, in a single time-series table. Each takes five minutes and, once built, does not have to change for new questions. New data questions or metrics never require new data prep. Every business concept can be queried, plotted, explored, and related together, all without SQL, using any data in your warehouse. Security at Narrator is not an afterthought; we design and build for it through every stage of our process. Encourage your team to ask an infinite number of follow-up questions, creating a culture that feeds curiosity and liberates the business to ask, iterate, and innovate.