Best Data Management Software - Page 99

Compare the Top Data Management Software as of February 2026 - Page 99

  • 1
    Qlik Gold Client
Qlik Gold Client improves the efficiency, reduces the cost, and strengthens the security of managing test data in SAP environments. It is designed to eliminate development workarounds by easily moving configuration, master, and transactional data subsets into testing environments. Rapidly define, copy, and synchronize transactional data from production to non-production targets. Identify, select, and delete non-production data. Manage extensive and powerful data transformations through a clean, easy-to-use interface. Automate data selection and enable hands-free test data refresh cycles, reducing the time and effort of test data management. Qlik Gold Client provides several options to protect PII in non-production environments via data masking, which applies a set of rules to “scramble” your production data when it is replicated to a non-production environment.
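The rule-based “scrambling” described above can be pictured with a short, hypothetical sketch. This is not the Qlik Gold Client API; it assumes a simple dict-per-row data model and only illustrates the general idea of deterministic masking.

```python
import hashlib

# Hypothetical rule-based masker: illustrates deterministically
# "scrambling" PII when copying production rows to a non-production
# target. NOT the Qlik Gold Client API.
def mask_email(value: str) -> str:
    # A deterministic hash maps the same input to the same output,
    # which preserves referential integrity across replicated tables.
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"user_{digest}@example.com"

def mask_row(row: dict, rules: dict) -> dict:
    # Apply a masking rule where one exists; pass other fields through.
    return {k: rules.get(k, lambda v: v)(v) for k, v in row.items()}

rules = {"email": mask_email}
row = {"id": 42, "email": "jane.doe@corp.com"}
masked = mask_row(row, rules)
```

Because the masking is deterministic, the same production value always maps to the same masked value, so joins between masked tables still line up.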
  • 2
    Tungsten Transformation

    Tungsten Automation

    Classify large volumes of documents and accurately extract information. Tungsten Transformation accelerates business processes by replacing manual document classification, separation and extraction with touchless processing, speeding you along on your digital workflow transformation journey. Automate the understanding of any document type and the data on those documents for later processing or storage. Realize efficiencies in document capture processes and avoid costly integrations utilizing the Tungsten Capture and Tungsten Transformation system. Increase productivity and accelerate business processes by removing the need for manual document classification, separation and extraction. Process more transactions easily and efficiently and improve the flow of information throughout your organization.
  • 3
    Cloudera Data Platform
    Unlock the potential of private and public clouds with the only hybrid data platform for modern data architectures with data anywhere. Cloudera is a hybrid data platform designed for unmatched freedom to choose—any cloud, any analytics, any data. Cloudera delivers faster and easier data management and data analytics for data anywhere, with optimal performance, scalability, and security. With Cloudera you get all the advantages of private cloud and public cloud for faster time to value and increased IT control. Cloudera provides the freedom to securely move data, applications, and users bi-directionally between the data center and multiple data clouds, regardless of where your data lives.
  • 4
    Datametica

At Datametica, our suite of bird-named products helps eliminate business risk, cost, time, frustration, and anxiety from the entire process of migrating a data warehouse to the cloud. Migrate your existing data warehouse, data lake, ETL, and enterprise business intelligence to the cloud environment of your choice using Datametica's automated product suite, which architects an end-to-end migration strategy with workload discovery, assessment, planning, and cloud optimization. Starting from discovery and assessment of your existing data warehouse through migration planning, Eagle gives clarity on what needs to be migrated and in what sequence, how the process can be streamlined, and what the timelines and costs are. This holistic view of the workloads and planning reduces migration risk without impacting the business.
  • 5
    Varada

Varada’s dynamic and adaptive big data indexing solution lets you balance performance and cost with zero data-ops. Varada’s unique big data indexing technology serves as a smart acceleration layer on your data lake, which remains the single source of truth, and runs in the customer’s cloud environment (VPC). Varada enables data teams to democratize data by operationalizing the entire data lake while ensuring interactive performance, without the need to move data, model it, or manually optimize. Our secret sauce is the ability to automatically and dynamically index relevant data, at the structure and granularity of the source. Varada enables any query to meet continuously evolving performance and concurrency requirements for users and analytics API calls, while keeping costs predictable and under control. The platform seamlessly chooses which queries to accelerate and which data to index, and elastically adjusts the cluster to meet demand while optimizing cost and performance.
  • 6
    Data Lakes on AWS
    Many Amazon Web Services (AWS) customers require a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is a new and increasingly popular way to store and analyze data because it allows companies to manage multiple data types from a wide variety of sources, and store this data, structured and unstructured, in a centralized repository. The AWS Cloud provides many of the building blocks required to help customers implement a secure, flexible, and cost-effective data lake. These include AWS managed services that help ingest, store, find, process, and analyze both structured and unstructured data. To support our customers as they build data lakes, AWS offers the data lake solution, which is an automated reference implementation that deploys a highly available, cost-effective data lake architecture on the AWS Cloud along with a user-friendly console for searching and requesting datasets.
  • 7
    Infor Data Lake
Solving today’s enterprise and industry challenges requires big data. The ability to capture data from across your enterprise—whether generated by disparate applications, people, or IoT infrastructure—offers tremendous potential. Infor’s Data Lake tools deliver schema-on-read intelligence along with a fast, flexible data consumption framework to enable new ways of making key decisions. With leveraged access to your entire Infor ecosystem, you can start capturing and delivering big data to power your next generation analytics and machine learning strategies. Infinitely scalable, the Infor Data Lake provides a unified repository for capturing all of your enterprise data. Grow with your insights and investments, ingest more content for better informed decisions, improve your analytics profiles, and provide rich data sets to build more powerful machine learning processes.
  • 8
    AWS Lake Formation
    AWS Lake Formation is a service that makes it easy to set up a secure data lake in days. A data lake is a centralized, curated, and secured repository that stores all your data, both in its original form and prepared for analysis. A data lake lets you break down data silos and combine different types of analytics to gain insights and guide better business decisions. Setting up and managing data lakes today involves a lot of manual, complicated, and time-consuming tasks. This work includes loading data from diverse sources, monitoring those data flows, setting up partitions, turning on encryption and managing keys, defining transformation jobs and monitoring their operation, reorganizing data into a columnar format, deduplicating redundant data, and matching linked records. Once data has been loaded into the data lake, you need to grant fine-grained access to datasets, and audit access over time across a wide range of analytics and machine learning (ML) tools and services.
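Two of the manual tasks listed above, deduplicating redundant data and matching linked records, can be sketched in generic Python. This is an illustration of the concepts, not the AWS API; the sample records are hypothetical.

```python
# Generic sketch (not the AWS API) of deduplicating redundant rows
# and matching linked records on a blocking key.
records = [
    {"name": "Ann Lee", "email": "ann@example.com"},
    {"name": "Ann Lee", "email": "ann@example.com"},   # exact duplicate
    {"name": "A. Lee",  "email": "ann@example.com"},   # linked record
]

# Deduplicate on the full row contents.
unique = [dict(t) for t in {tuple(sorted(r.items())) for r in records}]

# Match linked records by grouping on a blocking key (the email address).
by_email: dict = {}
for r in unique:
    by_email.setdefault(r["email"], []).append(r)

matched = by_email["ann@example.com"]  # both spellings of the same person
```

Real record matching uses fuzzier similarity measures than an exact key, but the block-then-compare structure is the same.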
  • 9
    Oracle Cloud Infrastructure Data Lakehouse
    A data lakehouse is a modern, open architecture that enables you to store, understand, and analyze all your data. It combines the power and richness of data warehouses with the breadth and flexibility of the most popular open source data technologies you use today. A data lakehouse can be built from the ground up on Oracle Cloud Infrastructure (OCI) to work with the latest AI frameworks and prebuilt AI services like Oracle’s language service. Data Flow is a serverless Spark service that enables our customers to focus on their Spark workloads with zero infrastructure concepts. Oracle customers want to build advanced, machine learning-based analytics over their Oracle SaaS data, or any SaaS data. Our easy- to-use data integration connectors for Oracle SaaS, make creating a lakehouse to analyze all data with your SaaS data easy and reduces time to solution.
  • 10
    LeadEnrich

LeadEnrich helps companies build and enrich prospect data. Don't get locked into annual contracts with data providers. LeadEnrich is the easiest way to build validated prospect lists and clean up customer data. Our real-time list-building and enrichment solution is a pay-as-you-go model. LeadEnrich saves teams hundreds of hours otherwise spent manually building and cleaning prospecting lists.
    Starting Price: $299 per month
  • 11
    Hammerspace

    Hammerspace is a revolutionary storage platform that unlocks unused local NVMe storage in GPU servers to accelerate AI training and checkpointing. It transforms siloed, stranded storage into a shared, ultra-fast tier that dramatically increases GPU utilization and reduces the need for costly external storage systems. By using a standards-based parallel file system, Hammerspace delivers low-latency, high-throughput data access that scales to thousands of GPU servers. The platform helps cut power consumption and infrastructure costs while boosting AI workload performance. Leading organizations like Meta rely on Hammerspace to optimize their AI infrastructure. With easy deployment and rapid scaling, Hammerspace enables teams to get AI models trained faster and more efficiently.
  • 12
    NavigatorCRE

View your portfolio of assets from 30,000ft down to a single-tenant lease. Connect PM data and asset data seamlessly. Understand your global footprint, location intelligence, options, expirations, and headcount. Manage critical dates, headcount, and upcoming obligations. A comprehensive database for your deals, comps, and market intelligence, with pitch tools and client-driven lease administration. Track current projects and future sites alongside market trends and analytics to have the full view of your projects throughout the build process. The commercial real estate business is one of the oldest, most profitable, expansive, and important industries in the world. However, it is fraught with provinciality and limited innovation, which has been to its detriment. But this is not necessarily CRE's fault: the technology industry has not served the industry with enterprise platforms that can take on the immense amounts of property, tenant, transaction, and operational data.
  • 13
    Lido

    Connect, analyze, and visualize all of your data in a single spreadsheet. Connect your data with clicks, not code. Easily join and populate company data from 20+ of the most popular databases and SaaS applications, including Facebook, Google, and Snowflake. Create dashboards you want to share. Say goodbye to ugly charts and hours of formatting. Instantly tie data across multiple sources. Just like Excel & Google Sheets. Slice and dice your data into different views. Build a dashboard in less time than it takes to create a Jira ticket. We proxy the requests sent to external databases to monitor things and send your credentials securely over the server-side. None of the data returned by your database or third-party SaaS integrations are stored on Lido's servers. We encrypt all data, which means your in-transit data is encrypted with TLS and your at-rest data is encrypted with AES-256 making it unreadable to outside people.
  • 14
    doolytic

doolytic is leading the way in big data discovery, the convergence of data discovery, advanced analytics, and big data. doolytic is rallying expert BI users to the revolution in self-service exploration of big data, revealing the data scientist in all of us. doolytic is an enterprise software solution for native discovery on big data, based on best-of-breed, scalable, open-source technologies. Lightning performance on billions of records and petabytes of data. Structured, unstructured, and real-time data from any source. Sophisticated advanced query capabilities for expert users, and integration with R for advanced and predictive applications. Search, analyze, and visualize data from any format, any source in real time with the flexibility of Elastic. Leverage the power of Hadoop data lakes with no latency and concurrency issues. doolytic solves common BI problems and enables big data discovery without clumsy and inefficient workarounds.
  • 15
    DryvIQ

    Gain deep and robust insight into your unstructured enterprise data to gauge risk, mitigate threats and vulnerabilities, while enabling better business decisions. Classify, label and organize unstructured data at enterprise scale. Enable rapid, accurate and detailed identification of sensitive and high-risk files and provide deep insight via A.I. Enable continuous visibility across both new and existing unstructured data. Enforce policy, compliance and governance decisions without reliance upon manual input from users. Expose dark data while automatically classifying and organizing sensitive and other content groups at scale—so you can make intelligent decisions on where and how to migrate that data. The platform also enables both simple and advanced file transfers across virtually any cloud service, network file system or legacy ECM platform, at scale.
  • 16
    Salt

    Salt Security

The Salt Security API Security Platform protects APIs across their full lifecycle – build, deploy and runtime phases. Only Salt can capture and baseline all API traffic – all calls and responses – over days, weeks, even months. Salt uses this rich context to detect the reconnaissance activity of bad actors and block them before they can reach their objective. The Salt API Context Engine (ACE) architecture discovers all APIs, pinpoints and stops API attackers, and provides remediation insights learned during runtime to harden APIs. Only Salt applies cloud-scale big data to address API security challenges. Salt applies its AI and ML algorithms, which have been in the market for more than four years, to provide real-time analysis and correlation across billions of API calls. That level of context is essential for rich discovery, accurate data classification, and the ability to identify and stop “low and slow” API attacks, which occur over time. On-premises solutions simply lack the data.
  • 17
    reciTAL

    reciTAL is an Artificial Intelligence software editor. First Intelligent Document Processing player with a Deep Tech label, reciTAL automates your extraction, classification and search processes, for all types of document and email flows. At any time, you can re-train a model taking into consideration user feedback. The reciTAL team guides you through deployment in your internal Kubernetes or via Docker Compose. Basic business rules are then implemented in a few minutes to configure your data points. Depending on the level of confidence reached, the extracted data are validated or not by an operator. The configuration of a new type of document is done with unparalleled simplicity and speed. Validated data is used for continuous performance improvement.
  • 18
    SHREWD Platform

    Transforming Systems

Harness your whole system’s data with ease, with our SHREWD Platform tools and open APIs. SHREWD Platform provides the integration and data collection tools the SHREWD modules operate from. The tools aggregate data, storing it in our secure, UK-based data lake. This data is then accessed by the SHREWD modules or an API, to transform the data into meaningful information with targeted functions. Data can be ingested by SHREWD Platform in almost any format, from analog records in spreadsheets to digital systems via APIs. The system’s open API can also allow third-party connections to use the information held in the data lake, if required. SHREWD Platform provides an operational data layer that is a single source of truth in real time, allowing the SHREWD modules to provide intelligent insights, and managers and key decision-makers to take the right action at the right time.
  • 19
    PILTYIX

    An AI technology company dedicated to solutions that generate revenue, save time, and reduce costs for universities and sports & entertainment organizations. Sales teams exist to maximize your profitability. In an age of spam blockers, reduced budgets, and workforce reductions, teams need an edge to stay motivated and ensure the most motivated and at-risk customers don’t get lost in an unprioritized pile of leads. PILTYIX sports and entertainment sales solutions ensure your teams are concentrating on the right business at the right time, maintaining their maximum efficiency and garnering the highest possible revenue in the shortest period of time. Your best sales prospects are constantly moving targets, their interest levels vacillate based on team performance, life circumstances, and financial priorities. These real-time data changes influence your prospect’s likelihood to purchase.
  • 20
    Amadea

    ISoft

Amadea relies on the fastest real-time calculation and modeling engine on the market, processing 10 million lines per second per core for standard calculations. Speed up the creation, deployment, and automation of your analytics projects within a single integrated environment. Data quality is the key to analytical projects; thanks to the ISoft real-time calculation engine, Amadea allows companies to prepare and use massive and/or complex data in real time, regardless of the volume. ISoft started from a simple observation: successful analytical projects must involve business users at every stage. Founded on a no-code interface accessible to all types of users, Amadea allows everyone involved in analytical projects to take part, and lets you specify, prototype, and build your data applications simultaneously.
  • 21
    IBM InfoSphere Optim Data Privacy
    IBM InfoSphere® Optim™ Data Privacy provides extensive capabilities to effectively mask sensitive data across non-production environments, such as development, testing, QA or training. To protect confidential data this single offering provides a variety of transformation techniques that substitute sensitive information with realistic, fully functional masked data. Examples of masking techniques include substrings, arithmetic expressions, random or sequential number generation, date aging, and concatenation. The contextually accurate masking capabilities help masked data retain a similar format to the original information. Apply a range of masking techniques on-demand to transform personally-identifying information and confidential corporate data in applications, databases and reports. Data masking features help you to prevent misuse of information by masking, obfuscating, and privatizing personal information that is disseminated across non-production environments.
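The masking techniques named above (substrings, random number generation, date aging, concatenation) are generic patterns that can be illustrated briefly. The sketch below is plain Python, assuming hypothetical field formats; it is not InfoSphere Optim syntax.

```python
import random
from datetime import date, timedelta

# Illustrations of common masking techniques; NOT InfoSphere Optim syntax.
def substring_mask(ssn: str) -> str:
    # Substring + concatenation: keep the last 4 digits, blank the rest,
    # so the masked value retains a format similar to the original.
    return "***-**-" + ssn[-4:]

def age_date(d: date, days: int = 90) -> date:
    # Date aging: shift by a fixed offset so intervals between dates
    # in the masked data stay realistic.
    return d + timedelta(days=days)

def random_account(rng: random.Random) -> str:
    # Random number generation: replace an account number with a
    # realistic-looking 10-digit value.
    return f"{rng.randrange(10**9, 10**10)}"

print(substring_mask("123-45-6789"))  # ***-**-6789
```

Each technique keeps the masked data fully functional for testing (same lengths, formats, and plausible date ranges) while removing the real values.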
  • 22
    IBM Sterling Fulfillment Optimizer
    IBM Sterling Fulfillment Optimizer with Watson is a cognitive analytic engine that enhances existing order management systems. It provides a “big data brain” to order management and inventory visibility systems that are already in place with retailers who have eCommerce fulfillment capability. With Fulfillment Optimizer, retailers are better able to understand and act on changes in the market as they occur, to perfectly balance protecting margins, utilizing store capacity and meeting delivery expectations. These sourcing decisions can dramatically increase profits, especially during peak periods. Know the impact of omnichannel decisions across eCommerce, merchandising, logistics, store operations and supply chain. Intelligently balance omnichannel fulfillment costs against service to protect margins, utilize store capacity and meet customer delivery expectations. Easily execute optimized omnichannel fulfillment plans at the lowest cost-to-serve.
  • 23
    IBM App Discovery Delivery Intelligence
    IBM® Application Discovery and Delivery Intelligence (ADDI) is an analytical platform for application modernization. It uses cognitive technologies to analyze mainframe applications and quickly discover and understand the interdependencies of changes. Enhance mainframe applications rapidly to help drive revenue. Gain back your investment and start reaping returns quickly. Better understand application complexity to anticipate issues and cut app/development costs. Evolve systems and applications rapidly to take advantage of hybrid cloud. Accelerate your digital transformation with application insights that increase productivity and reduce risk. Use rapid analysis capabilities to discover the relationships between the components of your IBM z/OS® applications and understand the impact of potential change. Quickly find business rules, code snippets and APIs so that you can leverage them to support your business processes.
  • 24
    IBM StoredIQ Suite
    IBM StoredIQ® Suite helps you address the problems that challenge your data discovery, records management, compliance activities, storage optimization and data migration initiatives. By providing an in-depth and in-place unstructured data assessment, this software gives organizations visibility into data to make more informed business decisions. IBM StoredIQ for Legal provides an organized, systemic approach that streamlines electronic discovery (eDiscovery) for legal stakeholders. Automate policy with IBM StoredIQ Policy, bundled with the StoredIQ Suite. In-place data management enables an organization to discover, recognize, and act on unstructured data without moving it to a repository or specialty application. StoredIQ provides a powerful search function designed to accelerate the understanding of large amounts of unstructured content. Get a simplified and detailed analysis of large amounts of corporate data.
  • 25
    IBM Storage Archive
    IBM Storage Archive software in the IBM Storage portfolio provides a graphical interface to manage data in tape drives and libraries using the Linear Tape File System (LTFS) format standard for reading, writing and exchanging descriptive metadata on formatted tape cartridges. IBM Storage Archive gives you direct, intuitive and graphical access to data stored on tape while eliminating the need for additional tape management and software to access data. Storage Archive offers three software solutions for managing your digital files with the LTFS format: Single Drive Edition, Library Edition and Enterprise Edition.
  • 26
    IBM Event Streams
    IBM Event Streams is a fully managed event streaming platform built on Apache Kafka, designed to help enterprises process and respond to real-time data streams. With capabilities for machine learning integration, high availability, and secure cloud deployment, it enables organizations to create intelligent applications that react to events as they happen. The platform supports multi-cloud environments, disaster recovery, and geo-replication, making it ideal for mission-critical workloads. IBM Event Streams simplifies building and scaling real-time, event-driven solutions, ensuring data is processed quickly and efficiently.
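Since Event Streams is built on Kafka, the core delivery model is key-based partitioning: events with the same key land on the same partition and are consumed in order. A minimal sketch of that routing rule follows; Kafka's real default partitioner uses a murmur2 hash, and the stdlib CRC32 here is purely for illustration.

```python
import zlib

# Sketch of Kafka-style key-based partition routing. Kafka's default
# partitioner hashes with murmur2; CRC32 is used here only to
# illustrate the "same key -> same partition" guarantee.
def partition_for(key: bytes, num_partitions: int) -> int:
    return zlib.crc32(key) % num_partitions

orders = [b"customer-17", b"customer-17", b"customer-42"]
parts = [partition_for(k, num_partitions=6) for k in orders]
# Both events for customer-17 route to the same partition, so a
# consumer sees that customer's events in the order they were produced.
```

This per-key ordering is what lets event-driven applications treat a partition as an ordered log for each entity.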
  • 27
    IBM Master Data Management on Cloud
Using both IBM Master Data Management (MDM) on Cloud for customer data and for product data, you can confidently trust, govern and act upon the data with the extensibility of flexible deployment models. Work faster, scale easier and reduce costs. This fully managed offering helps you accelerate experimentation and innovation without admin and infrastructure hassles. Reduce initial cost for new projects and lower risk with a cost-effective entry point and subscription licensing options. Faster provisioning and deployment of Master Data Management let you devote more resources to new solutions and innovation. IBM Master Data Management on Cloud helps you maintain uptime with optimized, built-in high availability and improved ease of use. MDM's fully customizable, built-in hardware and software infrastructure is based on proven IBM Spectrum® Protect technology. Move all or part of your MDM workload to the cloud.
  • 28
    IBM Db2 Event Store
IBM Db2 Event Store is a cloud-native database system that is designed to handle massive amounts of structured data that is stored in Apache Parquet format. Because it is optimized for event-driven data processing and analysis, this high-speed data store can capture, analyze, and store more than 250 billion events per day. The data store is flexible and scalable to adapt quickly to your changing business needs. With the Db2 Event Store service, you can create these data stores in your Cloud Pak for Data cluster so that you can govern the data and use it for more in-depth analysis. It suits workloads that need to rapidly ingest large amounts of streaming data (up to one million inserts per second per node) and use it for real-time analytics with integrated machine learning capabilities, for example, analyzing incoming data from different medical devices in real time to provide better health outcomes for patients while saving the cost of moving the data to storage.
  • 29
    IBM Industry Models
An industry data model from IBM acts as a blueprint with common elements based on best practices, government regulations and the complex data and analytic needs of the industry. A model can help you manage data warehouses and data lakes to gather deeper insights for better decisions. The models include warehouse design models, business terminology and business intelligence templates in a predesigned framework for an industry-specific organization to accelerate your analytics journey. Analyze and design functional requirements faster using industry-specific information infrastructures. Create and rationalize data warehouses using a consistent architecture to model changing requirements. Reduce risk and deliver better data to apps across the organization to accelerate transformation. Create enterprise-wide KPIs and address compliance, reporting and analysis requirements. Use industry data model vocabularies and templates for regulatory reporting to govern your data.
  • 30
    IBM Decision Optimization Center
    IBM Decision Optimization Center provides a configurable platform to support business decision-makers such as scientists, developers, analysts, planners and schedulers. It uses powerful analytics to solve tough planning and scheduling challenges, reducing the effort, time and risk associated with tailored, business improvement solutions. Deploy the platform to make smarter decisions and improve ROI by combining data and analytics with cutting-edge optimization technology. Turn data insights into business action using prescriptive analytics capabilities. Discover fast, highly reliable solutions with IBM enterprise-class optimization. Quickly enhance the application with ready-to-use components and built-in visualization. Work with an open, standards-based architecture and easily map databases into application data tables. Manage access to application functionality based on user profiles.