Compare the Top API Gateways that integrate with Hadoop as of November 2024

This is a list of API Gateways that integrate with Hadoop. Use the filters on the left to further narrow the results to products that integrate with Hadoop. View the products that work with Hadoop in the table below.

What are API Gateways for Hadoop?

An API gateway is a server or node that sits between clients and backend services or microservice endpoints. API gateways are used for API management: the gateway receives, manages, and processes incoming API requests and routes them to the appropriate microservices. Another way to think of an API gateway is as the software layer that sits in front of an API and handles requests on its behalf. Compare and read user reviews of the best API Gateways for Hadoop currently available using the table below. This list is updated regularly.
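The routing role described above can be sketched in a few lines: a single front door that maps incoming request paths to backend microservice endpoints. The service names and URLs below are illustrative placeholders, not drawn from any product on this list.

```python
# Minimal sketch of the API-gateway idea: one entry point that decides
# which backend microservice should handle each incoming request path.
# Backend names and URLs are hypothetical examples.

BACKENDS = {
    "/users": "http://user-service:8080",
    "/orders": "http://order-service:8080",
}

def route(path: str) -> str:
    """Return the backend URL a gateway would forward this request to."""
    for prefix, backend in BACKENDS.items():
        if path.startswith(prefix):
            return backend + path
    raise LookupError(f"no backend registered for {path}")

print(route("/users/42"))
```

Real gateways layer authentication, rate limiting, and request rewriting on top of this basic dispatch step, but the core job is the same: clients see one endpoint, and the gateway fans requests out to the right service.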

  • 1
    SnapLogic

    SnapLogic

Quickly ramp up, learn, and use SnapLogic to create multi-point, enterprise-wide app and data integrations. Easily expose and manage pipeline APIs that extend your world. Eliminate slower, manual, error-prone methods and deliver faster results for business processes such as customer onboarding, employee onboarding and off-boarding, quote to cash, ERP SKU forecasting, support ticket creation, and more. Monitor, manage, secure, and govern your data pipelines, application integrations, and API calls, all from a single pane of glass. Launch automated workflows for any department, across your enterprise, in minutes, not days. To deliver superior employee experiences, the SnapLogic platform can bring together employee data across all your enterprise HR apps and data stores. Learn how SnapLogic can help you quickly set up seamless experiences powered by automated processes.
  • 2
    Apache Knox

    Apache Software Foundation

The Knox API Gateway is designed as a reverse proxy, with pluggability both in policy enforcement (through providers) and in the backend services for which it proxies requests. Policy enforcement covers authentication/federation, authorization, auditing, dispatch, host mapping, and content rewrite rules. Policy is enforced through a chain of providers defined within the topology deployment descriptor for each Apache Hadoop cluster gated by Knox. The cluster definition is also part of the topology deployment descriptor and gives the Knox Gateway the layout of the cluster for routing and for translation between user-facing URLs and cluster internals. Each Apache Hadoop cluster protected by Knox has its own set of REST APIs represented by a single, cluster-specific application context path. This allows the Knox Gateway to both protect multiple clusters and present REST API consumers with a single endpoint.
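As a rough illustration of the topology deployment descriptor mentioned above, the XML below sketches the general shape of one: a `gateway` section chaining providers for policy enforcement, and `service` entries describing the cluster layout. Hostnames, ports, and the specific provider shown are placeholder assumptions; exact provider names and parameters vary by Knox version and deployment.

```xml
<!-- Sketch of a Knox topology descriptor; values here are illustrative. -->
<topology>
  <gateway>
    <!-- Providers form the policy-enforcement chain for this cluster. -->
    <provider>
      <role>authentication</role>
      <name>ShiroProvider</name>
      <enabled>true</enabled>
    </provider>
  </gateway>
  <!-- Service entries map the cluster's internal REST endpoints. -->
  <service>
    <role>WEBHDFS</role>
    <url>http://namenode.example.com:50070/webhdfs</url>
  </service>
</topology>
```

A descriptor like this is what gives each cluster its single application context path: consumers would reach the example service above through one gateway URL prefixed with the topology name, while Knox handles routing and URL translation to the cluster internals.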