How API Gateways Unlock Cloud Native Development

By Community Team

This article explores the benefits and challenges of going cloud native and examines them through the lens of API gateway technology.

Why cloud native development?

Adopting cloud platforms and moving to a cloud native development paradigm introduces changes and challenges to both the practices and the technologies that traditional software development companies employ. Moving to the cloud is often motivated by the desire to increase deployment frequency and enable faster but safer code, test, ship, and run sequences. In addition to speed (both in getting new features to end users and in developer workflows and feedback loops), the cloud allows for agility and scalability in increasingly dynamic environments.

“With the right solutions, development can fly in the cloud and quickly deliver tangible value. That is why we created Ambassador.”

Daniel Bryant
Head of Developer Relations at Ambassador Labs

The cloud has a lot going for it, but adopting cloud native poses new challenges across roles. Architects need to understand the changes imposed by the underlying hardware and new infrastructure. Developers and QA specialists need to understand container and cloud technologies and interact with the underlying infrastructure platforms. And platform engineers need to build and operate a supporting platform to enable developers to code, test, ship, and run applications with speed and safety.

Gateway to speed: Why move to the cloud?

Establishing cloud native goals first will help you stay on track and ensure that the cloud is the right place for you to be. Most organizations focus these goals on improving some or all of their DORA (or Accelerate) metrics, which DevOps teams use to understand and measure performance. These performance metrics – deployment frequency (DF), lead time for changes (LT), mean time to recovery (MTTR), and change failure rate (CFR) – underpin some of the key benefits of cloud native development.

The ability to rapidly ship new software to customers – both for feature releases and incident resolution – adds tremendous value that can be easily understood throughout the organization, from the C-level to the product, engineering and support teams. Ultimately these kinds of returns justify most organizations’ moves to the cloud.

Several elements contribute to getting the platform abstractions right and creating a self-service mindset within development teams. Here, we’ll look at the API gateway as one such area. In this use case, two key personas are involved: the platform engineer, who sets up the platform to minimize incidents and maximize security, and the developer, who wants to code and release services and functionality quickly while being able to configure API endpoints dynamically.

What is an API gateway?

First things first: What is an API gateway? And why are we focusing on it? For one thing, we live in an API-driven world, and the API gateway is the front door to your applications and systems. The API gateway routes and secures every single user request, meaning it must be highly performant, secure, reliable, and easily configurable.

In parallel with the evolution of API gateway technology, cloud native practices and technologies, such as continuous delivery, Kubernetes, and HTTP/3, have emerged, ushering in the era of the cloud-native API gateway.

About the Ambassador Edge Stack cloud native API gateway

Ambassador Edge Stack meets the demand for a Kubernetes-native API gateway and was designed to simplify delivering secure, high-performance microservices traffic management at scale. It supports both the platform engineer and the developer, along with the use cases they care about, and currently powers some of the world’s largest Kubernetes installations.

With Ambassador Edge Stack, we embraced the widely adopted Kubernetes Resource Model (KRM), which enables all of the API gateway functionality to be configured via Custom Resources and applied to a cluster in the same manner as any other Kubernetes configuration, for example through build pipelines or a GitOps continuous delivery process.

Platform engineers can configure the core API gateway functionality using resources like Listener, Host, and TLSContext. They can also provide a range of authentication and authorization options (using OIDC, JWT, etc.) and rate limiting using the Filter resources.
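
As a rough illustration – a minimal sketch using the getambassador.io/v3alpha1 API, with placeholder names and hostnames, so check the Edge Stack documentation for your version – a platform engineer might define the HTTPS entry point and a TLS-terminated domain like this:

# Listener: which port and protocol the gateway accepts traffic on
apiVersion: getambassador.io/v3alpha1
kind: Listener
metadata:
  name: https-listener          # placeholder name
  namespace: ambassador
spec:
  port: 8443
  protocol: HTTPS
  securityModel: XFP            # use X-Forwarded-Proto to decide secure vs. insecure routing
  hostBinding:
    namespace:
      from: ALL                 # let Hosts from any namespace attach to this Listener
---
# Host: which domain is served and which TLS secret terminates it
apiVersion: getambassador.io/v3alpha1
kind: Host
metadata:
  name: example-host            # placeholder name
  namespace: ambassador
spec:
  hostname: api.example.com     # placeholder domain
  tlsSecret:
    name: api-example-com-tls   # Kubernetes Secret holding the TLS certificate

Because these are ordinary Kubernetes resources, they can be version-controlled and applied through the same pipelines as the rest of the cluster configuration.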

Independently from this – although appropriately coupled at runtime – developers can launch new services and APIs using the Mapping resource. They can also augment their API endpoints with required authn/authz policy and rate limiting using the FilterPolicy and RateLimit custom resources.
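
For example – again a minimal sketch with a hypothetical service name – a developer could expose a new API endpoint with a Mapping like the following, and then attach authn/authz or rate-limiting policy to the same routes with FilterPolicy and RateLimit resources:

# Mapping: route requests for a URL prefix to a Kubernetes Service
apiVersion: getambassador.io/v3alpha1
kind: Mapping
metadata:
  name: quote-backend           # hypothetical example service
  namespace: default
spec:
  hostname: api.example.com     # must match a Host managed by the platform team
  prefix: /quote/               # public URL prefix for this API
  service: quote:80             # Kubernetes Service (and port) that serves the API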

A step-by-step path to the cloud native journey

What other considerations inform decisions around selecting a cloud native API gateway?

Service discovery: Monoliths, microservices, and meshes

One of the biggest questions in adopting a cloud native approach to service connectivity and communication is: “What technology – API gateway or service mesh – should I use to manage how microservice-based applications interact with each other?” The answer isn’t completely cut and dried. The two technologies differ in how they work and should be evaluated from the end user’s perspective, i.e., how to achieve a successful API call within a specific environment. Prospective users must understand the differences and similarities between the two technologies to determine when to use one instead of the other, or both together.

Balancing the load

Load balancing deals with distributing network traffic among multiple backend services in the most efficient way to ensure scalability and availability. In Kubernetes, there are various choices for load balancing external traffic to pods, each with different tradeoffs. Understanding the various load balancing strategies and implementations helps you make the right choice and get started.
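
As a simple, hedged example of one common option (names below are placeholders), exposing an API gateway through a Service of type LoadBalancer asks the cloud provider to provision an external load balancer in front of the gateway pods:

apiVersion: v1
kind: Service
metadata:
  name: edge-stack              # placeholder; fronts the API gateway pods
  namespace: ambassador
spec:
  type: LoadBalancer            # cloud provider provisions an external load balancer
  externalTrafficPolicy: Local  # preserves client source IPs; routes only to nodes running gateway pods
  selector:
    app.kubernetes.io/name: edge-stack
  ports:
    - name: https
      port: 443                 # external port
      targetPort: 8443          # container port on the gateway pods

Alternatives such as NodePort services or provider-specific ingress integrations shift the tradeoff between simplicity, cost, and control over traffic policy.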

Load balancing in the cloud: AWS EKS and API gateways

At Ambassador Labs we’ve helped thousands of developers get their Kubernetes ingress controllers up and running across different cloud providers. Amazon users have two options for running Kubernetes: they can deploy and self-manage Kubernetes on EC2 instances, or they can use Amazon’s managed offering, Amazon Elastic Kubernetes Service (EKS).

If you are using EKS Anywhere, the recommended ingress and API gateway is Emissary-ingress. Overall, AWS provides a powerful, customizable platform on which to run Kubernetes. However, the multitude of options for customization often leads to confusion among new users and makes it difficult for them to know when and where to optimize for their particular use case.
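
On EKS specifically, one common pattern – sketched below with placeholder names; the exact annotations depend on which AWS load balancer controller and version you run – is to annotate the gateway’s Service so that AWS provisions a Network Load Balancer rather than a Classic ELB:

apiVersion: v1
kind: Service
metadata:
  name: emissary-ingress        # placeholder name for the gateway Service
  namespace: emissary
  annotations:
    # Request an AWS Network Load Balancer (in-tree cloud provider annotation)
    service.beta.kubernetes.io/aws-load-balancer-type: "nlb"
spec:
  type: LoadBalancer
  selector:
    app.kubernetes.io/name: emissary-ingress
  ports:
    - name: https
      port: 443
      targetPort: 8443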

Supporting modern protocols like HTTP/3

HTTP/3 is supported by more than 70% of browsers (including Chrome, Firefox, and Edge), and organizations are rolling out the protocol to gain performance and reliability benefits. As leaders in implementing the HTTP/3 spec, the Google and Envoy Proxy teams have been working on the rollout since the beginning.

HTTP/3 is especially beneficial for users with lossy networks, such as cell/mobile-based apps, IoT devices, or apps serving emerging markets. The increased resilience through rapid reconnection and the reduced latency from the new protocol will benefit all types of internet traffic, such as typical web browsing/search, e-commerce and finance, or the use of interactive web-based applications, all of which can encounter packet loss of 2%+ on the underlying networks.
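
In Emissary-ingress and Ambassador Edge Stack, enabling HTTP/3 is again a Listener-level concern. The sketch below is illustrative only (it assumes the v3alpha1 protocolStack field and uses placeholder names); HTTP/3 also requires the external load balancer to forward UDP traffic on port 443, and the exact steps vary by cloud provider and gateway version:

# Listener accepting HTTP/3 (QUIC over UDP) alongside TLS-secured HTTP
apiVersion: getambassador.io/v3alpha1
kind: Listener
metadata:
  name: https-listener-udp      # placeholder name
  namespace: ambassador
spec:
  port: 8443
  protocolStack:                # explicit protocol stack instead of the HTTPS shorthand
    - TLS
    - HTTP
    - UDP                       # QUIC runs over UDP
  securityModel: XFP
  hostBinding:
    namespace:
      from: ALL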

Adopting a cloud-native API gateway: Focus on speed, safety, and self-service

Going cloud native is a big decision, and not necessarily the right decision for every organization. Many technical and organizational considerations underpin the pros and cons. Once an organization reaches the point at which it is ready to implement developer self-service, facilitated by platform and ops teams, cloud native is a viable choice for speed and safety.

Then, other considerations enter the frame. What technologies support the adoption of cloud native practices? As posited earlier, the API gateway is the front door to your applications, meaning that this ubiquitous component must also support the practices and technologies driving the move to the cloud, such as continuous delivery, Kubernetes, and HTTP/3.

If you’re ready to open the door to the scalability, security and simplicity of Ambassador Edge Stack, start exploring how Ambassador Labs can help support your cloud native journey.
