How Kubernetes is changing DevOps

October 17, 2022

If you’re a developer who hasn’t yet had the chance to use Kubernetes in your organization, you haven’t experienced how Kubernetes is changing DevOps, and some of these situations probably sound familiar:

  • You need a testing environment for your application, so you file a ticket that leaves you waiting or forces you to switch your attention to something else.
  • Your software works fine in your development environment but behaves differently in the IT-configured production environment.
  • Something goes wrong in production, and you have to work with the operations team to track down the cause and fix it.
  • You work hard to push new code quickly, but IT always slows the process down.

How many apply to you? Three or four? Then you’re more than ready for Kubernetes DevOps.

How Kubernetes is changing DevOps

DevOps is a way of bridging the gap between development and operations teams so that high-quality code is released fast. However, it’s been more of a concept than reality due to the differences between development and production environments. Even when there’s a DevOps team in an organization, it’s difficult to collaboratively agree on the configuration and resource usage of the production environment without friction and wasted energy. Luckily, this is changing thanks to Kubernetes.

The first step towards this new era was the introduction of containers, popularized by Docker. A container image packages an application’s code, dependencies, and configuration. A container platform – like Kubernetes – can run multiple instances of a given container image in isolation: instances of different images running on the same platform neither affect one another nor have to share the same environment configuration. In addition, a container image is immutable by definition, so if a running instance deviates from its desired configuration, we can simply kill it and spin up a new one.
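
In Kubernetes terms, that desired state is expressed declaratively. As a rough sketch of what this looks like – the names, image, and replica count below are illustrative placeholders, not something from this article – a Deployment tells the platform to keep a set number of identical instances of one immutable image running:

```yaml
# Hypothetical example: image name, labels, and replica count are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-api
spec:
  replicas: 3                    # three isolated instances of the same image
  selector:
    matchLabels:
      app: demo-api
  template:
    metadata:
      labels:
        app: demo-api
    spec:
      containers:
        - name: demo-api
          image: registry.example.com/demo-api:1.4.2   # immutable, versioned image
          ports:
            - containerPort: 8080
```

If one of those instances crashes or drifts from this specification, Kubernetes discards it and starts a fresh one from the same image, so the environment never accumulates manual, undocumented changes.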

Containers work best when applications are not monoliths, though. This implies a new way of developing – splitting massive solutions into many microservices – which in turn is better for DevOps, too: teams can update individual microservices much faster than they can push changes to a monolithic application.

The rise of containerized microservices called for a way to manage their lifecycle dynamically and scale them horizontally according to demand. Though there were several options initially, Kubernetes (aka K8s) shone because of its rich feature set and growing community. Nowadays it is the most widely used container orchestration tool, every major public cloud provider supports it, and it is available for every platform.
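
Horizontal scaling is a good example of that lifecycle management. As a minimal sketch – the resource names and thresholds below are illustrative assumptions – a HorizontalPodAutoscaler can grow or shrink a Deployment based on observed load:

```yaml
# Hypothetical example: names and thresholds are placeholders.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: demo-api
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: demo-api             # the Deployment sketched earlier
  minReplicas: 3
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add instances when average CPU use exceeds 70%
```

Kubernetes then adds or removes instances on its own instead of someone provisioning capacity by hand.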

A key feature that makes Kubernetes so useful for DevOps is the layer of abstraction it provides over the infrastructure. Developers no longer need to worry about the underlying machines and can focus on building features instead. Given how pervasive Kubernetes is, a containerized application can run almost anywhere, and thanks to container immutability it behaves the same wherever it runs.

Kubernetes and Enterprise DevOps

The benefits aren’t limited to DevOps, either. Here are some of the ways Kubernetes is changing DevOps and can help an enterprise that goes cloud-native:

  • As we just said, Kubernetes can run anywhere – on-premises, at the edge, in private or public clouds. Because of this, it supports hybrid strategies and protects your enterprise from vendor lock-in.
  • While using Kubernetes, why not leverage the full potential of the cloud? Public cloud providers give you access to up-to-date, powerful machines that you don’t have to maintain yourself. Given that K8s dynamically allocates resources, it can also help you save by reducing overhead costs.
  • The infrastructure abstraction that we mentioned earlier means that now there’s a clean separation of concerns between runtime infrastructure operations and application deployment. IT can focus on cluster infrastructure, networking, security, disaster recovery, etc., while development teams can focus on creating new features, writing scripts for deployment, managing secrets, and so on.
  • Immutable containers deployed in development are exactly the same as those deployed in production, so nothing changes between a successful test and the actual product launch.
  • Kubernetes lets you manage your infrastructure as code (IaC). All the configuration is declarative and can be stored in a Git repository, making it easy to roll a production environment back to its previous stable configuration.
  • Usually, you’ll have multiple running instances of any given microservice. When an update is pushed to production, Kubernetes can roll it out gradually, enabling tests in production, blue/green and canary deployments, zero downtime, and easy rollbacks (see the sketch after this list).
  • All of the above result in a shorter time to market, which means increased productivity for your enterprise.
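
To make the last two points concrete, here is a minimal sketch of a rolling-update configuration – the names, version, and update settings are illustrative placeholders. Because the manifest is declarative and lives in Git, rolling back amounts to reverting the commit and re-applying the previous version:

```yaml
# Hypothetical example: names, versions, and update settings are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-api
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0      # never drop below the desired instance count
      maxSurge: 1            # bring up one extra instance at a time
  selector:
    matchLabels:
      app: demo-api
  template:
    metadata:
      labels:
        app: demo-api
    spec:
      containers:
        - name: demo-api
          image: registry.example.com/demo-api:1.4.3   # new version rolled out instance by instance
```

Kubernetes replaces instances one at a time and waits for each new one to become ready before continuing, which is what makes zero-downtime releases and quick rollbacks practical.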

To illustrate these benefits, the next section takes a quick look at how and why three well-known enterprises adopted Kubernetes.

Join Napptive

Napptive enables developer self-service. We encourage you to try our playground and experience accelerated cloud-native development. It’s completely free – simply sign up and get started!

Where is Kubernetes used in DevOps?

Kubernetes is widely used, and its official website features case studies from many well-known adopters. Here, we’ll summarize three representative ones: Airbnb, Spotify, and ING.

Airbnb: from monolith to microservices

Initially, a small team of Airbnb engineers ran a Ruby on Rails monolith, which they called Monorail. As the engineering team grew larger, the modules became too tightly coupled; in 2015, the monolith was blocked for an average of 15 hours per week due to reverts and rollbacks. Airbnb had to shift to a service-oriented architecture, and to address scaling, it started migrating to Kubernetes in 2017.

Spotify: swap homegrown orchestrator for Kubernetes

Spotify was an early adopter of containerized microservices because of the importance it places on fast, autonomous delivery. Back in 2014, the company used a homegrown orchestrator named Helios. Over time, however, it decided that having a small team maintain Helios wasn’t as efficient as adopting a tool backed by a large community, and it started migrating to Kubernetes in 2018.

ING: banking on Kubernetes

This last case is special because of the industry’s constraints. While the back-end systems will remain on traditional architecture, ING is moving everything that makes sense to an internal cloud to improve agility. With this in mind, the company empowered its teams to become autonomous, and they soon started using Docker, Docker Swarm, Kubernetes, Mesos… To standardize the deployment process within the company’s strict security guidelines, ING finally opted for Kubernetes.

All in all, these examples show how Kubernetes helps DevOps teams release high-quality code fast.

If you want to propel your development, why not try our playground? It’s free – simply sign up and get started!
