Microservices model

One of the most important aspects of Kubernetes to understand is that it is a distributed system: it comprises multiple components distributed across different infrastructure, such as networks and servers, where the servers can be virtual machines, bare-metal hosts, or cloud instances. Together, these elements form what is known as a Kubernetes cluster.

Before you dive deeper into Kubernetes, it’s important for you to understand the growth of microservices and containerization.

Traditional applications, such as web applications, typically follow a modular architecture, splitting the code into an application layer, a business logic layer, a storage layer, and a communication layer. Despite this modularity, the components are packaged and deployed together as a monolith. A monolithic application is easy to develop, test, and deploy, but hard to maintain and scale.

With a monolithic application, developers inevitably face the following problems as the application evolves:

  • Scaling: A monolithic application is difficult to scale. Because all of its components are deployed as a single unit, the only option is to scale the entire application, even when just one component is under load; distributing the work across independently scalable services is a far more effective way to solve the problem.
  • Operational cost: The operational cost increases with the complexity of a monolithic application. Updates and maintenance require careful analysis and thorough testing before deployment. The scaling problem also works in reverse: you cannot easily scale a monolithic application down, because its minimum resource requirement is high.
  • Security challenges: Monolithic applications present several security challenges, particularly when addressing vulnerabilities. For instance, restarting the application to apply a patch can be complex and time-consuming, and encryption key rotation is often difficult to implement. Additionally, because of their scaling limitations, monolithic architectures face an increased risk of denial-of-service (DoS) attacks impacting availability. Here are some clear examples of issues that you may face:
    • Centralized logging and monitoring can be more challenging in monolithic applications, making it harder to detect and respond to security incidents in a timely manner
    • Implementing the principle of least privilege (where each component has only the permissions it needs) is more difficult in a monolithic application because all components run within the same process and share the same permissions; the sketch after this list shows how this changes once each service runs as its own workload
    • Monolithic applications do not easily adopt modern architectural patterns such as microservices, containerization, or serverless, which can provide better isolation and security controls
  • Longer release cycle: The maintenance and development barriers are significantly higher for monolithic applications. When there is a bug, it takes developers a long time to identify the root cause in a complex, ever-growing code base, and regression, integration, and unit tests take much longer to pass. When customer requests come in, it can take months or even a year for a single feature to ship. This long release cycle has a significant impact on the company's business.
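To make the least-privilege contrast concrete, the following is a minimal sketch of how the principle can be applied once each service runs as its own Kubernetes workload; the namespace, service, and role names are hypothetical, and the manifest illustrates the idea rather than a complete policy. Each microservice gets its own ServiceAccount bound to a narrowly scoped Role, which is not possible when every component shares a single monolithic process and identity:

    apiVersion: v1
    kind: ServiceAccount
    metadata:
      name: payments-svc            # hypothetical microservice identity
      namespace: shop               # hypothetical namespace
    ---
    apiVersion: rbac.authorization.k8s.io/v1
    kind: Role
    metadata:
      name: payments-read-config
      namespace: shop
    rules:
      - apiGroups: [""]
        resources: ["configmaps"]   # this service only needs to read its configuration
        verbs: ["get", "list"]
    ---
    apiVersion: rbac.authorization.k8s.io/v1
    kind: RoleBinding
    metadata:
      name: payments-read-config
      namespace: shop
    subjects:
      - kind: ServiceAccount
        name: payments-svc
        namespace: shop
    roleRef:
      apiGroup: rbac.authorization.k8s.io
      kind: Role
      name: payments-read-config

Every other microservice would receive its own ServiceAccount with its own, different set of permissions, so compromising one service does not hand an attacker the permissions of all the others.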

These problems create a huge incentive to break down monolithic applications into microservices. The benefits are obvious:

  • With a well-defined interface, developers only need to focus on the functionality of the services they own.
  • The code logic is simplified, which makes the application easier to maintain and debug. Furthermore, the release cycle of a microservice is far shorter than that of a monolithic application, so customers do not have to wait long for new features.

The issues with monolithic applications and the benefits of breaking them down led to the growth of the microservices architecture, which splits an application into small, interconnected services, each packaged and deployed in its own container.

However, breaking a monolithic application down into many microservices increases deployment and management complexity on the DevOps side. The complexity is evident: microservices are usually written in different programming languages that require different runtimes or interpreters, with different package dependencies, different configurations, and so on, not to mention the interdependencies among the microservices themselves. This is exactly where containers come into the picture: container runtimes such as Docker and Linux Containers (LXC) ease the deployment and maintenance of microservices.

Further, orchestrating microservices is crucial for handling the complexity of modern applications. Think of it like Ludwig van Beethoven leading an orchestra, making sure every member plays at the right moment to create beautiful music. Orchestration guides all the connected yet independent components of an application to work together as an integrated whole. Without it, the services struggle to communicate and cooperate, causing performance problems and a messy web of dependencies that make scaling and managing the application very difficult.

The increasing popularity of microservices architecture and the complexity mentioned here led to the growth of orchestration platforms such as Docker Swarm, Mesos, and Kubernetes. These container orchestration platforms help manage containers in large and dynamic environments.
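To make the idea of orchestration more tangible, here is a minimal sketch of a Kubernetes Deployment; the service name and container image are hypothetical. You declare how many replicas of a microservice should run, and the orchestrator schedules them across the cluster's nodes, restarts them when they fail, and replaces them when nodes disappear:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: orders-svc                  # hypothetical microservice
    spec:
      replicas: 3                       # the orchestrator keeps three copies running
      selector:
        matchLabels:
          app: orders-svc
      template:
        metadata:
          labels:
            app: orders-svc
        spec:
          containers:
            - name: orders-svc
              image: registry.example.com/orders-svc:1.0   # hypothetical image
              ports:
                - containerPort: 8080

Scaling the service then becomes a one-line change to replicas (or a kubectl scale deployment orders-svc --replicas=5 command) rather than a redeployment of the whole application.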

Having covered the fundamentals of microservices, in the upcoming section you will gain insights into how Docker has evolved over the past few years.
