‘Service mesh’ is a relatively new term that has gained visibility over the past year. A service mesh is a configurable infrastructure layer for a microservices application that makes communication between service instances flexible, reliable, and fast.
Why are people talking about ‘service meshes’?
Modern applications are made up of a range of (micro)services that allow them to run effectively. Load balancing, traffic management, routing, security, user authentication - all of these things need to work together properly if the application is going to function as intended. Managing these various services across a whole deployment of containers poses a challenge for those responsible for updating and maintaining them.
How does a service mesh work?
Enter the service mesh. It works by delivering these services from within the compute cluster through a set of APIs. These APIs, when brought together, form the ‘mesh’. This makes it much easier to manage particularly complex software infrastructures - which is why organizations like Netflix and Lyft have adopted them.
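To make the idea a little more concrete, here is a minimal sketch (in Go, using only the standard library) of the kind of sidecar proxy a service mesh places next to each service instance. The ports, timeout, and policies here are illustrative assumptions, not the defaults of any particular mesh - real meshes such as Istio or Linkerd configure this behaviour declaratively through their APIs rather than hand-written code.

```go
// A rough sketch of what a mesh's sidecar proxy does: sit beside a service
// instance, intercept its traffic, and apply policies (timeouts, routing,
// metrics) in one place, transparently to the application.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	// Hypothetical upstream service instance this sidecar fronts.
	upstream, err := url.Parse("http://localhost:9000")
	if err != nil {
		log.Fatal(err)
	}

	// Forward every incoming request to the upstream service.
	proxy := httputil.NewSingleHostReverseProxy(upstream)

	// Apply a per-request timeout - the sort of policy a mesh would
	// normally inject for you via configuration.
	proxy.Transport = &http.Transport{ResponseHeaderTimeout: 2 * time.Second}

	log.Println("sidecar proxy listening on :8080, forwarding to", upstream)
	log.Fatal(http.ListenAndServe(":8080", proxy))
}
```

In an actual mesh, one of these proxies runs alongside every service instance, and the mesh's control plane pushes routing, security, and observability rules to all of them at once - which is what makes a sprawling container deployment manageable.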
Trick or treat?
With service meshes addressing some of the key challenges of microservices, they are definitely a treat for 2018 and beyond.