How to scale microservices without using Kubernetes

Babish Shrestha
Director of Technology

The great thing about microservices is that if one of your software services fails, the rest of the system stays online. For global streaming giants and hyperscalers with teams of developers handling multiple services, container orchestration makes a lot of sense.

For the rest of us, with a small team of developers working on one or two business-critical products, is it really necessary to invest in a container orchestration system such as Kubernetes, Red Hat OpenShift, Docker Swarm, or Rancher?

The short answer is no. And the good news is that it’s possible to power a scalable microservice architecture without using Kubernetes or the like. The question is how? 

Let’s start by taking a closer look at microservices.

What are microservices?

In software development, microservices are an architectural style and organizational approach in which applications are broken down into a loosely coupled collection of independent services that communicate via application programming interfaces (APIs).
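To make "independent services communicating via APIs" concrete, here is a minimal sketch in Python using only the standard library. The "product catalog" service and its data are hypothetical examples, not from any real project; in practice each service would run in its own process or container, and another service (say, an orders service) would reach its data only through the HTTP API, never through its database.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Data owned exclusively by the hypothetical "catalog" microservice.
PRODUCTS = {"p1": {"name": "Widget", "price": 9.99}}

class CatalogHandler(BaseHTTPRequestHandler):
    """Exposes the catalog's data through a tiny HTTP API."""

    def do_GET(self):
        product_id = self.path.strip("/")
        body = json.dumps(PRODUCTS.get(product_id, {})).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def start_catalog_service(port=8901):
    """Run the catalog service in a background thread."""
    server = HTTPServer(("127.0.0.1", port), CatalogHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def fetch_product(product_id, port=8901):
    """How another service would consume the catalog: via its API only."""
    with urlopen(f"http://127.0.0.1:{port}/{product_id}") as resp:
        return json.loads(resp.read())
```

Because the only contract between the two sides is the HTTP API, the catalog service can be redeployed, scaled, or rewritten without touching its consumers.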

The alternative is the more traditional monolithic architecture, in which all the processes run as a single service. The problem with monolithic architecture is that to scale one service, you have to scale the entire application. 

Electrical circuits provide a good analogy for microservices versus monolithic architecture:

  • Monolithic architecture is comparable to bulbs running on a series circuit – if one bulb blows, the entire string of bulbs goes out, and each bulb must be checked individually to find the one that has blown.
  • Microservices are similar to bulbs running on a parallel circuit – when one bulb blows, the others keep working, and the blown bulb is easy to identify and replace.

Right now, in software development, microservices are trending. And with good reason. The main benefit of microservices is that they are loosely coupled and deployed separately, and therefore easier to independently scale. This means maintenance and testing can take place on a particular service, without disrupting other functionality. 

With microservices, which lend themselves to an Agile approach to software development, applications can be developed and scaled much faster. This type of architecture also makes it much more convenient to implement new features and expedite the time to market. You can read more about the advantages of microservices in our blog post about making the journey from monolith to microservices.

“With container orchestration systems like Kubernetes, your developers have to learn an extra technology stack – things like deployment, deployment services, and APIs. And there are time and cost implications associated with that.”

Why use a container orchestration system?

Container orchestration systems enable you to manage and control the deployment and scaling of multiple containerized services. They're especially useful when you have a large number of microservices, because you can essentially orchestrate highly available services with limited configuration.

As we’ve already discussed, when services are loosely-coupled like this, it gives you greater flexibility to scale specific features, and more control when a particular feature needs attention, without bringing down the entire system. With the right code, using an orchestration system can also introduce a certain level of automation to your workflows.

Larger organizations could be running hundreds, or even thousands, of microservices which they manage through a container orchestration system. These systems are ideal for large and complex organizations that are scaling at pace, because they provide near-limitless scalability and can run anywhere, whether that's on premises, in the cloud, or in a hybrid environment. Of course, with containers, it's also possible to pick up and deploy the same features across different environments.

But whether you’re self-hosting or opting for a managed solution, running a container orchestration system – even one that’s open source – can be expensive in terms of operational costs, training, and staffing. 

“The good news is you can still build microservices using readily available serverless cloud platforms, which replicate many of the benefits of Kubernetes without the need to learn and manage a whole new technology stack.”

The advantages of running serverless microservices

There’s no denying the business benefits of developing your software product using a microservice architecture. It provides:

  • Greater flexibility to scale a particular application feature, rather than the entire app, saving time, money, and disruption to services.
  • The opportunity for development teams to work autonomously on a specific microservice, thereby shortening development cycles and time to market.
  • The freedom to test out new features through a process of continuous improvement, with easy rollback should you need it.
  • Building blocks of code which have the potential to be reused for other features, saving development time and decreasing the time to market.

But for many small and medium-sized software companies, it's likely that the cost in time and resources of learning, setting up, and deploying a container orchestration system will outweigh the business benefits – unless it fundamentally enables your business to overcome a specific challenge. That doesn't mean you can't leverage the benefits of a microservice architecture, though. The alternative? Have you considered going serverless?

Cloud platforms like AWS, Azure, and Google Cloud provide ready-made infrastructure that enables you to run code without managing servers. It's cost-effective, because you only pay for the compute time you use, and your containers are managed, so orchestration and security are taken out of your hands.

With a serverless solution, your containers are still portable, and you still get the scalability you need without paying out for a container orchestration system. Another advantage is that the ecosystem already includes ready-made services and software as a service (SaaS). Pre-made services such as authentication and API gateways can save valuable development time.

In a recent project, Proshore helped to deploy microservices for an e-commerce platform serving multiple buyers with thousands of products. With a container orchestration system, it would have taken around two months to train developers in how to manage the system before development could even begin.

To build this application in a microservice architecture, the backend was built in AWS Lambda using languages such as Node.js and Python. Amazon Aurora was used as the main database, in a primary-replica architecture. Amazon API Gateway connected the backend to the frontend and other devices, and the frontend was hosted on an EC2 instance. Beyond this, other AWS services were used for caching, large batch jobs, and messaging.
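As a rough illustration of this setup, here is a minimal sketch of what one Lambda-backed endpoint might look like in Python. The event shape follows API Gateway's proxy integration (which passes the HTTP request in as `event` and expects a dict with `statusCode` and `body` back); the endpoint, field names, and stubbed data are illustrative assumptions, not the actual project code.

```python
import json

def lambda_handler(event, context):
    """Hypothetical product-lookup endpoint behind Amazon API Gateway.

    API Gateway's proxy integration delivers path parameters under
    event["pathParameters"]; we validate them before responding.
    """
    product_id = (event.get("pathParameters") or {}).get("productId")
    if not product_id:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "productId is required"}),
        }

    # In the real service this step would query the Aurora database;
    # a stubbed record keeps the sketch self-contained.
    product = {"id": product_id, "price": 9.99}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(product),
    }
```

Each such function scales independently and is billed only for the milliseconds it runs, which is how a serverless backend delivers the per-feature scaling discussed above without any orchestration system to manage.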

To find out how Proshore helped the platform to seamlessly sync thousands of different product prices and data from different ERP systems, every day, read our blog post on ERP excellence.
