Introduction to Serverless Computing and Architecture



Serverless is a cost-efficient way to host your APIs, and it forms the crux of systems like chatbots and online judges.

Serverless does not mean that your code will not run on the server; it means that you do not manage, maintain, access, or scale the server your code is running on.

The traditional way to host APIs is to spin up a server with some RAM and CPU. Say those resources let your server handle 1000 RPS, but you receive 1000 RPS only 1% of the time, which means for the other 99% you are overprovisioned.

So, what if there was an infrastructure that

  • scales up and down as per the traffic
  • is billed per execution
  • is self-managed, self-maintained, and fault-tolerant

These requirements gave rise to Serverless Computing.
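
To make this concrete, here is a minimal sketch of what the unit of deployment looks like, assuming an AWS Lambda-style platform behind an HTTP trigger (one provider among many; the handler name and event shape are that platform's conventions, not something you manage). There is no server to bootstrap, no port to bind, no process to supervise: the platform runs the function per request and bills per execution.

```python
import json

def handler(event, context):
    # 'event' carries the request payload (an HTTP request, a queue message,
    # a cron trigger, etc.); 'context' carries runtime metadata.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```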

Real-world applications

Chatbot

Say we build a Slack chatbot that responds with the holiday list when someone messages "holidays". The traffic for this utility is going to be insignificant, and keeping a server running the whole time is a waste. This is best modeled on serverless, with the function invoked on receiving a message.
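
A rough sketch of how that handler could look, assuming (purely for illustration) that Slack is wired to an HTTP trigger via a slash command and that the holiday list is hard-coded:

```python
import json
from urllib.parse import parse_qs

# Hard-coded for illustration; in practice this could come from a config store.
HOLIDAYS = ["Jan 26 - Republic Day", "Aug 15 - Independence Day", "Oct 02 - Gandhi Jayanti"]

def handler(event, context):
    # Slack sends slash-command payloads as form-encoded data in the body.
    params = parse_qs(event.get("body") or "")
    text = (params.get("text") or [""])[0].strip().lower()

    if text in ("", "holidays"):
        message = "Upcoming holidays:\n" + "\n".join(HOLIDAYS)
    else:
        message = "Try sending: holidays"

    # Returning JSON lets Slack render the text back in the channel.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"response_type": "in_channel", "text": message}),
    }
```

The function is idle (and unbilled) between messages, which is exactly the property we want for a utility like this.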

Online Judge

Every submission can be evaluated on a serverless function, and the results can be updated in a database. Serverless gives you isolation out of the box and keeps the cost to a bare minimum. It would also seamlessly handle a surge in submissions.
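
As a simplified sketch of the per-submission flow: one invocation evaluates one submission against one test case and records a verdict. The payload fields, the `store_verdict` placeholder, and the single-test-case flow are all assumptions; a real judge would sandbox the submitted code far more carefully than a bare subprocess.

```python
import subprocess

def store_verdict(submission_id, verdict):
    # Placeholder: in practice this would write to your results database.
    print(f"submission {submission_id}: {verdict}")

def evaluate(event, context):
    submission = event["submission"]     # assumed payload shape
    source_path = "/tmp/solution.py"     # /tmp is the writable scratch space

    with open(source_path, "w") as f:
        f.write(submission["code"])

    try:
        result = subprocess.run(
            ["python3", source_path],
            input=submission["input"],
            capture_output=True,
            text=True,
            timeout=5,                   # stay well under the platform's max timeout
        )
        ok = result.stdout.strip() == submission["expected_output"].strip()
        verdict = "AC" if ok else "WA"
    except subprocess.TimeoutExpired:
        verdict = "TLE"

    store_verdict(submission["id"], verdict)
    return {"id": submission["id"], "verdict": verdict}
```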

Vending Machine

Upon a purchase, the vending machine would need to update the main database, and the APIs for that could be hosted on serverless. Given that the traffic is low and bursty, serverless would help us keep the cost down.
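
A minimal sketch of such an endpoint, assuming the purchase record lands in a DynamoDB table (the table name "purchases" and the payload fields are illustrative assumptions):

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
purchases = dynamodb.Table("purchases")   # assumed table

def record_purchase(event, context):
    # The vending machine POSTs a small JSON payload after each sale.
    body = json.loads(event.get("body") or "{}")
    purchases.put_item(
        Item={
            "machine_id": body["machine_id"],
            "timestamp": body["timestamp"],
            "item_code": body["item_code"],
            "amount": body["amount"],
        }
    )
    return {"statusCode": 200, "body": json.dumps({"status": "recorded"})}
```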

Scheduled DB Backups

Schedule daily DB backups on a serverless function instead of running a separate crontab server just to trigger the backup.
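
A sketch of the handler, assuming the daily trigger is a managed cron schedule (e.g. an EventBridge rule) and the backup is an RDS snapshot of an assumed "orders-db" instance; swap in your own backup mechanism as needed:

```python
import datetime
import boto3

rds = boto3.client("rds")

def nightly_backup(event, context):
    # Invoked once a day by the cron trigger; no always-on crontab server needed.
    today = datetime.date.today().isoformat()
    rds.create_db_snapshot(
        DBSnapshotIdentifier=f"orders-db-{today}",
        DBInstanceIdentifier="orders-db",
    )
    return {"snapshot": f"orders-db-{today}"}
```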

Batch and Stream Processing

Use serverless and invoke the function every time a message is pushed to the broker, making the system reactive instead of poll-based.
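
For example, assuming the broker is SQS, the platform hands the function a batch of records per invocation, so the consumer reacts to messages instead of polling for them (the message fields are illustrative):

```python
import json

def process_messages(event, context):
    records = event.get("Records", [])
    for record in records:
        message = json.loads(record["body"])
        # Do the real per-message work here; printing stands in for processing.
        print(f"processing order {message.get('order_id')}")
    return {"processed": len(records)}
```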

Advantages

  • No need to manage and scale the infra
  • The cost is zero when you do not get any traffic
  • Scaling is out of the box, so no capacity planning is needed

Disadvantages

  • It takes time to serve the first request (a cold start), as the underlying infra might have to boot up
  • The execution has a max timeout, so your job should complete within the limit
  • Debugging is a challenge
  • You are locked in to the vendor you chose

When NOT to use Serverless

  • The load, usage, and traffic patterns are consistent
  • The execution will go beyond the max timeout
  • You need multi-tenancy

When to use Serverless

  • You want to quickly build, prototype, and deploy changes
  • The use case is lightweight
  • The traffic is bursty
