Understanding Serverless Computing
Serverless computing is a relatively recent development that has caught on over the last few years and is now offered by all the major public cloud platforms, including Amazon Web Services, Microsoft Azure and Google Cloud. Amazon introduced it in 2014 with the launch of its Lambda service, and it has picked up pace ever since because it is so convenient for certain practical business use cases around modern applications.
This article looks at serverless computing across the popular public cloud platforms, covering the core concepts, real-life examples, benefits, limitations and more. So, let's get started!
Serverless Computing at a glance
Popularly known as FaaS (Function-as-a-Service), the concept of serverless computing dates back to 2006. The first commercially successful serverless service, Lambda, was launched in 2014 by Amazon Web Services. Soon after Lambda's launch, AWS's competitors followed with their own serverless services and platforms, and today serverless is one of the most popular compute offerings on public cloud platforms.
What is serverless?
Serverless computing is a cloud computing model in which the customer does not provision servers for the back-end code to run on, but consumes compute as it is needed. In this model, the cloud provider runs the servers and dynamically manages the allocation of machine resources. Pricing is based on the actual resources an application consumes, rather than on pre-purchased units of capacity. That said, serverless does not mean running something in the cloud without any infrastructure or hardware resources.
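To make this concrete, a function-as-a-service deployment is usually nothing more than a small handler that the platform invokes whenever an event arrives. The sketch below assumes an AWS Lambda-style Python handler with a hypothetical `name` field in the event payload; the exact event shape and handler signature vary by provider.

```python
import json

def lambda_handler(event, context):
    """Entry point invoked by the platform for each incoming event.

    The cloud provider provisions the runtime, passes in the event
    payload and a context object, and bills only for execution time.
    """
    name = event.get("name", "world")  # hypothetical payload field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```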
Why serverless scores big
It is worth understanding why the public cloud vendors introduced serverless computing, and why it is becoming more popular by the day.
Imagine having to run a specific custom-built program or API service only a few times a day in the cloud. In the conventional world, this would mean creating a VM instance, installing the necessary software and deploying the code binaries, then setting up a scheduler on the VM to run the service/code/API as needed. Now imagine running thousands of such custom applications on your cloud platform. An expensive proposition indeed.
Now imagine leveraging a shared cloud vendor resource that requires no VMs to be spun up, lets you run custom code written in the most popular languages in response to a trigger, and promises top-class availability and resilience on the platform. For a volatile, microservices-based modern web application running hundreds of functions, deploying on a serverless architecture can save a huge amount of resources.
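As a rough sketch of what replaces the VM-plus-scheduler setup described above, the handler below assumes an AWS Lambda function attached to a cron-style schedule rule; the nightly job logic itself is a hypothetical placeholder.

```python
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def run_nightly_task():
    """Hypothetical stand-in for the real business logic."""
    return 0

def nightly_job_handler(event, context):
    """Invoked by a cron-style schedule instead of an always-on VM.

    The function runs only when triggered and finishes in minutes,
    so no server has to be provisioned, patched or kept running.
    """
    logger.info("Scheduled trigger received: %s", event.get("time"))
    records_processed = run_nightly_task()
    return {"records_processed": records_processed}
```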
Simple serverless examples
Here are some simple use cases that come in handy in real life. Each public cloud vendor provides plenty of serverless resources for tasks like the following (a sketch of the third case appears after the list):
- Triggering a nightly job that runs for less than five minutes and completes its task.
- Triggering an email to the concerned team members when a specific workflow status changes to "Completed".
- Processing an image file or PDF document when it lands in a storage service on a cloud platform.
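As a rough illustration of the last use case, the sketch below assumes an AWS Lambda function subscribed to S3 object-created events; the processing step is a hypothetical placeholder.

```python
import urllib.parse

import boto3

s3 = boto3.client("s3")

def process_file(path):
    """Placeholder for thumbnailing an image or parsing a PDF."""
    print(f"Processing {path}")

def handle_upload(event, context):
    """Triggered when a file lands in the configured storage bucket."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        local_path = f"/tmp/{key.split('/')[-1]}"
        s3.download_file(bucket, key, local_path)
        process_file(local_path)
```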
Serverless in popular public cloud platforms
Here are the serverless computing services available under popular public cloud platforms.
Amazon Web Services
Starting with Lambda in 2014, AWS has come a long way in serverless computing. 2017 was a remarkable year, with AWS announcing serverless twins of many existing services, and the list has only kept growing. AWS also provides the Serverless Application Repository, a huge pool of serverless components built on top of AWS services that is available for free and can be used by anyone who is interested.
Microsoft Azure
Microsoft makes serverless available to its Azure customers via Azure Functions. A code repository of Azure Functions is also available.
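For comparison, a minimal HTTP-triggered Azure Function might look like the sketch below, assuming the decorator-based Python v2 programming model and anonymous access; the route and response text are placeholders.

```python
import azure.functions as func

app = func.FunctionApp()

@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function; Azure hosts and scales it on demand."""
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name} from Azure Functions!")
```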
Pivotal Cloud Foundry
Pivotal is working on its new serverless Pivotal Function Service. It comes with a host of pluggable features, scalability, support for Kubernetes and Istio, container-based workflows and polyglot programming.
Google Cloud Platform
Google Cloud Functions provides the ability to build serverless application back ends, process data in real time and build intelligent applications. Google also provides a plugin for the Serverless Framework, and the code base for the plugin is available too.
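A comparable HTTP-triggered Cloud Function can be sketched with the open-source Functions Framework for Python; the route logic here is only illustrative.

```python
import functions_framework

@functions_framework.http
def hello_http(request):
    """HTTP Cloud Function; request is a Flask request object."""
    name = request.args.get("name", "world")
    return f"Hello, {name} from Cloud Functions!"
```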
IBM Cloud Functions
Based on Apache OpenWhisk, IBM Cloud Functions is a polyglot FaaS programming platform for developing lightweight code that executes and scales on demand. It offers an array of functionality for back ends, mobility, data, cognitive workloads with IBM Watson, IoT, event stream processing, conversational bots and scheduling.
Serverless.com
Another framework worth mentioning is serverless.com. Started as an open-source project in 2015, it has grown into a mature, enterprise-grade framework. Serverless.com is now supported on AWS, Azure, GCP, Kubeless, Cloudflare and OpenWhisk.
Beware of serverless computing misconceptions
Serverless does not mean running without infrastructure; the name is misleading if taken literally. Serverless is a collection of software components that run on underlying hardware, with the difference that you do not pay for the infrastructure components/services when they are not in use, as you would with a conventional VM in the cloud. Serverless functions also cannot run for long durations and typically last only minutes, so they will not be a fit for every practical business scenario.
Serverless is often confused with PaaS, since both operate on a similar shared infrastructure model. But serverless was built as an enabler for specific tasks, whereas PaaS covers offerings such as email services, database services, messaging/queueing, caching/in-memory stores for performance, application integration services, security services and so on.
The pricing model of serverless computing is also different from that of PaaS, and while PaaS services can be persistent, serverless functions cannot. Having said that, each public cloud provider is now trying to redesign or relaunch its existing PaaS services to adopt a serverless model.
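As a rough back-of-the-envelope illustration of pay-per-use pricing, the rates below are assumed for the example only and are not any vendor's current price list.

```python
# Illustrative, assumed rates -- check the vendor's current price list.
PRICE_PER_MILLION_REQUESTS = 0.20   # USD per 1M invocations (assumed)
PRICE_PER_GB_SECOND = 0.0000166667  # USD per GB-second of compute (assumed)

def monthly_function_cost(invocations, avg_duration_s, memory_gb):
    """Estimate the monthly cost of a function billed per use."""
    request_cost = (invocations / 1_000_000) * PRICE_PER_MILLION_REQUESTS
    compute_cost = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# Example: 3M invocations a month, 200 ms each, 512 MB of memory.
print(f"${monthly_function_cost(3_000_000, 0.2, 0.5):.2f} per month")
```

With these assumed rates, the example works out to roughly six dollars a month, with no idle VM to pay for between invocations.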
Serverless and Containers – a quick comparison
While these are two contemporary technologies, they are quite different from each other. Serverless is about executing functions or services on a cloud platform, whereas containers provide a hosting platform that can run anything from a small service or API to a large modern or legacy business application in one or more compartmentalized units.
Category | Serverless | Containers |
---|---|---|
Hosting Platform | Serverless is hosted on shared infrastructure by the respective cloud vendors and offered as one or more consumable services. | Hosted either on a Docker platform or on VMs, depending on the type of usage. The Docker platform is a software stack and requires infrastructure to run. |
Architecture | Functions are defined to run independently and, in the case of a larger flow, need to be linked via a triggering point. | Can be hosted on single or multiple containers. Needs orchestration to manage the related cluster of containers for a given application. |
Supported software stack | Supports all popular open-source programming stacks for writing code components for a serverless function. | Can host almost all available software stacks, both legacy and modern. Details of supported and unsupported images are available in the Docker Hub documentation. |
Deployment Platforms | Runs on the public cloud platform hosting the service. The operating system becomes irrelevant, as the cloud provider controls the whole service. | Can run on Windows or Linux platforms. |
Monitoring & Control | The service provider offers monitoring through its default cloud monitoring services and provides additional mechanisms to check status programmatically, but control is not given to the end user. | Clusters of containers are managed by orchestrators such as Docker Swarm or Kubernetes and monitored with tools like Prometheus. The environment can be controlled using these tools and the Docker command line. |
Pricing | Cloud service providers typically charge for the underlying infrastructure and other services consumed. | The Enterprise version of Docker is chargeable. Community edition is free. Similarly, Linux/Windows containers are chargeable on a per VM basis. |
Availability & Scalability | Highly available and scalable as per each cloud service provider. | Depends on the underlying infrastructure on which the containers are running. |
Security | Native options for security as well as support for popular open-source security mechanisms. | Security happens at underlying infrastructure and the software stack levels. Supports all popular security mechanisms and tools. |
Persistence | Runs only for short durations (minutes) and needs access to storage services or a database for persistence. | Docker uses volumes; for container-based VMs, disks can be used for persistence. |
This brings us to another question – what are serverless containers?
AWS launched Fargate, a serverless container platform, at its re:Invent event in 2017. Microsoft made it possible with what is known as Azure Container Instances. Google launched a platform called Cloud Run at its Google Cloud Next event in 2019. “It is a serverless compute platform that lets you run any stateless request-driven container on a fully managed environment. In other words, with Cloud Run, you can take an app—any stateless app—containerize it, and Cloud Run will provision it, scale it up and down, all the way to zero!”
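The kind of app these platforms expect is a stateless, request-driven service listening on the port the platform injects. Below is a minimal sketch using Flask; the PORT environment variable follows Cloud Run's documented convention, and the route and response are placeholders.

```python
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    """Stateless request handler; the platform scales instances to zero."""
    return "Hello from a serverless container!"

if __name__ == "__main__":
    # Serverless container platforms inject the port to listen on.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```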
There could be more such services/platforms available, but it is important to understand how the concepts of serverless and containers are combined in each of them and whether they really fit the need you have at hand.
Serverless Benefits, Limitations & Challenges, and Areas of improvement
Every technology has its own set of benefits, limitations, challenges and scope for improvement. Here is a summary:
Benefits
- Avoids maintenance / management of underlying infrastructure.
- Cost efficiency - no need to spin up dedicated infrastructure for infrequent tasks.
- Versatility to handle a wide variety of small, short-lived tasks across the board.
- Scalability - offers the freedom to be elastic on a need basis.
- Support for writing the functions in multiple programming languages.
Limitations & Challenges
- Not suitable for long running tasks.
- Need storage service or a database service if any data needs to be retained.
- Applications built have the risk of vendor lock-in as it will be based on the underlying serverless platform of the cloud provider.
- Monitoring may be a challenge if there are too many serverless tasks running at the same time.
- Any small component or large application should be designed to handle failure scenarios and errors. For example, if a specific serverless function crashes, recovering or restarting that service can be a challenge.
- A fail-fast approach is advised for any POC/prototype built on a serverless architecture/model, so that waste of time, human resources and money is minimized.
Areas of Improvement
- Increase the active/run duration and offer persistence.
- New models/designs of underlying (compute) infrastructure that are more relevant to serverless from a longevity, performance, scalability and availability perspective.
- A common serverless architecture/framework to avoid the risk of vendor lock-in and make functions agnostic of the underlying cloud platform.
- Cross-cloud portability of existing serverless services or mechanisms, to fast-track moves between cloud platforms for scenarios such as an API platform or a serverless-heavy web application.
- Proper tooling (IDEs) for the development community to easily build serverless functions and applications.
Serverless as a technology is still evolving across the different cloud platforms and in the open-source community. Early adoption of the technology will certainly benefit enterprises. While more and more services are shifting to serverless models on cloud platforms, the limitations and challenges outlined above still remain.
As an architect or a serverless evangelist, one has to strive to build serverless functions and applications that are cloud agnostic (no vendor lock-in). This is possible by externalizing/segregating the actual business functionality from cloud-platform-specific serverless functions or services, and standardizing and publishing that functionality to the relevant audience to promote reusability. Such models should mature over the next couple of years.
Cloud careers at Mphasis
At Mphasis, we help our customers keep pace with the rapidly changing world by providing best-in-class cloud solutions. We specialize in a wide range of cloud services including Data Center/Cloud Migrations, Storage Solutions & Services, DevOps Solutions and Cloud Computing Services.
As a part of our cloud computing team, you will develop and implement cloud strategy and vision, enterprise cloud security, cloud solutions design and delivery, cloud migrations, infrastructure automation across a wide range of platforms.
Cloud computing is an exciting world, and we are always looking for the best and the brightest to join us. Explore cloud computing jobs, cloud migration jobs, hybrid cloud engineer jobs and DevOps engineer jobs at Mphasis.
Mphasis is currently looking for Cloud Solution Architects. We also need talented people for our Cloud Migration and Cloud Management teams. Learn more about new opportunities in Cloud Migration