
What is serverless computing?

Serverless computing is an application development and execution model that lets developers write and run application code without provisioning or administering servers or back-end infrastructure.

Serverless does not mean "no servers". Despite the name, the servers in serverless computing are very much real; they are simply operated by a cloud service provider (CSP). "Serverless" describes how those servers appear to the developer: the developer never sees them, manages them, or interacts with them in any way.

With serverless computing, developers can focus on writing front-end application code and business logic. All they have to do is write their application code and deploy it to containers managed by the CSP.

The cloud provider takes care of the rest: provisioning the cloud infrastructure needed to run the code and scaling that infrastructure up and down on demand. It also handles all routine infrastructure management and maintenance, such as operating system updates and patches, security management, capacity planning and system monitoring.

Serverless never charges for idle capacity. The provider spins up and provisions the required computing resources on demand when the code runs, and spins them back down (known as "scaling to zero") when execution stops. Billing starts when execution begins and ends when execution stops; pricing is typically based on execution time and the resources consumed.
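The pay-per-execution billing model described above can be sketched with a toy cost calculation. The rates and the 100-ms rounding rule below are illustrative assumptions, not any provider's actual pricing:

```python
# Toy model of FaaS-style billing: you pay only while code runs,
# metered here in 100-ms increments. All rates are made up for illustration.

def faas_cost(invocations, avg_duration_ms, gb_memory,
              price_per_gb_s=0.0000166667, price_per_request=0.0000002):
    """Estimate the cost of a serverless function for a billing period."""
    # Round each invocation's duration up to the next 100-ms increment.
    billed_ms = -(-avg_duration_ms // 100) * 100
    gb_seconds = invocations * (billed_ms / 1000) * gb_memory
    return gb_seconds * price_per_gb_s + invocations * price_per_request

# One million 120-ms invocations at 128 MB; idle time costs nothing,
# and zero invocations cost exactly zero ("scaling to zero").
print(f"${faas_cost(1_000_000, 120, 0.128):.2f}")
print(f"${faas_cost(0, 120, 0.128):.2f}")
```

Contrast this with instance-based pricing, where the meter runs around the clock whether or not the capacity is used.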

Serverless computing, microservices and containers are three technologies that are at the center of cloud native application development.

The origins of serverless

Serverless traces its origins to 2008, when Google launched Google App Engine (GAE), which let developers build and host web applications in Google-managed data centers. A developer could write software and run it on Google's cloud without worrying about server maintenance such as patching or load balancing; Google handled all of that.

The term "serverless" first appeared in a 2012 tech article by cloud computing expert Ken Fromm. In 2014, Amazon launched the first serverless platform, AWS Lambda. Named after the functions of lambda calculus and functional programming, the FaaS-based AWS Lambda gave the serverless computing paradigm mass-market credibility and rapid adoption among programmers, because it let them run code in response to events without having to manage servers. Microsoft Azure Functions and Google Cloud Functions followed in 2016.

The serverless ecosystem

Serverless and FaaS

Serverless is more than function as a service (FaaS), the cloud computing service in which developers run code or containers in response to events or requests without specifying or managing the infrastructure on which that code executes.

FaaS is the compute model at the heart of serverless, and the two terms are often used interchangeably. But serverless is better understood as a complete stack of services that respond to specific events or requests and scale to zero when no longer in use, and whose provisioning, administration and billing are handled by the cloud provider and hidden from developers.

Besides FaaS, these services include databases and storage, application programming interface (API) gateways, and event-driven architecture.
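The FaaS model boils down to a handler function the platform invokes per event; everything else (servers, scaling, routing) belongs to the provider. The sketch below follows the common AWS Lambda Python handler convention, though the exact signature varies by provider:

```python
import json

def handler(event, context):
    """Entry point the platform invokes for each event.

    `event` carries the trigger payload (HTTP request, queue message, ...);
    `context` exposes runtime metadata. No server code appears anywhere.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, you can invoke the handler directly to test it:
print(handler({"name": "serverless"}, None))
```

Deploying this means uploading the function and wiring it to a trigger; the platform decides when, where and how many copies of it run.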

Serverless databases and storage

Databases (SQL and NoSQL) and storage (particularly object storage) form the data layer. A serverless approach to these technologies moves away from provisioning fixed "instances" bound by capacity, connection and query limits, and toward models that scale infrastructure, and cost, in line with demand.

API gateways

API gateways act as proxies for web application requests and provide HTTP method routing, client ID and secret management, rate limiting, CORS support, visibility into API usage, response transformation, and API sharing policies.

Serverless and event-driven architecture

Serverless systems excel at event-based, stream-processing workloads, particularly those built on the open-source Apache Kafka event streaming platform.

Automatically triggered serverless functions are stateless, and each handles a single event. These traits make them a natural fit for event-driven architecture (EDA), a software design approach built around publishing, capturing, processing and storing events.

In an EDA environment, event producers (microservices, APIs, IoT sensors and the like) send live events to event consumers, which trigger processing routines. For instance, when Netflix releases a new original series, multiple EDA services wait for the release announcement, which sets off a cascade of updates notifying users.

Most other companies built around web and mobile apps with large end-user bases (Uber, DoorDash, Instacart) are likewise event-driven.
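The producer/consumer relationship at the core of EDA can be sketched with a minimal in-memory publish/subscribe model. Real systems use a broker such as Apache Kafka; this stdlib-only sketch just shows the shape, with the event name and handlers invented for illustration:

```python
from collections import defaultdict

# Minimal in-memory sketch of the publish/subscribe pattern behind EDA.
subscribers = defaultdict(list)

def subscribe(event_type, consumer):
    """Register a consumer (e.g. a serverless function) for an event type."""
    subscribers[event_type].append(consumer)

def publish(event_type, payload):
    """Producers emit events; every registered consumer is triggered."""
    for consumer in subscribers[event_type]:
        consumer(payload)

notifications = []
# Two independent services wait on the same release announcement.
subscribe("series.released", lambda e: notifications.append(f"Notify users: {e['title']}"))
subscribe("series.released", lambda e: notifications.append(f"Update catalog: {e['title']}"))

publish("series.released", {"title": "New Original Series"})
print(notifications)
```

In a serverless setup, each consumer would be a separate function, and the platform (not this loop) would handle fan-out, retries and scaling.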

Serverless versus PaaS, containers and VMs

Serverless, PaaS, containers and VMs all play significant roles in the cloud application development and compute ecosystem, so it's worth comparing how serverless stacks up against the other models on a few key metrics.

  • Provisioning time: Milliseconds for serverless, versus minutes to hours for the other models.
  • Administrative burden: None with serverless; light, medium and heavy for PaaS, containers and VMs respectively.
  • Maintenance: CSPs handle all of it for serverless. The same is true of PaaS, but containers and VMs require significant upkeep, including updating and managing operating systems, container images, connections and more.
  • Scaling: Autoscaling, including scaling to zero, is instant and inherent with serverless. The other models offer automatic but slower scaling that depends on careful configuration of autoscaling rules, and no scaling to zero.
  • Capacity planning: None needed for serverless. The other models require a mix of automatic scaling and capacity planning.
  • Statelessness: Inherent with serverless, so scaling is never a concern; state is kept in an external service or resource. PaaS, containers and VMs can rely on HTTP, keep a socket or connection open for long periods, and store state in memory between calls.
  • High availability (HA) and disaster recovery (DR): Built into serverless at no extra effort or cost. The other models require additional cost and management effort; with both VMs and containers, infrastructure restarts automatically.
  • Resource utilization: Serverless is fully efficient because there is no idle capacity; it is invoked only on demand. The other models all carry at least some idle capacity.
  • Billing granularity and savings: Serverless is metered in 100-millisecond increments, whereas PaaS, containers and VMs are typically metered by the hour or the minute.

Serverless, Kubernetes and Knative

Kubernetes is an open-source container orchestration software that can help you deploy, manage and scale containers. This automation is a game changer when it comes to building container applications.

Most serverless offerings run on containers. However, Kubernetes cannot run serverless applications on its own; it needs to be paired with specialized software that integrates it with a given cloud provider's serverless platform.

Knative is an open-source extension to Kubernetes that provides a serverless environment. It lets any container run as a serverless workload on any Kubernetes-enabled cloud, whether the container holds a serverless function or other application code (for example, microservices). Knative works by abstracting away the code and handling the network routing, event triggering and autoscaling that serverless execution requires.

Knative is transparent to developers. They build a container using Kubernetes, and Knative does the rest, running the container as a serverless workload.

Pros and cons of serverless

Pros

Serverless computing offers several technical and business benefits to individual developers and enterprise development teams:

  • Higher developer productivity: As noted above, serverless lets development teams focus on writing code, not managing infrastructure. That frees developers to spend their time innovating on and optimizing front-end application functionality and business logic.
  • Pay for execution only: The meter starts when a request is made and stops when execution completes. Compare this with the infrastructure-as-a-service (IaaS) compute model, in which customers pay for the physical servers, VMs and other infrastructure required to run applications from the moment they provision those resources until the moment they decommission them.
  • Develop in any language: Serverless is a polyglot environment, letting developers code in whatever language or framework (Java, Python, JavaScript, Node.js) they're most comfortable with.
  • Streamlined development/DevOps cycles: Serverless simplifies deployment and, more broadly, DevOps, because developers don't have to build out the infrastructure needed to integrate, test, deliver and deploy code builds into production.
  • Cost-effective performance: For certain workloads (embarrassingly parallel processing, stream processing, certain data processing tasks), serverless computing can be both faster and more cost-effective than other forms of compute.
  • Lower latency: With serverless, code can run close to the end user, decreasing latency.
  • Usage visibility: Serverless platforms provide near-real-time visibility into system and user time, and can aggregate usage information systematically.

Cons

So, while there are many positives with serverless, there are also some negatives:

  • Less control: With serverless, an organization hands control of the servers over to a third-party CSP and no longer manages its own hardware and execution environments.
  • Vendor lock-in: Each provider has its own serverless capabilities and features, which are incompatible with other vendors' offerings.
  • Cold starts: A "cold start", the delay while the platform spins up a fresh instance of a function, can hurt the speed and responsiveness of serverless applications in real-time demand scenarios.
  • More complex testing and debugging: Debugging can be harder with serverless computing because developers lack visibility into back-end processes.
  • Higher cost for long-running applications: Serverless execution models are not built to run code for extended periods, so long-running tasks can cost more than on dedicated server or VM solutions.
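One common way to soften the cold-start penalty mentioned above is to perform expensive initialization at module load time, so that warm invocations reuse the result. A sketch of the pattern, with `load_config` standing in for any slow setup work (database connections, model loading and so on):

```python
import time

def load_config():
    # Stand-in for slow setup work done once per container, at cold start.
    time.sleep(0.05)
    return {"ready": True}

# Module-level initialization: paid once when the container starts,
# then reused by every warm invocation of the function.
CONFIG = load_config()

def handler(event, context):
    # No per-invocation setup cost: the cached CONFIG is already in memory.
    return {"ready": CONFIG["ready"], "input": event}

# Two calls; only the module import paid the initialization delay.
print(handler({"n": 1}, None))
print(handler({"n": 2}, None))
```

The trade-off is that cached state must be safe to reuse across events, since the platform may recycle or discard containers at any time.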

Serverless and security

CSPs provide security for the serverless infrastructure, but under the shared responsibility model the client must protect the application code and data. Cloud-based automated security policies and solutions for serverless include security information and event management (SIEM), identity and access management (IAM), and threat detection and response.

Implementing DevSecOps best practices helps developers secure serverless technologies. DevSecOps (short for "development, security and operations") is an application development practice that automates the integration of security and security practices at every phase of the software development lifecycle, from design through integration, testing, delivery and deployment.

Serverless and sustainability

Serverless computing can also save organizations energy and reduce the carbon footprint of their IT operations compared with an on-premises data center model.

Moreover, with a serverless approach, enterprises reduce emissions by cutting resource consumption, paying for and using only the resources they actually need. Less energy is wasted running idle or excess capacity.

Serverless use cases

Given its inherent characteristics and benefits, serverless is best suited for use cases around microservices, mobile back ends, and data and event stream processing.

Serverless and microservices

Today, the most common use of serverless is supporting microservices architectures. The microservices model is focused on creating small services that each perform a single job and communicate with one another via APIs. While microservices can also be built and run using PaaS or containers, serverless gained real momentum here thanks to its small units of code, inherent and automatic scaling, rapid provisioning, and no charge for idle capacity.

API backends

In a serverless architecture, any action (or function) can be turned into an HTTP endpoint ready for consumption by web clients. When enabled for the web, these are called web actions. Once you have web actions, you can assemble them into a full-featured API, with an API gateway adding security, OAuth support, rate limiting and custom domain support.
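The idea of collecting independent actions behind a gateway can be sketched with a tiny in-process router. The routes, action names and response shapes below are invented for illustration; a real gateway is a managed service sitting in front of deployed functions:

```python
import json

# Each action is an independent function, as a web action would be.
def list_todos(request):
    return {"todos": ["write code", "deploy"]}

def create_todo(request):
    return {"created": request.get("body", {}).get("task")}

# The "gateway" maps HTTP method + path to the action that serves it.
ROUTES = {
    ("GET", "/todos"): list_todos,
    ("POST", "/todos"): create_todo,
}

def gateway(method, path, request=None):
    """Dispatch a call to the matching action; 404 if no route matches."""
    action = ROUTES.get((method, path))
    if action is None:
        return {"statusCode": 404}
    return {"statusCode": 200, "body": json.dumps(action(request or {}))}

print(gateway("GET", "/todos"))
print(gateway("POST", "/todos", {"body": {"task": "ship it"}}))
```

Concerns like authentication, rate limiting and CORS would be layered on at this dispatch point, which is exactly where a managed API gateway applies them.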

Open Liberty InstantOn (CRIU)

Open Liberty InstantOn takes a novel approach to fast serverless startup. With InstantOn, you can take a checkpoint of your running Java application process during the application build, then restore that checkpoint in production.

The restore is fast (on the order of low hundreds of milliseconds or less), which makes it ideal for serverless. Because InstantOn is a checkpoint of your running application, its behavior after restore is identical, including delivering the same high throughput. This can help businesses adopt serverless for new cloud-native applications and bring serverless to existing enterprise workloads.

Data processing

Serverless is well suited to working with structured text, audio, image and video data, for tasks such as analysis, transformation, validation and cleansing. Developers can also use it for PDF processing, audio normalization, image processing (rotation, sharpening, noise reduction, thumbnail generation), optical character recognition (OCR) and video transcoding.
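A typical data-cleansing function of the kind described above takes one raw record in and returns a validated, normalized record (or nothing) out, which maps neatly onto a one-event-per-invocation FaaS model. The field names and validation rules below are illustrative:

```python
import re

def cleanse_record(record):
    """Validate and normalize one raw record, FaaS-style: one event in,
    one cleaned record out, or None if the record is invalid."""
    email = record.get("email", "").strip().lower()
    # Validate: a loose illustrative email check, not a full RFC parser.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        return None  # invalid: drop, or route to a dead-letter queue
    # Cleanse: normalize casing and whitespace.
    return {
        "email": email,
        "name": record.get("name", "").strip().title(),
    }

print(cleanse_record({"email": "  Ada@Example.COM ", "name": "ada lovelace"}))
print(cleanse_record({"email": "not-an-email"}))
```

Because each record is processed independently, the platform can run as many copies of this function in parallel as the incoming data volume demands.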

Massively parallel compute and “map” operations

Any embarrassingly parallel task is a good use case for a serverless runtime, with each parallelizable task represented as a single action invocation. Examples include data search and processing (especially against cloud object storage), MapReduce operations, web scraping, business process automation, hyperparameter tuning, Monte Carlo simulations and genome processing.
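The one-task-per-invocation "map" pattern can be sketched locally with a thread pool standing in for the serverless platform's fan-out. The `action` function is a hypothetical stand-in for whatever independent unit of work each invocation would perform:

```python
from concurrent.futures import ThreadPoolExecutor

def action(x):
    """One parallelizable task == one invocation. In a serverless runtime,
    each call here could run as a separate function instance."""
    return x * x

def parallel_map(tasks, workers=4):
    # Locally we fan out across a thread pool; a serverless platform fans
    # out across function instances and handles the scaling itself.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(action, tasks))

print(parallel_map(range(8)))  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

Because the tasks share no state, scaling out is just a matter of invoking more instances, which is exactly what serverless autoscaling provides.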

Stream processing workload

Combining managed Apache Kafka with FaaS and a database or storage platform yields a powerful foundation for real-time data pipelines and streaming applications. These architectures are well suited to all kinds of data stream ingestion (for validation, cleansing, enrichment and transformation), including IoT sensor data, application log data, financial market data and business data streams (from other data sources).
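A single stage of such a pipeline can be sketched as a pure function that validates, cleanses and transforms one event at a time; with a managed Kafka service, a function like this would be the FaaS consumer wired to one topic. The event schema and transformation below are invented for illustration:

```python
import json

def process_event(raw):
    """One stream-processing stage: validate -> cleanse -> transform.
    Returns the processed event, or None to filter it out of the stream."""
    event = json.loads(raw)
    # Validate: require a sensor id and a numeric reading.
    if "sensor" not in event or not isinstance(event.get("temp_c"), (int, float)):
        return None
    # Cleanse and transform: trim the id, convert Celsius to Fahrenheit.
    return {
        "sensor": str(event["sensor"]).strip(),
        "temp_f": round(event["temp_c"] * 9 / 5 + 32, 1),
    }

stream = [
    '{"sensor": " s-1 ", "temp_c": 21.5}',
    '{"sensor": "s-2"}',  # invalid: no reading, filtered out
]
print([e for e in (process_event(r) for r in stream) if e])
```

Downstream, the processed events would land in a serverless database or object store, completing the Kafka-to-FaaS-to-storage pattern described above.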

AI and serverless

Serverless provides the automated scalability needed to deploy AI and machine learning (ML) workloads with strong performance and a faster pace of innovation.

Hybrid cloud and serverless

Serverless computing can support a hybrid cloud strategy by providing the agility, portability and scalability that dynamic workloads need, whether they run on premises, in the public cloud, in the private cloud or at the edge.

Common applications for serverless

Serverless has solutions for the most common applications we use in the current era: CRM, HPC, big data, business process automation, video-streaming, gaming, telemedicine, digital commerce, chatbot development and more.
