Why serverless computing is the future of all cloud computing

Since the introduction of cloud computing, the field has gone through a series of back-and-forth evolutions, partly driven by cost factors that reappeared in various guises. In recent years, however, a new motivating factor has emerged that may cement the next evolution of cloud computing.

By Michael Maximilien, David Hadas, Angelo Danducci II, Simon Moser
Serverless computing was created to solve the problem of allocating cloud compute resources. It does so by adding automation that removes the need for users to predetermine how much compute their workload requires. As an open source example, Knative adds scaling automation on top of Kubernetes-based cloud platforms, making scaling decisions for workload services in line with actual demand. As requests come in, Knative adjusts compute resources to match: it scales the number of service pods up as far as the underlying Kubernetes cluster can supply resources, and when requests dry up, it scales back down, eventually to zero pods.
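To make the scale-with-demand behavior concrete, here is a minimal, illustrative sketch (in Go, Knative's implementation language) of the kind of calculation a Knative-style autoscaler performs: the desired pod count follows observed concurrent requests divided by a per-pod concurrency target, bounded by available capacity, and drops to zero when traffic stops. The function names, the target of 100 requests per pod, and the cap of 50 pods are assumptions chosen for illustration, not Knative's actual code or defaults.

```go
package main

import (
	"fmt"
	"math"
)

// desiredPods sketches the demand-based calculation a Knative-style
// autoscaler performs: observed concurrent requests divided by the
// per-pod concurrency target, rounded up and clamped to a maximum.
// Illustrative only; not Knative's actual autoscaler implementation.
func desiredPods(concurrentRequests, targetPerPod float64, maxPods int) int {
	if concurrentRequests <= 0 {
		return 0 // no traffic: scale to zero
	}
	pods := int(math.Ceil(concurrentRequests / targetPerPod))
	if pods > maxPods {
		pods = maxPods // bounded by what the cluster can supply
	}
	return pods
}

func main() {
	// Traffic ramps up and then dries up; the pod count follows demand.
	for _, load := range []float64{0, 40, 250, 1200, 30, 0} {
		// Hypothetical target of 100 concurrent requests per pod, cap of 50 pods.
		fmt.Printf("concurrent requests: %5.0f -> pods: %d\n", load, desiredPods(load, 100, 50))
	}
}
```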
In this blog post, we make the case that serverless computing is the future of cloud computing. The argument centers on the following premises:
- Cloud computing is at the center of the modern interconnected world. Most modern applications rely on cloud-hosted compute services to aggregate and process data and to construct the information that edge devices need.
- Cloud computing demand is expected to grow by roughly 15% annually.
- Cloud computing is projected to reach 50% of IT spending in key market segments.
- Cloud computing already consumes 1-1.5% of global energy, and its continued growth poses a real threat to the environment.
In this blog post, we explained the motivations for serverless computing to become the future of all cloud computing workloads, and we argued that this serverless-first future has potential benefits for both cloud users and providers.