Despite what the name suggests, the technology that has burst onto the cloud computing scene in the past two years still does in fact run on servers. The name serverless instead highlights the fact that end users no longer have to manage the servers that run their code.
Perhaps this sounds familiar. Technically, in a public Infrastructure-as-a-Service (IaaS) cloud the end user isn't physically managing servers either; that's up to the Amazon Web Services and Microsoft Azures of the world.
But so-called serverless computing takes that idea a step further: it executes code that developers write using only the precise amount of compute resources needed to complete the task, no more, no less. When a pre-defined event occurs that triggers that code, the serverless platform executes the task. The end user doesn't need to tell the serverless provider how many times these events or functions will occur, and customers pay a fraction of a penny every time a function is executed. Some believe Functions-as-a-Service (FaaS) or event-driven computing is a better name.
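To get a feel for what "a fraction of a penny per execution" adds up to, here is a back-of-the-envelope cost sketch. The two billing dimensions (per-request and per-GB-second) mirror how FaaS providers typically price, but the rates and the workload numbers below are illustrative placeholders, not any provider's actual pricing.

```python
# Illustrative per-invocation billing math for a FaaS platform.
# Both rates are hypothetical placeholders, not real pricing.
PRICE_PER_MILLION_REQUESTS = 0.20  # dollars (assumed rate)
PRICE_PER_GB_SECOND = 0.0000167    # dollars (assumed rate)

def monthly_cost(invocations, avg_duration_s, memory_gb):
    """Estimate a month's bill from invocation count, average
    function duration, and memory allocated to the function."""
    request_cost = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    compute_cost = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    return request_cost + compute_cost

# 3 million invocations a month, 200 ms each, 128 MB of memory:
print(round(monthly_cost(3_000_000, 0.2, 0.128), 2))  # prints 1.88
```

Under these assumed rates, three million executions cost under two dollars, and a function that is never triggered costs nothing at all, which is the core of the pay-per-execution pitch.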
"The way we came to think about it is there are different levels of abstraction that developers can interact with from an infrastructure perspective," explains IBM Vice President of Cloud Product Management Damion Heredia, who manages IBM's serverless computing offering named OpenWhisk. There are bare metal, virtual machines and containers. "For certain workloads, we wanted to abstract away all that management so that you can execute your code without worrying about the infrastructure or management of the servers. That's serverless."
Now industry analysts, proponents and skeptics are debating just how big of a deal this technology is. Is it evolutionary or revolutionary? Will it be used to power most future applications, or just a subset of use cases? The answer, for now, is that the market is in its earliest days, so it's difficult to say. But the hype, interest and potential benefits of this technology should not be ignored.
Pros of serverless
Amazon Web Services is largely credited with starting the serverless market hype in 2014 when the company introduced Lambda, its serverless computing product.
General Manager of AWS Strategy Matt Wood said the product was inspired by one of the company's most popular products: Simple Storage Service (S3).
Blogger Sam Kroonenburg says the relationship between S3 and Lambda is an important analogy. "S3 deals in objects for storage. You provide an object and S3 stores it. You don't know how, you don't know where. You don't care. There are no drives to concern yourself with. There's no such thing as disk space… All of this is abstracted away. You cannot over-provision or under-provision storage capacity in S3. It just is," Kroonenburg explains in his A Cloud Guru Blog.
Wood says AWS wanted to take that same philosophy to computing. "Lambda deals in functions. You provide function code and Lambda executes it on demand…. You cannot over provision, or under provision execution capacity in Lambda. It just is."
In a traditional IaaS cloud environment, customers provision virtual machines, storage, databases and all the security and management tools to go along with them. They load applications onto those VMs, then use tools like load balancers to scale them. They use management software to optimize their instance sizes and find virtual machines that have been left on by accident. Lambda and other FaaS platforms offer a different model. Code is written in functions. When an event occurs that triggers a function, Lambda runs it. That's it. No capacity planning, no load balancing; just tasks being executed.
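The "code is written in functions" model above can be sketched in a few lines. The `(event, context)` signature matches what Lambda's Python runtime expects of a handler, but the event payload here is a made-up example, not a real trigger format, and the local call at the bottom only simulates what the platform would do when an event fires.

```python
# Minimal sketch of an event-driven function in the Lambda style.
# The handler never provisions, scales, or load-balances anything;
# it just receives an event and returns a result.

def handler(event, context):
    """Invoked once per triggering event by the platform."""
    name = event.get("name", "world")  # hypothetical event field
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Simulate the platform invoking the function for one event.
if __name__ == "__main__":
    print(handler({"name": "serverless"}, context=None))
```

The developer's entire surface area is the function body; everything the IaaS customer would normally manage (VM sizing, scaling, idle instances) is the platform's problem.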