Containers and microservices and serverless, oh my!
Once upon a time, virtual machines changed how we thought about servers. Then, the cloud changed how we thought about IT. Now, containers have started a new transformation. The latest entry is “serverless”—though I should point out immediately that the term serverless is a misnomer. Future cloud-native applications will consist of both microservices and functions, often wrapped as Linux containers.
VMs and the cloud enabled DevOps, the practice of developers and IT operations staff collaborating to optimize technology processes. Cloud technologies' dynamic compute and storage made provisioning far easier. The idea behind DevOps is that developers no longer need to worry about infrastructure because that's taken care of in the background by programs such as Ansible, Chef, and Puppet.
Then along came containers. Containers use far fewer resources than VMs because they share the host operating system rather than each carrying its own. Containers are also easier to spin up and down when circumstances require it.
Developers can create programs that run as components in larger applications. This abstraction fundamentally changes the options in building distributed applications. It means developers can isolate dependencies and use well-tuned, smaller components. Containers let you pack an application and all its dependencies into a single package that can run almost anywhere. For example, Microsoft just introduced Azure Container Instances, which enables users to run both Linux and Windows containers on Azure.
Container orchestration is the starting point
Serverless, function as a service (FaaS), microservices, and container orchestration all spring from cloud-based containers.
While you can treat containers as fast, efficient VMs, that's not making the most of the technology. They provide the best bang for the buck when you scale multiple, colocated containers to deliver services in a production environment.
But using containers extensively means you'll be running far more containers. And, in turn, it drastically increases the complexity of managing them. For example, you need to answer questions like, “Has the base operating system in our containers been updated with all the latest patches?”
Managing that manually isn’t feasible, so it’s wise to turn to container orchestration programs such as Docker swarm mode, Mesosphere, and Kubernetes to help you deploy and manage at scale. Orchestration enables you to build application services that span multiple containers, schedule containers across a cluster, scale containers, and manage the containers' health.
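Under the hood, orchestrators work by reconciliation: they continuously compare the state you declared (say, three replicas of a service) with what's actually running and act on the difference. Here's a toy sketch of that idea; the function names and data shapes are illustrative, not any real orchestrator's API.

```python
# Toy sketch of the reconciliation loop at the heart of container
# orchestration: compare desired replica counts against what is
# actually running and emit start/stop actions. Illustrative only;
# real orchestrators (Kubernetes, Docker swarm mode) are far richer.

def reconcile(desired: dict, observed: dict) -> list:
    """Return the actions needed to move `observed` toward `desired`.

    Both arguments map a service name to a replica count.
    """
    actions = []
    for service, want in desired.items():
        have = observed.get(service, 0)
        if have < want:
            actions.append(("start", service, want - have))
        elif have > want:
            actions.append(("stop", service, have - want))
    # Services that are running but no longer desired get stopped entirely.
    for service, have in observed.items():
        if service not in desired:
            actions.append(("stop", service, have))
    return actions
```

An orchestrator effectively runs a loop like this forever, so a crashed container (observed count drops) is restarted without anyone filing a ticket.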
Serverless, which isn’t actually without a server
Instead of using containers to run applications, serverless computing replaces containers with another abstraction layer. Its functions or back-end services are one-job programs that consume compute resources without the developer having to manage them. Rather than calling functions in the traditional programming sense, in serverless a developer invokes a working program to provide a service for the program they're building.
The Cloud Native Computing Foundation (CNCF) Serverless Working Group defines serverless computing as "building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment.”
Or for another definition: "Serverless architectures refer to applications that significantly depend on third-party services,” says Mike Roberts, engineering leader and co-founder of Symphonia, a serverless and cloud architecture consultancy. “By using these ideas, and by moving much behavior to the front end, such architectures remove the need for the traditional 'always on' server system sitting behind an application. Depending on the circumstances, such systems can significantly reduce operational cost and complexity at a cost of vendor dependencies and (at the moment) immaturity of supporting services.”
So, for an intentionally trivial example, say you write graphic-conversion code. Your applications regularly need thumbnails of JPG or PNG image files, so you write the conversion program in a serverless-compatible language such as Node.js or Go. You then package this program as a serverless function deployment package.
Then, when you need to create thumbnails, you set the function to run whenever a compatible graphics file is placed in, for example, an Amazon Web Services (AWS) S3 bucket. Now, whenever such a file is uploaded, your serverless function automatically converts it, and the platform scales up the resources the program needs to do its job.
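A minimal sketch of what such a function might look like in Python, assuming the AWS Lambda handler convention and the shape of S3 "ObjectCreated" event notifications; the make_thumbnail helper is a hypothetical stand-in for real image-processing code (e.g., Pillow):

```python
# Sketch of an AWS Lambda-style handler for the thumbnail scenario.
# The event shape follows S3 "ObjectCreated" notifications; the
# make_thumbnail() helper is hypothetical, standing in for real
# image-conversion code, so here it just records what it would produce.

def make_thumbnail(bucket: str, key: str) -> str:
    """Hypothetical placeholder for the actual image-conversion work."""
    return f"thumbnails/{key}"

def handler(event: dict, context=None) -> list:
    """Invoked by the platform once per S3 upload event."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        if key.lower().endswith((".jpg", ".png")):
            results.append(make_thumbnail(bucket, key))
    return results
```

Note that nothing in this code says how many copies run at once; that decision belongs entirely to the platform.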
Thus, in serverless, horizontal scaling is completely automatic, elastic, and provider managed. If your system needs to process 100 requests in parallel, the provider handles that without any additional work from the developer. Also, unlike a traditional server-based application, there is no constantly running server. Compute time is spent only when the serverless function is called to action.
Make no mistake: This code is not running on pixie dust. Behind serverless computing are servers, VMs, and containers. Serverless computing is another layer of abstraction atop a cloud and container-based infrastructure. Ideally, serverless applications don't require developers to provision, scale, or manage any servers. Serverless began with AWS Lambda; other serverless services include Google Cloud Functions and Azure Functions.
Serverless is the next step up from earlier cloud technologies, such as infrastructure as a service (IaaS). There, you need not worry about physical servers and storage; that's the cloud provider's job. With serverless, you don't need to care about your compute resources either.
With serverless, the service executes your code using the predefined compute resources needed for your job. When a predefined event or HTTP request calls the code, the serverless platform executes the task. You don't have to tell the serverless provider how many times these events or functions will occur.
In short, with serverless, compute management is abstracted away. You can execute your code without worrying about the infrastructure or server management.
Serverless is not a one-size-fits-all technology. Not every program can run in a serverless environment. It must be custom code that can run in ephemeral containers, an architecture sometimes called function as a service (FaaS).
In FaaS, developers deploy small applications that are initiated and run when they're called by events or HTTP requests. "Functions are a single-purpose block of code," explains Chad Arimura, CEO of developer tool company Iron.io. "It's something you can conceptualize easily: Process an image, transform a piece of data, encode a piece of video, and so on." FaaS code is often (but not always) written in Node.js.
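In that spirit, a FaaS function can be as small as this sketch, which "transforms a piece of data" by normalizing a record; the field names are invented for the example:

```python
# A FaaS-style function in the spirit of Arimura's description: one
# small, single-purpose block of code that transforms a piece of data.
# The record fields are illustrative, not any real platform's schema.

def transform(record: dict) -> dict:
    """Trim whitespace from the name and lowercase the email address."""
    return {
        "name": record.get("name", "").strip(),
        "email": record.get("email", "").strip().lower(),
    }
```

The whole unit of deployment is that one function; the platform wires it to an event source and handles everything else.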
Serverless is new, which means you need to consider its weaknesses before you jump in with both feet.
One of these is portability. A key difference between orchestration and a serverless approach is that orchestration is far more portable. An application built around Lambda, for instance, can't easily be ported to Azure Functions.
This can be a real problem. A serverless approach can lock you in with a specific cloud vendor. That’s particularly onerous because one promise of containers and clouds is to avoid vendor lock-in.
The tools, especially for API gateways, are achingly immature, as Roberts mentions: “So while defining applications with API gateways is possible, it’s most definitely not for the faint-hearted." But he continues, "There are exceptions, however—one example is Auth0 Webtask, which places significant priority on developer UX in its tooling."
Another serverless approach is back end as a service (BaaS). In this API-based model, services autoscale; the containers behind them are transparent to developers and end users.
According to the CNCF, serverless computing works best when workloads are:
- Asynchronous, concurrent, and easy to parallelize into independent units of work.
- Infrequent or with sporadic demand, with large, unpredictable variance in scaling requirements.
- Stateless and ephemeral, without a major need for instantaneous cold start time.
- Highly dynamic, in terms of changing business requirements that drive a need for accelerated developer velocity.
Serverless computing sounds a lot like microservices, doesn't it? Microservices is an architectural style in which applications are made up of loosely coupled services or modules. It lends itself to continuous integration/continuous deployment (CI/CD) pipelines for the production and management of large, complex applications.
Each microservice provides an API endpoint, and services communicate with one another over lightweight protocols such as REST or gRPC. Data tends to be represented as JSON or Protocol Buffers.
With microservices, developers don't know what's going on under the hood, nor do they care whether a service is powered by functions delivered via FaaS. Functions are the building blocks, while the service provides the API.
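To make that concrete, here's a minimal, standard-library-only sketch of a microservice exposing one JSON-over-HTTP endpoint. The service name and route are invented for illustration; a real service would add routing, validation, and health checks.

```python
# Minimal sketch of a microservice API endpoint: one service, one
# JSON-over-HTTP resource, standard library only. The "thumbnailer"
# name and /status route are illustrative.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ThumbnailStatusHandler(BaseHTTPRequestHandler):
    """Serves GET /status as a small JSON resource."""

    def do_GET(self):
        if self.path == "/status":
            body = json.dumps({"service": "thumbnailer", "healthy": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for the example

def start_service(port: int = 0) -> HTTPServer:
    """Run the service on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ThumbnailStatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# Calling the endpoint the way another microservice would:
server = start_service()
url = f"http://127.0.0.1:{server.server_address[1]}/status"
reply = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
```

From the caller's side, the service is just that URL and that JSON shape; whether a container, a VM, or a chain of functions answers the request is invisible.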
If you've been in technology for a while, microservices may remind you of service-oriented architecture (SOA). And you'd be right. Everything old is new again.
In SOA, a service is a well-defined, self-contained function that doesn't depend on the context or state of other services. SOA services are designed to trade data and coordinate with one another, using connections called enterprise service buses. Data is transmitted and received in XML.
The protocols have changed over time. Initially, SOA used object-oriented protocols such as Microsoft's Distributed Component Object Model (DCOM) or object request brokers (ORBs). Today, typical messaging services are Java Message Service (JMS) or Advanced Message Queuing Protocol (AMQP).
The difference between SOA and microservices? With microservices, you're working with a true decoupled architecture. Microservices are also lighter than SOA and thus more flexible. While SOA services are deployed to servers and VMs, microservices are deployed in containers.
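The data-format shift from SOA to microservices is easy to see side by side. This short sketch serializes the same record the SOA-era way (XML) and the microservices way (JSON), using only the standard library:

```python
# The same payload in the two styles: XML, as traded over an
# enterprise service bus in SOA, and JSON, as returned by a typical
# REST microservice. The record itself is illustrative.
import json
import xml.etree.ElementTree as ET

record = {"service": "thumbnailer", "status": "ok"}

# JSON: what a REST microservice would typically return.
as_json = json.dumps(record)

# XML: what an SOA-era service would transmit.
root = ET.Element("response")
for key, value in record.items():
    ET.SubElement(root, key).text = value
as_xml = ET.tostring(root, encoding="unicode")
```

The information is identical; only the envelope, and the tooling around it, changed.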
All these technologies play a role in creating cloud-native computing—that is, software designed from the first to run online. Containerizing an application is not enough to make it “cloud native.”
According to the CNCF—which should know!—cloud-native computing is:
- Containerized. Each part (applications, processes, etc.) is packaged in its own container. This facilitates reproducibility, transparency, and resource isolation.
- Dynamically orchestrated. To optimize resource utilization, containers are actively scheduled and managed.
- Microservices-oriented. Applications are segmented into microservices. This significantly increases the overall agility and maintainability of applications.
"The classic application is a monolithic stack that isn't agile and is fixed in place. The cloud-native approach is about cutting up the various components of application delivery,” explains Dan Kohn, executive director of the CNCF. “Cloud native uses open source software stacks to deploy an application as microservices, packaging each part into its own containers and dynamically orchestrating those containers to optimize resource utilization."
You can use serverless to obtain the same results. Serverless is yet another abstraction layer to keep the dirty details of infrastructure away from developers.
What’s best? When? Where?
Someone needs to keep an eye on things—and that's where all these technologies have problems.
For example, someone must monitor multiple microservices, and for each service, there might be several instances that run in parallel. This adds new levels of complexity to monitoring.
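As a toy illustration of why this is harder than single-instance monitoring, consider rolling per-instance health reports up to per-service status; the report format here is invented for the example.

```python
# Toy sketch of why microservice monitoring gets complex: each service
# may have many parallel instances, so per-instance signals must be
# rolled up before anyone can answer "is the service healthy?".
# The (service, instance, healthy) report shape is illustrative,
# not any real monitoring system's API.

def rollup(reports: list) -> dict:
    """Aggregate (service, instance, healthy) reports per service.

    A service counts as degraded if any of its instances is unhealthy.
    """
    summary = {}
    for service, instance, healthy in reports:
        entry = summary.setdefault(service, {"instances": 0, "unhealthy": 0})
        entry["instances"] += 1
        if not healthy:
            entry["unhealthy"] += 1
    return {
        service: "healthy" if entry["unhealthy"] == 0 else "degraded"
        for service, entry in summary.items()
    }
```

Even this toy version has to make a policy call (one bad instance degrades the whole service); real monitoring stacks face that question for every signal they collect.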
These technologies also raise new storage concerns. As Chris Aniszczyk, the CNCF's chief operating officer, says, “To build successful cloud-native applications, developers have to consider new ways of managing storage." In particular, they need to build and define applications that separate the applications from their data.
Still, these new methods are radically changing how we create, run, and consume software. Every IT department should evaluate how best to integrate them into its software stack.
Containers, microservices, and serverless: Lessons for leaders
- These technologies are changing application deployments and the software design behind them.
- Containers use fewer resources than virtual machines and are easier to spin up and down. They enable developers to isolate dependencies and use well-tuned, smaller components.
- With serverless computing, applications are treated as functions that can be executed on demand.
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.