
Why You Should Go Serverless in the Cloud

The cloud is all about abstraction: Users get the technology services they need from “somewhere out there” without having to worry about the actual hardware and other components involved. However, lots of developers still have to think carefully about what server processing and storage to provision and manage when they build, test and run applications in the cloud.

A new trend, “serverless computing,” takes application development up another abstraction notch. Serverless computing isn’t really serverless: Programming functions still have to run on server and storage hardware. The term “serverless” means that developers or DevOps teams no longer have to worry about what and how much server compute and storage to provision and manage for their application development, deployment and scalability.

Now, with various cloud providers offering APIs to control the infrastructure and its components, enterprises have a new layer of abstraction. In fact, they don't even have to run all the code together on the same hardware resources, or even with the same cloud service. Instead, they can slice and dice their applications into discrete functions (snippets of programming code that run a task in response to events) and let an automated resource scheduler, in a public cloud or an in-house private cloud, worry about when, where and how to run and scale them. This allows any enterprise to adopt a "provision-as-you-go" model within its ecosystem.
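To make the idea concrete, here is a minimal sketch in plain Python of the kind of event-driven function a serverless scheduler would run. The event shape and handler name are illustrative assumptions, not any provider's actual API.

```python
# Illustrative sketch of a serverless-style function: a small, stateless
# snippet that handles one event and returns a result. The event fields
# and handler name are hypothetical, not a specific provider's API.

def resize_image_handler(event):
    """Respond to a single 'image uploaded' event."""
    # The scheduler decides when and where this runs; the code sees only the event.
    width = event["width"]
    height = event["height"]
    max_side = event.get("max_side", 1024)

    scale = min(1.0, max_side / max(width, height))
    return {
        "width": int(width * scale),
        "height": int(height * scale),
        "scaled": scale < 1.0,
    }

# Example invocation, as a platform might perform it on an upload event:
result = resize_image_handler({"width": 4096, "height": 2048})
```

Because the function is stateless and keyed only to its event, the platform is free to run many copies in parallel, on whatever hardware is available, and to run none at all when no events arrive.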

Why Is Serverless Gaining Momentum?

Amazon was the first to introduce serverless computing, with its Lambda service. Lambda's serverless approach lets developers focus on their functions rather than on the underlying hardware. When a serverless function is called, Lambda automatically provisions the necessary hardware and other resources in the cloud, verifies the required security measures, and bills the user based on the number of requests and the compute time used, in increments of 100 milliseconds. You no longer have to pay by the hour to run a certain level of processing power. And if the function is never called, you don't pay anything.
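As a rough illustration of that billing model, the sketch below computes a monthly charge from a request count and an average invocation duration, rounding each invocation up to the next 100 ms block. The rates are made-up placeholders for illustration, not actual Lambda pricing.

```python
import math

# Illustrative per-request, per-100ms cost model.
# The rates below are hypothetical placeholders, NOT real Lambda prices.
PRICE_PER_REQUEST = 0.0000002   # assumed $ per invocation
PRICE_PER_100MS = 0.00000025    # assumed $ per 100 ms block of compute

def monthly_cost(requests, avg_duration_ms):
    """Total charge = per-request fee + compute time in 100 ms blocks."""
    blocks_per_request = math.ceil(avg_duration_ms / 100)
    request_charge = requests * PRICE_PER_REQUEST
    compute_charge = requests * blocks_per_request * PRICE_PER_100MS
    return request_charge + compute_charge

# 3 million invocations averaging 120 ms each are billed as
# two 100 ms blocks apiece; zero invocations cost nothing.
cost = monthly_cost(3_000_000, 120)
```

The key property is the last one: with no traffic, the bill is zero, which is the opposite of paying hourly for an idle server.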

Other big players—IBM, Microsoft and Google—are in various stages of following suit with OpenWhisk, Azure Functions and Cloud Functions, respectively. Meanwhile, Iron.io's platform, IronWorker, leverages Docker containers to support a huge number of programming languages.

In a nutshell, the three key reasons that serverless computing is gaining so much momentum are:

  1. The enterprise movement toward hybrid/multi-cloud deployments requires more granular control over how resources are allocated.
  2. The availability of more API-based platforms makes it easier for enterprises to program to serverless computing and increases stickiness.
  3. The increasing adoption of Platform-as-a-Service makes it easier and more cost-effective to manage and control application development resources as OPEX rather than CAPEX.

Turning on the Tap for Continuous Development

There’s high interest in serverless computing among mobile and IoT developers, who see it as an obvious step toward a cloud utility ideal, similar to turning a dial to get more water. Following in the footsteps of containers, serverless computing simplifies software development and deployment and adds more power to event-driven computing. In fact, most serverless providers provision each uploaded code snippet as a container, achieving economies of scale on demand rather than provisioning static virtual machines. In the blog article “Continuous Application Delivery with Containers at Equinix,” we describe how we achieved continuous application delivery through containerization using Docker and Jenkins.

Of course, distributing and running software functions in the cloud can be a performance killer without fast, reliable interconnections. Andrew Reichman, research director at 451 Research, said viable production use cases will require “more service-level agreement language around latency.” Direct, secure interconnection that bypasses the performance bottlenecks and security risks of the public Internet and enables multi-cloud interconnection is also prescribed.

Serverless computing hasn’t matured yet. But if you are considering running your workload as an event-based process in the cloud and want it to scale on demand, it’s your best bet.
