Serverless Computing

Serverless computing differs from traditional cloud computing concepts (we refer to them as serverful in this paper) in that the infrastructure and the platforms on which the services run are hidden from customers. In this approach, customers are only concerned with the desired functionality of their application; the rest is delegated to the service provider.

The aim of the serverless services is threefold:

  1. relieve the users of cloud services from dealing with the infrastructures or the platforms,
  2. convert the billing model to the pay-as-you-go model,
  3. auto-scale the service per customers’ demand.

As a result, in a truly serverless application, the execution infrastructure is hidden from the customer, and the customer only pays for the resources they actually use. The service is designed to handle request surges rapidly by scaling automatically.

A serverless service can be viewed as a generalization of FaaS and BaaS that incorporates the following characteristics:

  • The execution environment should be hidden from the customer, i.e., the computation node, the virtual machine, the container, and its operating system are all hidden from the customer.
  • The provider should offer auto-scaling, i.e., resources should be made available to the customer instantly on demand.
  • The billing mechanism should only reflect the amount of resources the customer actually uses, i.e., a pay-as-you-go billing model.
  • The provider makes its best effort to complete the customer’s task as soon as it receives the request, and the execution duration is bounded.
  • The basic elements in serverless services are functions. The functions are not hidden from the provider: the provider knows their dependencies on external libraries, their run-time environments, and their state during and after execution.

A serverless application is usually comprised of two parts:

  1. The client, which implements most of the application logic. It interacts with two sides, i.e., the end-user and the provider, invoking functions on one side and translating the results into usable views for the other.
  2. Registered functions on a provider. Functions are uploaded to the provider, which invokes a copy of the function in response to a user request or a predefined event (see the sketch after this list).
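
To make this two-part structure concrete, here is a minimal sketch in the style of the AWS Lambda Python handler convention: a registered function on the provider side, and a client that invokes it and translates the result. The function name "greeter" and the payload fields are illustrative assumptions, not part of any particular application.

```python
# handler.py -- the registered function (provider side).
# Follows the AWS Lambda Python convention: the provider calls
# handler(event, context) once per request or triggering event.
import json

def handler(event, context):
    # 'event' carries the payload supplied by the client or event source.
    name = event.get("name", "world")
    # The return value is serialized and handed back to the caller.
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}!"})}


# client.py -- the client side, which invokes the registered function and
# translates the result into something the end-user can consume.
import boto3

lambda_client = boto3.client("lambda")

def greet(name: str) -> str:
    response = lambda_client.invoke(
        FunctionName="greeter",                       # hypothetical registered name
        Payload=json.dumps({"name": name}).encode(),
    )
    result = json.loads(response["Payload"].read())   # the provider returns a streaming body
    return json.loads(result["body"])["greeting"]
```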

Serverless architecture characteristics

The solution is a serverless architecture. With this approach, you don’t have to purchase, rent, or provision servers or virtual machines to run your code. Serverless architectures are defined by the following features:

  • Built using services (not servers or virtual machines) that are provided and fully managed by the cloud service provider (CSP)
  • Services that support elasticity and fault tolerance while providing enterprise-level global security
  • Automatic scaling
  • Built-in high availability
  • Integrated security
  • Event-driven compute to execute business logic (examples include AWS Lambda, Azure Functions, IBM Bluemix OpenWhisk, and Google Cloud Functions)
  • Pay-as-you-go fee structure (for consumed services only)
  • Developer-productivity centric
  • Promotes continuous build, integration, and deployment efforts
  • Innovation by focusing on developing and deploying business functionality
  • Reduced time to market

Two key drivers

Advances in computing technology have enabled the serverless environment, and enterprises are embracing it because of two key drivers:

New business models:

The digital era has greatly accelerated the pace of change and evolution for new products, services, and business models. Market dynamics are changing faster than ever before. Enterprises are under increasing pressure to release new features and products that meet the exponentially growing expectations of customers. Digital reimagination and the Internet of Things are affecting every industry. This new environment makes the idea of serverless architecture more attractive than ever before. It effectively eliminates the time-consuming and expensive traditional approach: building components in-house, purchasing new hardware, installing servers, configuring, and troubleshooting.

A paradigm shift in the organizational mindset:

DevOps and the agile development culture are driving organizations to fundamentally change the way they develop business applications. Monolithic applications are giving way to microservices, APIs, and function-based execution units. The trend of Continuous Integration and Continuous Deployment (CI/CD) is helping IT match the pace of business agility. The long cycles of change management and waterfall execution models are giving way to weekly or bi-weekly sprints of smaller scrum teams pushing new code changes.

For many enterprises, the big questions today are these:

  1. How can my infrastructure support this agile approach characterized by frequent but small changes?
  2. How can we provide developers and operations teams with an infrastructure that can self-manage, scale on demand, and provide the compute and storage capacity needed to grow?

These trends have fundamentally changed the way enterprises approach infrastructure. The cloud has enabled organizations to build their own enterprise-grade infrastructure at commodity cost. Nonetheless, whether servers reside on premises or in the cloud, you still need to invest in creating, maintaining, and managing the infrastructure to ensure high availability, scalability, and elasticity. For enterprise developers, the resulting dependency on infrastructure in the cloud makes the development and release process slow.

OPPORTUNITIES

This section outlines the opportunities that serverless computing offers.

No deployment and maintenance complexity: 

The first and foremost opportunity that serverless computing offers is relieving users from managing infrastructure. This is already partly accomplished by Infrastructure-as-a-Service (IaaS); however, IaaS users still have to manage their virtual resources, i.e., install and configure the related packages and libraries. Platform-as-a-Service (PaaS) providers such as Heroku have made this management slightly easier, although users still have to configure the application to match the PaaS requirements, which is not a trivial task. Serverless computing takes a big step further: users only have to register their functions and then receive the credentials to invoke them.
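
As a rough illustration of how little is involved, the sketch below registers a zipped function package with AWS Lambda via boto3; the function name, role ARN, runtime identifier, and package path are hypothetical placeholders rather than a prescribed deployment recipe.

```python
# register.py -- minimal sketch of registering a function with a FaaS provider
# (here AWS Lambda via boto3). Name, role ARN, and package path are placeholders.
import boto3

lambda_client = boto3.client("lambda")

with open("handler.zip", "rb") as f:        # zipped deployment package containing handler.py
    package = f.read()

lambda_client.create_function(
    FunctionName="greeter",                  # hypothetical function name
    Runtime="python3.12",                    # assumed runtime identifier
    Role="arn:aws:iam::123456789012:role/lambda-exec",  # placeholder execution role
    Handler="handler.handler",               # module.function entry point
    Code={"ZipFile": package},
)
```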

Affordable scalability:

Another promise of cloud computing is the ability for customers to deploy the functionality of their applications without worrying about the scalability of the execution infrastructure or platform. This scalability is a direct result of the auto-scaling nature of serverless services: for each request, the service invokes a copy of the requested function, and there is virtually no bound on the number of concurrent requests. Each invocation is assigned to the most feasible available resource for execution. The affordability of serverless services is mainly due to the providers’ reduced costs.

There are two main reasons for these reduced costs: (1) resource multiplexing and (2) infrastructure heterogeneity.

Resource multiplexing leads to higher utilization of available resources. For example, consider an application that receives one request every minute, with each request taking milliseconds to complete. In this case, the mean CPU usage is very low, and deploying the application on a dedicated machine would be highly inefficient; many similar applications could share that one machine. Infrastructure heterogeneity means that providers can also use their older machines, which are less attractive for other direct services, to reduce costs (mainly because the execution environment is a black box to the customer). It is important to note that the affordability of serverless services depends on the usage scenario: in some cases, renting a virtual machine is cheaper than using the serverless pay-as-you-go model.
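
To make this trade-off concrete, the back-of-the-envelope calculation below compares pay-as-you-go billing with a dedicated virtual machine for a sporadic and a busy workload. All prices and workload figures are illustrative assumptions, not current list prices.

```python
# cost_comparison.py -- back-of-the-envelope comparison of pay-as-you-go billing
# versus a dedicated virtual machine. All prices are illustrative assumptions.

PRICE_PER_GB_SECOND = 0.0000167    # assumed per-GB-second serverless compute price
PRICE_PER_MILLION_REQUESTS = 0.20  # assumed per-request fee
VM_PRICE_PER_HOUR = 0.05           # assumed small dedicated VM price

def serverless_monthly_cost(requests_per_month, duration_s, memory_gb):
    compute = requests_per_month * duration_s * memory_gb * PRICE_PER_GB_SECOND
    requests = requests_per_month / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    return compute + requests

def vm_monthly_cost(hours=730):
    return hours * VM_PRICE_PER_HOUR

# The sporadic workload from the text: one 100 ms request per minute.
sporadic = serverless_monthly_cost(requests_per_month=43_200, duration_s=0.1, memory_gb=0.128)
# A busy workload: 200 requests per second around the clock.
busy = serverless_monthly_cost(requests_per_month=200 * 86_400 * 30, duration_s=0.1, memory_gb=0.128)

print(f"sporadic workload: serverless ~ ${sporadic:.2f}/month vs VM ~ ${vm_monthly_cost():.2f}/month")
print(f"busy workload:     serverless ~ ${busy:.2f}/month vs VM ~ ${vm_monthly_cost():.2f}/month")
```

Under these assumed prices, the sporadic workload costs pennies per month on serverless but tens of dollars on a dedicated VM, while the busy workload tips the other way, which is exactly the usage-dependence noted above.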

New marketplaces:

With the advent of modern mobile operating systems such as Android and iOS, various marketplaces for applications designed specifically for those operating systems have emerged, such as the Google Play Store and Apple’s App Store. A similar scenario is now appearing for the serverless paradigm: with the growth in the popularity of serverless computing, new marketplaces for functions have emerged. In these markets, developers can sell their functions to others, and every generalized or domain-specific functionality can be bought or offered there. For example, a software developer may need a geospatial function that checks whether a point resides inside a geospatial polygon; they could buy such a function from these markets. The AWS Serverless Application Repository is an example of such a capability, and a quantitative analysis of the functions available in it has been presented. The competition driven by the economics of these markets should lead to high-quality functions in terms of code efficiency, cleanness, documentation, and resource usage. Function markets may present buyers with a catalog entry for every function showing its resource usage and the price it incurs per request, so that buyers can choose from many options for a specific task.

APPLICATIONS

Many real-world serverless applications have been proposed in the literature during the past few years. We categorize these applications into the following domains.

Real-time collaboration and analytics:

The stateless nature of serverless services makes them an attractive platform for real-time collaboration tools such as instant messaging and chatbots. An architecture for a chatbot on OpenWhisk has been proposed, and an XMPP-based serverless approach for instant messaging has also been introduced. Real-time tracking is another example of a collaboration tool that is very suitable for serverless services, as these applications are not heavily dependent on the system’s state; two real-time GPS tracking methods on low-power processors have been presented. Serverless services are also utilized for data analytics applications. In these applications, various sources stream real-time data to a serverless service, which gathers, analyzes, and then presents the data analytics. The auto-scaling feature of serverless computing makes handling massive concurrent data streams possible. Lambada, a serverless data analytics approach, has been proposed that is one order of magnitude faster and two orders of magnitude cheaper than commercial Query-as-a-Service systems.
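
A minimal sketch of such a stream-triggered analytics function is shown below. It assumes a Kinesis-style event in which each record carries a base64-encoded JSON payload; the latency_ms field is an illustrative metric rather than a fixed schema.

```python
# analytics_handler.py -- sketch of a stream-triggered analytics function.
# Assumes a Kinesis-style event where each record carries base64-encoded JSON;
# the "latency_ms" field is an illustrative metric, not a fixed schema.
import base64
import json

def handler(event, context):
    latencies = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        latencies.append(payload["latency_ms"])
    if not latencies:
        return {"count": 0}
    # Aggregate the batch; a real pipeline would persist or forward this summary.
    return {
        "count": len(latencies),
        "mean_latency_ms": sum(latencies) / len(latencies),
        "max_latency_ms": max(latencies),
    }
```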

Urban and industrial management systems: 

The pay-as-you-go model of serverless services has paved the way for the introduction and implementation of various budget-restricted urban and industrial management systems, including an urban smart waste management system, a serverless service for oil and gas field management, and an implementation of a serverless GIS platform for land valuation. The distributed nature and auto-scaling feature of serverless services also make them an apt choice for smart grids: event-driven serverless services have been used to handle SCADA/EMS failure events, and a distributed data aggregation and analytics approach for smart grids has been proposed. Serverless services have also been utilized for urban disaster recovery applications, for example, a community formation method for use after disasters built on serverless services. Migration toward the serverless paradigm seems a reasonable choice for this domain of applications, especially for public sector services or for developing countries, due to its lower deployment overheads and its pay-as-you-go pricing model.

Scientific computing: 

Although serverless computing may not seem an attractive alternative for scientific computing applications, many studies have focused their attention on serverless services for these applications. We believe this disagreement stems from the fact that the range of scientific computing applications is vast, and there are certainly some areas in this domain for which the utilization of serverless services is feasible. Some argue that serverless approaches provide a more efficient platform for scientific and high-performance computing, presenting various prototypes and their respective measurements. This idea is echoed in work proposing high-performance Function-as-a-Service for scientific applications. A serverless tool has been used for linear algebra problems, and a case for matrix multiplication has been presented. The serverless paradigm has also been harnessed for large-scale optimization. Serverless approaches have likewise been used in DNA and RNA computing, for example, exploiting the potential of the serverless paradigm for all-against-all pairwise comparison among all unique human proteins, as well as in on-demand high-performance serverless infrastructures and approaches for biomedical computing. Scientific applications that require extensive fine-grained communication are difficult to support with a serverless approach, whereas those with limited or coarse-grained communication are good candidates. Scientific computations with time-varying resource demands will also benefit from migrating to a serverless paradigm.

Artificial intelligence and machine learning: 

Machine learning in general, and neural network-based learning in particular, is currently among the most attractive research trends. The suitability of the serverless paradigm for this domain has received mixed reactions from both the research and industrial communities. For example, it has been argued that deep learning functions are tightly coupled (they require extensive communication between functions) and are usually compute- and memory-intensive, and that the paradigm is therefore not promising for these applications. Nevertheless, it has also been argued that deep neural networks can benefit from serverless paradigms, as they allow users to decompose complex model training into several functions without managing virtual machines or servers.

Video processing and streaming: 

Serverless approaches have been proposed for video processing. A serverless video processing framework has been presented that exploits intra-video parallelism to achieve low latency and low cost; its authors report that processing a full-length HD movie with 1,000-way concurrency on Amazon Lambda costs about $3 per hour of processed video, with 7.9 times lower latency and 17.2 times lower cost on average compared to serverful alternatives. GPU processing power has also been harnessed in a serverless setting for video processing, and a measurement study has examined contributing factors such as the execution duration and monetary cost of serverless video processing approaches. Serverless video processing and broadcasting applications have gained much traction from both industrial and research communities during the COVID-19 pandemic.
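
The intra-video parallelism described above is essentially a fan-out pattern: split the video into segments and invoke one function copy per segment. Below is a minimal client-side sketch under that interpretation, assuming a hypothetical registered function named process_segment and using a thread pool to issue concurrent invocations.

```python
# fanout.py -- sketch of the fan-out pattern behind intra-video parallelism:
# one function invocation per video segment, issued concurrently.
# The function name "process_segment" and the payload fields are assumptions.
import json
from concurrent.futures import ThreadPoolExecutor

import boto3

lambda_client = boto3.client("lambda")

def process_segment(video_key: str, start_s: float, end_s: float) -> dict:
    response = lambda_client.invoke(
        FunctionName="process_segment",   # hypothetical registered function
        Payload=json.dumps({"video": video_key, "start": start_s, "end": end_s}).encode(),
    )
    return json.loads(response["Payload"].read())

def process_video(video_key: str, duration_s: float, segment_s: float = 10.0):
    starts = [i * segment_s for i in range(int(duration_s // segment_s) + 1)]
    # Each segment is handled by its own function copy; the provider scales them out.
    with ThreadPoolExecutor(max_workers=64) as pool:
        futures = [
            pool.submit(process_segment, video_key, s, min(s + segment_s, duration_s))
            for s in starts if s < duration_s
        ]
        return [f.result() for f in futures]
```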

System and software security:

The power of serverless computing has been leveraged to provide security for various software systems and infrastructures. A mechanism for securing Linux containers has been proposed. Serverless services have also been utilized for intrusion detection, including a serverless, real-time intrusion detection engine built upon Amazon Lambda and a serverless malware detection approach using deep learning. Serverless approaches have likewise been used to ensure data security; a method for automatically securing sensitive data in the public cloud using serverless architectures has been introduced. We believe that the serverless approach has great potential to improve the security of systems and services, for several reasons:

(1) Security threats are often ad-hoc in nature. In these cases, the pay-as-you-go pricing leads to reduced costs.

(2) Some of the attacks exhibit sudden traffic bursts. The auto-scaling feature of the serverless services facilitates the handling of such a scenario.

(3) Attackers may conduct widespread attacks interrupting various components and infrastructures of the victim.

Serverless functions are standalone in the sense that they can be executed in various execution environments, which can help a victim keep functionality running when parts of its own infrastructure are disrupted. Conversely, we think that preventing attackers from using serverless infrastructures to conduct these types of attacks is an important issue that needs to be addressed.

Internet of Things (IoT):

The serverless computing paradigm has been exploited in various IoT domains. Using a real-world dataset, it has been shown that a serverless approach to managing IoT traffic is feasible and uses fewer resources than a typical serverful approach. Other examples include a serverless fog computing approach to support data-centric IoT services and a smart IoT approach built on serverless and microservice architectures. Serverless paradigms have also been utilized in coordination control platforms for UAV swarms. In another research direction, a flexible and intuitive serverless platform for IoT has been proposed, and a decentralized framework for serverless edge computing in the Internet of Things has been presented; its objective is to form a decentralized FaaS-like execution environment (using in-network executors) and to efficiently dispatch tasks so as to minimize response times. A framework has also been developed that brings FaaS to microcontroller-based IoT devices using a Python runtime system. It is reasonable to conclude that serverless services can act as feasible back-ends for IoT applications with infrequent and sporadic requests, and when rapid, unpredictable surges of requests emerge, serverless services can handle them conveniently because they auto-scale rapidly.
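
As a small illustration of a serverless service acting as an IoT back-end, the device-side sketch below posts a sporadic sensor reading to an HTTP endpoint fronting a function; the endpoint URL and payload fields are hypothetical placeholders.

```python
# iot_telemetry.py -- sketch of an IoT device posting sporadic telemetry to a
# serverless HTTP endpoint (e.g. an API gateway fronting a function).
# The endpoint URL and payload fields are hypothetical.
import json
import time
import urllib.request

ENDPOINT = "https://example-api.invalid/telemetry"  # placeholder URL

def send_reading(sensor_id: str, value: float) -> int:
    payload = json.dumps({"sensor": sensor_id, "value": value, "ts": time.time()}).encode()
    request = urllib.request.Request(
        ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return response.status   # the backing function scales with the request rate
```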

BENEFITS

Reduce costs:

Like cloud services, serverless is a new way of offloading IT overhead. A serverless architecture eliminates the responsibility of managing servers, databases, and even application logic, reducing set-up and maintenance costs. You only pay for the time your code executes, reducing operational costs. Serverless architecture also lowers cloud administration costs (cloud server management and the associated people costs).

Rapid development and deployment:

Serverless architectures are built to enhance developer productivity and to make build, test, and release cycles inherently agile. With the serverless approach, you can do as many test runs as you like without having to worry about when your infrastructure will be ready or when other components in the solution will be available for rollout. Cloud service providers are also investing in standardizing development environments to encourage the use of serverless architectures (for example, the 2016 announcement of AWS Lambda support for C#).

Reduced time to market:

By using a serverless architecture, you can turn ideas into reality in a matter of minutes or hours. Serverless architectures also enable running multiple versions of code to meet tight deadlines. For example, developing a feature that returns a credit score for mobile users as part of your mobile banking app could require days or even a week to develop, test, and deliver with a traditional cloud IaaS model (such as AWS EC2). Using AWS Lambda (serverless, event-driven computing), you can develop the same functionality in a matter of hours. It takes just a few clicks to provision serverless services with scaling, fault tolerance, and elasticity all built in.
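
For a rough sense of scale, the handler below is essentially all the provider-side code such a credit-score feature would need; the scoring rule and field names are placeholders rather than a real scoring model, and the event shape assumes an API-gateway-style JSON body.

```python
# credit_score_handler.py -- illustrative handler for the credit-score example.
# The scoring rule and field names are placeholders, not a real scoring model.
import json

def handler(event, context):
    body = json.loads(event.get("body", "{}"))     # API-gateway-style JSON body (assumption)
    income = float(body.get("annual_income", 0))
    debt = float(body.get("outstanding_debt", 0))
    # Toy rule: start from a base score and penalize the debt-to-income ratio.
    ratio = debt / income if income else 1.0
    score = max(300, min(850, int(750 - 300 * ratio)))
    return {"statusCode": 200, "body": json.dumps({"credit_score": score})}
```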

Built-in scaling:

Like cloud services, serverless offerings have built-in scalability. There’s no need for guesswork when it comes to scaling policies or over-/under-provisioning concerns. Just pay for the service usage, and the serverless architecture infrastructure will grow or shrink based on demand.

Failover:

Disaster recovery is integrated into CSP offerings. Because serverless components are based on the pay-per-use model, setting up failover infrastructure in paired regions of a given geography comes at a fraction of the cost of a traditional server-based architecture. An additional benefit is bringing the recovery time objective (RTO) down to near zero, making seamless switchover possible at a fraction of the cost of existing setups.

RISKS

Loss of control over infrastructure:

The cloud service provider controls the underlying infrastructure, so you will not be able to customize or optimize it to suit specific needs. CSP-established service limits for serverless components may also challenge their applicability for your use case. In addition, multiple customers sharing the same serverless infrastructure may raise security concerns; CSPs are addressing these concerns by allowing customers to use serverless offerings within a virtual private network.

Lock-in:

Switching from one vendor’s serverless offering to another’s may require significant time and effort. (TCS Digital Enterprise offers frameworks and professional services to help customers choose the right set of serverless components and make cloud portability a possibility.)

Compliance concerns:

CSPs are responsible for vulnerability scanning and penetration testing of the infrastructure underlying their serverless offerings, but as a consumer of these offerings, you cannot perform such tests yourself. For example, you cannot run a penetration test on the infrastructure underlying your AWS Lambda function. For most customers this may not be an issue, but if your use case requires you to perform penetration tests on the infrastructure for compliance or legal reasons, you may prefer a more traditional, server-based approach.

Monitoring, logging, and debugging:

Monitoring, logging, and debugging of a serverless architecture may often require customized code and/or third-party software, adding further costs.

Conclusion

Adopting serverless can deliver many benefits, but the road to serverless can get challenging depending on the use case. Like any new technology innovation, serverless architectures will evolve en route to becoming a well-established standard. While serverless architecture may not be a solution to every IT problem, it surely represents the future of many kinds of computing solutions in the coming years.