A Serverless future outside the cloud?

Serverless is no longer considered just Function-as-a-Service (FaaS), but rather a new application development model that relies on ephemeral compute and, most importantly, on managed services. Serverless developers are happy to rely on ready-to-use solutions that spare them the effort of managing the key components of their infrastructure. Today, it is a service offered almost exclusively by the major cloud providers... But could we imagine Serverless services outside the realm of cloud providers?

Self-managed Serverless

The first step towards cloudless Serverless would be self-managed Serverless. It would give companies all the advantages of Serverless (simplicity of use, auto-scaling, scaling to zero…) without being bound to any cloud provider. Open-source solutions like Knative, OpenFaaS, or KEDA bring serverless capabilities to Kubernetes clusters. You might think that this is not truly Serverless, because infrastructure management is still required and these solutions offer close to no managed services. But if a company wants to run its infrastructure on-premises, or with a hybrid-cloud or multi-cloud approach, while still providing its developers with a serverless experience, then from the developer's point of view, this is Serverless.
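To make this concrete, platforms like Knative or OpenFaaS ultimately run ordinary containerized HTTP handlers. The minimal sketch below (in Go, purely illustrative and not taken from any of these projects) shows the kind of function a developer would write and package into a container image; the PORT environment variable convention follows Knative's runtime contract, everything else is an assumption.

```go
// main.go: a minimal, illustrative HTTP function for a self-managed
// serverless platform. Packaged into a container image, it could be
// deployed as a Knative Service or an OpenFaaS function.
package main

import (
	"fmt"
	"net/http"
	"os"
)

func handler(w http.ResponseWriter, r *http.Request) {
	// A trivial workload: echo back the request path.
	fmt.Fprintf(w, "Hello from a self-managed serverless function: %s\n", r.URL.Path)
}

func main() {
	// Knative injects the port to listen on via the PORT environment
	// variable; fall back to 8080 when running locally.
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}
	http.HandleFunc("/", handler)
	http.ListenAndServe(":"+port, nil)
}
```

The platform then takes care of scaling such a container up with traffic and back down to zero when it is idle, which is exactly the developer experience described above.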

Apart from giving developers simpler deployment and management capabilities, having a serverless system at the company level can be cost-efficient thanks to the mutualization of infrastructure.

In the public cloud, the benefits are twofold: service consumers only pay for the resources they use, and the cloud provider maximizes its resource usage. In a self-managed context, both benefits accrue to the organization that set the platform up.

Scaleway Serverless Service Simplified Architecture

Serverless enables you to use the available resources to their full capacity.

Self-hosting your serverless solution also gives you the opportunity to optimize performance: for example, you can choose the hardware and adapt it to your needs. Security constraints on shared servers are also less challenging than with a cloud provider, since you can trust the software that runs on your infrastructure.

At the end of the day, self-hosting can be a viable solution, but it requires dedicated resources (infrastructure and human resources) to ensure service availability.

Serverless for edge applications

A second step would be to look at the edge. The edge refers to every system between the cloud provider and the end user, from Content Delivery Networks (CDNs) and Communication Service Providers to customer-owned edge networks (factories, offices, retail store servers) and even IoT devices.

The ability to perform most operations closer to the data source/user enables lower latency and better data privacy (operations can be run “on-site”), as well as network cost savings.

Serverless is a very promising solution for edge applications, as it allows tasks to be performed on-site, rather than in a data center as is usually the case, using only the required resources.
For example, operations like verifying a signature, hashing data, or performing a simple media transformation require fast response times but little computing power; they can therefore run on Content Delivery Networks, where some of the data is already replicated.
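As a purely illustrative sketch of such a lightweight task (the function names are ours, not tied to any CDN product), the following Go snippet hashes a payload and verifies its Ed25519 signature, both cheap, CPU-only operations well suited to an edge node:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"crypto/sha256"
	"fmt"
)

// verifyAtEdge sketches the kind of lightweight check an edge node could run
// without calling back to a central data center: hash the payload and verify
// its signature.
func verifyAtEdge(pub ed25519.PublicKey, payload, sig []byte) (string, bool) {
	digest := sha256.Sum256(payload)           // cheap, CPU-only hashing
	valid := ed25519.Verify(pub, payload, sig) // fast signature check
	return fmt.Sprintf("%x", digest), valid
}

func main() {
	// Throwaway key pair, generated only for this illustration.
	pub, priv, _ := ed25519.GenerateKey(rand.Reader)
	payload := []byte("request body served from the closest CDN node")
	sig := ed25519.Sign(priv, payload)

	digest, valid := verifyAtEdge(pub, payload, sig)
	fmt.Printf("sha256=%s signature valid=%v\n", digest, valid)
}
```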

To run at the edge, however, serverless workloads need to be lighter and start faster than they do in the cloud. They also need to abstract the infrastructure further, so they can run on lighter nodes that the cloud provider does not necessarily manage directly.

This constraint has enabled the emergence of new server-side technologies relying on WebAssembly (WASM), a binary instruction format (into which programs are compiled) that is also an open web standard supported by every modern web browser. Because its sandbox provides strong isolation while still running software at near-native speed, WebAssembly is becoming more and more popular in both web development and the cloud industry, with several CNCF projects (Krustlet, wasmCloud) aiming to bring WASM's benefits from the edge to the cloud. WASM even promises to erase the notion of cold start: according to Cloudflare and Fastly, the startup time of their WASM-based services is lower than the TLS handshake, which makes it feel instantaneous to their users.

source: Cosmonic Corp
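As an illustration only (the build target below is an assumption about tooling, not a description of Cloudflare's or Fastly's internals), a small Go workload can be compiled into a portable, sandboxed WASM module with the standard toolchain and then started by a WASM runtime far faster than a VM or container:

```go
// A minimal sketch of a workload shipped as a WebAssembly module instead of
// a full container image. With Go 1.21+ it can be built for WASI:
//
//	GOOS=wasip1 GOARCH=wasm go build -o transform.wasm .
//
// The resulting transform.wasm is small, sandboxed, and can be executed by a
// WASM runtime (wasmtime, a wasmCloud host, ...) with very low startup time.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	// Upper-case each line read from stdin: a stand-in for a simple,
	// CPU-light transformation running at the edge.
	scanner := bufio.NewScanner(os.Stdin)
	for scanner.Scan() {
		fmt.Println(strings.ToUpper(scanner.Text()))
	}
}
```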

Decentralized computing and Web 3.0 on Serverless

Finally, Serverless helped shift the old paradigm in which developers and organizations needed to master their entire stack, from hardware to software, in order to run high-performing applications. They could instead use managed services and focus on their core development tasks rather than managing computing resources. This first abstraction of the underlying infrastructure is culturally paving the way towards a new paradigm at the core of Web 3.0: decentralized computing.

Decentralized Applications, also known as DApps, run on blockchain technology such as Ethereum or Dfinity. And blockchain technology is truly serverless, as it runs on a federation of nodes (the "world computer") rather than on a single server. With such a system, you no longer need to trust a third party to guarantee that financial transactions are completed, contracts are enforced, or simply that your data is stored. Your application runs as soon as it is deployed, and you don't have to worry about one of your providers failing. End-user offerings in Web 3.0 are still limited to specific use cases like storing information (digital or physical asset ownership contracts, financial transactions, etc.), as well as existing Web 2.0 services rebuilt with decentralized tools for better privacy and fairer revenue distribution.

Source: Schema of the current dependence of decentralized applications on cloud providers

For now, most DApps only rely on the blockchain to perform transactions through smart contracts, while using conventional services to host their frontend and backend, as blockchain-based response times are not yet on par with current web industry standards.
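As a hedged sketch of this hybrid setup (the node URL is a placeholder, and eth_blockNumber is just a simple read-only call chosen for illustration), a conventionally hosted backend typically reaches the chain through a node's JSON-RPC endpoint:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// rpcRequest is the standard Ethereum JSON-RPC envelope.
type rpcRequest struct {
	JSONRPC string `json:"jsonrpc"`
	Method  string `json:"method"`
	Params  []any  `json:"params"`
	ID      int    `json:"id"`
}

func main() {
	// Placeholder endpoint: in practice this would be your own node or a
	// node provider reachable from the conventionally hosted backend.
	const nodeURL = "http://localhost:8545"

	body, _ := json.Marshal(rpcRequest{
		JSONRPC: "2.0",
		Method:  "eth_blockNumber", // read-only call used for illustration
		Params:  []any{},
		ID:      1,
	})

	resp, err := http.Post(nodeURL, "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("node unreachable:", err)
		return
	}
	defer resp.Body.Close()

	var result struct {
		Result string `json:"result"` // hex-encoded latest block number
	}
	json.NewDecoder(resp.Body).Decode(&result)
	fmt.Println("latest block:", result.Result)
}
```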

Thanks to many recent developments, developers increasingly use fully decentralized solutions to run their applications.

For example, IPFS (InterPlanetary File System) makes it possible to store files like images and frontend code on its fully decentralized network. It makes them immutable, impossible to remove, and helps reduce bandwidth costs since the content is delivered from the closest node.
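For instance, assuming a local Kubo (go-ipfs) node exposing its default HTTP RPC API on port 5001, a sketch of publishing a frontend file could look like this; the returned CID is the content address under which the file becomes retrievable from any node:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"mime/multipart"
	"net/http"
)

// addToIPFS uploads a blob to a local Kubo (go-ipfs) node through its HTTP
// RPC API and returns the raw JSON response, which contains the content
// hash (CID) under which the data becomes addressable on the network.
// The node address is an assumption: Kubo listens on 127.0.0.1:5001 by default.
func addToIPFS(name string, data []byte) (string, error) {
	var buf bytes.Buffer
	w := multipart.NewWriter(&buf)
	part, err := w.CreateFormFile("file", name)
	if err != nil {
		return "", err
	}
	part.Write(data)
	w.Close()

	resp, err := http.Post("http://127.0.0.1:5001/api/v0/add", w.FormDataContentType(), &buf)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	out, err := io.ReadAll(resp.Body)
	return string(out), err
}

func main() {
	res, err := addToIPFS("index.html", []byte("<h1>frontend served from IPFS</h1>"))
	if err != nil {
		fmt.Println("is a local IPFS node running?", err)
		return
	}
	fmt.Println(res) // JSON including the file's CID
}
```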

So can Serverless be cloudless?

From a cloud provider's perspective, even though these trends might seem threatening, as they create substitutes for cloud-based infrastructure, they actually represent an opportunity for us to benefit from new technologies to develop and enhance our own products. For example, our serverless products (Serverless Containers and Serverless Functions) are based on open-source technologies that can also be used to build self-hosted serverless solutions, and we are looking into WASM to enhance their performance. In the same way, blockchain-based software can be used to improve our resiliency by better distributing resources and data across our data centers, improving security.

From a user's point of view, it is easier to start testing new features or products with managed services like Scaleway's Serverless Functions or Containers before switching to self-hosted, multi-cloud, or distributed solutions.
