Functions - Concepts

Reviewed on 14 November 2024

Build step

Serverless Functions must be built before they can run. This build step happens automatically during deployment.

Once the Function is built into an image, it is pushed to the Container Registry.

Cold Start

Cold Start is the time a function takes to handle a request when it is called for the first time.

The startup process involves the following steps:

  • Downloading the container image (which contains the built Function) to our infrastructure
  • Starting the container and the runtime
  • Waiting for the container to be ready.

See how to reduce cold starts for more information.

Container Registry

Container Registry is the place where the images of your Serverless Functions are stored before being deployed.

CRON trigger

A CRON trigger is a mechanism used to automatically invoke a Serverless Function at a specific time on a recurring schedule. It works similarly to a traditional Linux cron job, using the * * * * * format, and uses the UTC time zone. Refer to our cron schedules reference for more information.
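
The five fields of the format are, in order: minute, hour, day of month, month, and day of week. As a purely illustrative sketch (not a Scaleway API call), the Python snippet below labels the fields of a schedule such as 0 */2 * * *, which fires at minute 0 of every second hour, UTC:

```python
# Illustrative only: label the five fields of a standard cron expression.
FIELDS = ["minute", "hour", "day of month", "month", "day of week"]

def describe(expression: str) -> dict:
    """Map each field of a five-field cron expression to its name."""
    values = expression.split()
    if len(values) != len(FIELDS):
        raise ValueError("expected a five-field cron expression")
    return dict(zip(FIELDS, values))

print(describe("0 */2 * * *"))
# {'minute': '0', 'hour': '*/2', 'day of month': '*', 'month': '*', 'day of week': '*'}
```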

Custom domain

By default, a generated endpoint is assigned to your Serverless resource. Custom domains allow you to use your own domain instead - see our custom domain documentation for full details.

Endpoint

An endpoint is the URL generated to access your resource. It can be customized with custom domains.

Environment variables

Environment variables are key/value pairs injected into your container. They are useful for passing information, such as configuration settings, to your function. Some names are reserved: see the details on reserved names.
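
As a minimal sketch, injected variables can be read with the Python standard library; the variable names below are purely illustrative:

```python
import os

# Environment variables defined on the function or its namespace are
# available to the running code like any other environment variable.
database_url = os.environ.get("DATABASE_URL")    # illustrative name
log_level = os.environ.get("LOG_LEVEL", "INFO")  # with a fallback default
```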

GB-s

Unit used to measure the resource consumption of a Serverless Function. It reflects the amount of memory consumed over time.
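
As a rough illustration with arbitrary numbers, GB-s is memory (in GB) multiplied by execution time (in seconds):

```python
# Illustrative GB-s calculation with arbitrary numbers.
memory_mb = 256           # memory allocated to the function
duration_s = 0.120        # average execution time per invocation, in seconds
invocations = 1_000_000   # invocations over the billing period

gb_s = (memory_mb / 1024) * duration_s * invocations
print(f"{gb_s:,.0f} GB-s")  # 30,000 GB-s
```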

JWT Token

JWT (JSON Web Token) is an access token you can create from the console or API to enable an application to access your private function. Find out how to secure a Function.
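
As a sketch, an application can send the token with each request; the endpoint below is a placeholder, and the X-Auth-Token header is an assumption - check the linked documentation for the exact mechanism:

```python
import os
import requests

# Placeholder endpoint: replace with your private function's endpoint.
FUNCTION_ENDPOINT = "https://example.functions.fnc.fr-par.scw.cloud"

# The JWT is generated from the console or API; here it is read from an
# environment variable (illustrative name).
token = os.environ["FUNCTION_JWT"]

# Assumption: the token is passed in the X-Auth-Token header.
response = requests.get(FUNCTION_ENDPOINT, headers={"X-Auth-Token": token})
response.raise_for_status()
print(response.text)
```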

Handler

A handler is a routine/function/method that processes specific events. Upon invoking your function, the handler is executed and returns an output. Refer to our dedicated documentation for more information on the structure of a handler.
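
As a minimal Python sketch, assuming the handle(event, context) signature and the event/response structure described in the handler documentation linked above:

```python
import json

def handle(event, context):
    """Minimal handler sketch: reads the request body and returns an
    HTTP-style response. Field names are assumptions based on the
    handler documentation."""
    payload = json.loads(event.get("body") or "{}")
    name = payload.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello {name}"}),
    }
```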

Instance

A Serverless Function instance handles incoming requests based on factors like the request volume, min scale, and max scale parameters.

Load balancing

The Serverless infrastructure manages incoming request traffic. In scenarios like sudden traffic spikes or load testing, resources are automatically scaled based on the max scale parameter to handle the load.

Logging

Serverless offers a built-in logging system based on Scaleway Cockpit to track the activity of your resources: see monitoring Serverless Functions.

Max scale

This parameter sets the maximum number of function instances. You should adjust it based on your function’s traffic spikes, keeping in mind that you may wish to limit the max scale to manage costs effectively.

Metrics

Performance metrics for your Serverless resources are natively available: see monitoring Serverless Functions.

Min scale

Customizing the minimum scale for Serverless can help ensure that an instance remains pre-allocated and ready to handle requests, reducing delays associated with cold starts. However, this setting also impacts the costs of your Serverless Function.
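
As a rough, illustrative estimate (arbitrary numbers; actual billing depends on your pricing and free tier), keeping instances pre-allocated reserves memory over time, billed in GB-s:

```python
# Illustrative estimate of the resources reserved by pre-allocated instances.
memory_mb = 256                      # memory allocated per instance
min_scale = 1                        # instances kept pre-allocated
seconds_per_month = 30 * 24 * 3600   # 2,592,000 seconds

reserved_gb_s = (memory_mb / 1024) * min_scale * seconds_per_month
print(f"{reserved_gb_s:,.0f} GB-s reserved per month")  # 648,000 GB-s
```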

Namespace

A namespace is a project that allows you to group your functions.

Functions in the same namespace can share environment variables and access tokens, defined at the namespace level.

NATS trigger

A NATS trigger is a mechanism that connects a function to a NATS subject and invokes the function automatically whenever a message is published to the subject.

For each message that is sent to a NATS subject, the NATS trigger reads the message and invokes the associated function with the message as the input parameter. The function can then process the message and perform any required actions, such as updating a database or sending a notification.

Privacy policy

A function’s privacy policy defines whether a function may be executed anonymously (public) or only via an authentication mechanism provided by the Scaleway API (private).

Queue trigger

A queue trigger is a mechanism that connects a function to a queue created with Scaleway Queues, and invokes the function automatically whenever a message is added to the queue.

For each message that is sent to a queue, the trigger reads the message and invokes the associated function with the message as the input parameter. The function can then process the message and perform any required actions, such as updating a database or sending a notification.
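
As a sketch of such a function in Python, assuming the message is delivered in the event's body field (treat the exact event layout as an assumption):

```python
import json

def handle(event, context):
    """Sketch of a function invoked by a queue trigger: the message is
    passed as the input parameter."""
    message = json.loads(event.get("body") or "{}")
    order_id = message.get("order_id")  # illustrative field name
    # Process the message here: update a database, send a notification, etc.
    return {"statusCode": 200, "body": f"processed order {order_id}"}
```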

Rolling update

When deploying a new version of a Serverless Function, a rolling update is applied by default. This means that the new version of the service is gradually rolled out to your users without downtime. Here is how it works:

  • When a new version of your function is deployed, the platform automatically starts routing traffic to the new version incrementally, while still serving requests from the old version until the new one is fully deployed.
  • Once the new version is successfully running, we gradually shift all traffic to it, ensuring zero downtime.
  • The old version is decommissioned once the new version is fully serving traffic.

This process ensures a seamless update experience, minimizing user disruption during deployments. If needed, you can also manage traffic splitting between versions during the update process, allowing you to test new versions with a subset of traffic before fully migrating to them.

Runtime

The runtime is the execution environment of your function. For Serverless Functions, it corresponds to the language in which your code is written.

Sandbox

A sandbox is an isolation area for your function. Serverless Functions offer two sandboxing environments:

  • v2 - Recommended for faster cold starts.
  • v1 - Legacy sandboxing with slower cold starts, but full support for the Linux system call interface.

Scale to zero

One of the advantages of Serverless Functions is that when your function is not triggered, it does not consume any resources, which allows for significant savings.

Scaling

Serverless Functions make scaling your application transparent: up to 50 instances of your function can run at the same time.

Secrets

Secrets are an extra-secure type of environment variable. They are injected into your function and stored securely, but are not displayed in the console after initial validation.

Serverless

Serverless allows you to deploy your Functions (FaaS) and Containerized Applications (CaaS) in a managed infrastructure. Scaleway ensures the deployment, availability, and scalability of all your projects.

Serverless Framework

Serverless.com (Serverless Framework) is a tool that lets you deploy serverless applications without having to call the Serverless Functions API directly: write your configuration in a YAML file and deploy, and the framework handles everything else.

Serverless Function

Serverless Functions are serverless, fully managed compute services that allow you to run small, stateless code snippets or functions in response to HTTP requests or events.

These functions automatically scale based on demand and are designed to be lightweight, event-driven, and easily deployable, eliminating the need to worry about infrastructure management. Serverless Functions are built on top of Serverless Containers, meaning you can run your functions packaged in containers and have them scale efficiently.

Serverless Job

Serverless Jobs are similar to Serverless Functions but are better suited for running longer workloads. See the comparison between Serverless products for more information.

Stateless

Refers to a system or application that does not maintain any persistent state between executions. In a stateless environment, each request or operation is independent, and no information is retained from previous interactions.

This means that each request is treated as a new and isolated event, and there is no need for the system to remember previous states or data once a task is completed. Statelessness is commonly used in serverless architectures where each function execution is independent of others.

To store data you can use Scaleway Object Storage, Scaleway Managed Databases, and Scaleway Serverless Databases.
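
For example, since Scaleway Object Storage is S3-compatible, a function can persist state with an S3 client such as boto3. This is only a sketch: the bucket name and key are placeholders, and credentials are assumed to be provided through the standard S3 environment variables:

```python
import boto3

# Persist state outside the function in S3-compatible Object Storage.
# Bucket, key, and region are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.fr-par.scw.cloud",
    region_name="fr-par",
)
s3.put_object(
    Bucket="my-function-state",
    Key="counters/last-run.json",
    Body=b'{"last_run": "2024-11-14T00:00:00Z"}',
)
```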

Status

A Serverless Function can have the following statuses:

  • Ready: your Serverless Function is operational to serve requests.
  • Pending: your resource is under deployment.
  • Error: something went wrong during the deployment process or while building the source code into an image. Check our troubleshooting documentation to solve the issue.

Timeout

The timeout is the maximum length of time your handler can spend processing a request before being stopped. This value must be in the range 10s to 900s.

Trigger

In a serverless architecture, a function is not running constantly, but is rather triggered by an event.

A trigger is a mechanism that connects the function to an event source and enables the function to execute automatically in response to specific events.

Triggers can take many forms, such as HTTP requests, messages from a queue or a stream, CRON schedules, etc.

vCPU-s

Unit used to measure the resource consumption of a Serverless Function. It reflects the amount of vCPU used over time.
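
As with GB-s, a rough illustration with arbitrary numbers:

```python
# Illustrative vCPU-s calculation with arbitrary numbers.
vcpu = 0.25               # vCPU allocated to each instance
duration_s = 0.120        # average execution time per invocation, in seconds
invocations = 1_000_000   # invocations over the billing period

vcpu_s = vcpu * duration_s * invocations
print(f"{vcpu_s:,.0f} vCPU-s")  # 30,000 vCPU-s
```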
