Serverless computing is a cloud execution model where the cloud provider dynamically manages the allocation of compute resources. Unlike traditional hosting models, you do not need to provision, scale, or maintain servers. Instead, you focus solely on writing and deploying your code, and the infrastructure scales automatically to meet demand.
These services allow you to build highly scalable, event-driven, and pay-as-you-go solutions. Serverless Containers and Functions help you create applications and microservices without worrying about server management, while Serverless Jobs lets you run large-scale, parallel batch-processing tasks efficiently. This can lead to faster development cycles, reduced operational overhead, and cost savings.
With serverless, you only pay for the computing resources you use. There are no upfront provisioning costs or paying for idle capacity. When your application traffic is low, the cost scales down, and when traffic spikes, the platform automatically scales up, ensuring you never overpay for unused resources.
Scaling in Serverless Containers and Serverless Functions is handled automatically by the platform. When demand increases (more requests or events), the platform spins up additional instances to handle the load. When demand decreases, instances spin down. This ensures optimal performance without manual intervention.
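As a rough illustrative model only (not Scaleway's actual scaling algorithm), you can picture the instance count as the request rate divided by each instance's concurrency, clamped between the configured minimum and maximum scale:

```python
import math

def instances_needed(requests_per_second: float, concurrency_per_instance: int,
                     min_scale: int, max_scale: int) -> int:
    """Toy autoscaling model: enough instances to absorb the load,
    clamped between the configured min and max scale."""
    raw = math.ceil(requests_per_second / concurrency_per_instance)
    return max(min_scale, min(raw, max_scale))

# Low traffic needs a single instance; a spike scales up, capped at max_scale.
print(instances_needed(3, 50, min_scale=0, max_scale=20))     # → 1
print(instances_needed(2500, 50, min_scale=0, max_scale=20))  # → 20
```

With `min_scale=0`, idle periods scale the function all the way down to zero instances, which is what makes the pay-as-you-go pricing below possible.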
No. Deploying a new version of your Serverless Function triggers a rolling update: the new version of the service is gradually rolled out to your users without downtime.
This process ensures a seamless update experience, minimizing user disruption during deployments. If needed, you can also manage traffic splitting between versions during the update process, allowing you to test new versions with a subset of traffic before fully migrating to them.
Yes, Serverless Functions resources can be changed at any time without causing downtime. See the previous question for full details.
Integration is straightforward. Serverless Functions and Containers can be triggered by events from Queues and from Topics and Events, and can easily communicate with services like Managed Databases or Serverless Databases. Serverless Jobs can pull data from Object Storage, or output processed results into a database. With managed connectors, APIs, and built-in integrations, linking to the broader Scaleway ecosystem is seamless.
Serverless Functions is billed on a pay-as-you-go basis. Three components are taken into account:
Monthly request number: a counter is incremented each time your function is invoked.
Resource consumption: obtained by multiplying the memory tier chosen by the duration of each function invocation.
Resource provisioning: to mitigate cold starts, you can keep some function instances warm by setting a minimum scale value. We then charge for the provisioned resources, similarly to the resource consumption component.
The pricing below illustrates our billing model:
Monthly requests: €0.15 per million requests, with 1M free requests per account per month.
Resource consumption: €1.20 per 100 000 GB-s, with a free tier of 400 000 GB-s per account per month.
Resource consumption (per memory tier):

Memory allocated | Cost per second |
---|---|
128 MB | €0.0000015 |
256 MB | €0.0000030 |
512 MB | €0.0000060 |
1024 MB | €0.0000120 |
2048 MB | €0.0000240 |
3072 MB | €0.0000360 |
4096 MB | €0.0000480 |
Resource provisioning (warm instances):

Memory provisioned | Cost per second |
---|---|
128 MB | €0.00000045 |
256 MB | €0.0000009 |
512 MB | €0.0000018 |
1024 MB | €0.0000036 |
2048 MB | €0.0000072 |
3072 MB | €0.0000108 |
4096 MB | €0.0000144 |
Example 1 (no provisioned instance):

Criteria | Value |
---|---|
Number of requests | 30 000 000 |
Average request duration | 1 s |
Allocated resources (memory) | 128 MB |
Free tier | Yes |
Provision/minimum instances | 0 |
Resource consumption:
30 000 000 requests × 1 s × 0.125 GB = 3 750 000 GB-s
3 750 000 GB-s − 400 000 GB-s free tier = 3 350 000 GB-s billed
3 350 000 GB-s × €1.20 / 100 000 GB-s = €40.20
Requests:
30 000 000 − 1 000 000 free requests = 29 000 000 billed requests
29 000 000 × €0.15 / 1 000 000 = €4.35
Total monthly cost: €44.55
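The arithmetic behind this example can be reproduced with a short script, using the prices and free tiers listed above:

```python
PRICE_PER_MILLION_REQUESTS = 0.15   # EUR
FREE_REQUESTS = 1_000_000
PRICE_PER_100K_GBS = 1.20           # EUR per 100 000 GB-s
FREE_GBS = 400_000

requests = 30_000_000
duration_s = 1
memory_gb = 128 / 1024              # 128 MB expressed in GB

# Resource consumption: GB-s used, minus the free tier
gbs = requests * duration_s * memory_gb                          # 3 750 000 GB-s
consumption = (gbs - FREE_GBS) / 100_000 * PRICE_PER_100K_GBS

# Requests, minus the free tier
request_cost = (requests - FREE_REQUESTS) / 1_000_000 * PRICE_PER_MILLION_REQUESTS

total = consumption + request_cost
print(f"€{consumption:.2f} + €{request_cost:.2f} = €{total:.2f}")  # → €40.20 + €4.35 = €44.55
```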
Example 2 (one provisioned instance):

Criteria | Value |
---|---|
Number of requests | 30 000 000 |
Average request duration | 1 s |
Allocated resources (memory) | 128 MB |
Free tier | Yes |
Provision/minimum instances | 1 |
Resource consumption:
3 350 000 GB-s billed (as in the previous example) × €1.20 / 100 000 GB-s = €40.20
Provisioned functions consumption:
1 instance × 128 MB provisioned for 2 592 000 s (30 days) × €0.00000045/s = €1.17
Requests:
29 000 000 billed requests × €0.15 / 1 000 000 = €4.35
Total monthly cost: €45.72
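The extra provisioning component can be sketched the same way (assuming a 30-day month and the 128 MB provisioned rate from the table above):

```python
SECONDS_PER_MONTH = 30 * 24 * 3600      # 2 592 000 s, assuming a 30-day month
PROVISIONED_PRICE_128MB = 0.00000045    # EUR per second for a warm 128 MB instance

# Consumption and request costs are identical to the previous example.
consumption = 40.20
request_cost = 4.35

# One instance kept warm for the whole month.
provisioned = 1 * SECONDS_PER_MONTH * PROVISIONED_PRICE_128MB   # ≈ €1.17

total = consumption + request_cost + provisioned
print(f"€{total:.2f}")  # → €45.72
```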
Insufficient vCPU and RAM resources can cause functions to enter an error state. Make sure to provision enough resources for your function.
We recommend starting with high values, using metrics to monitor your function's resource usage, and then adjusting the values accordingly.
Optimize the startup: Cold starts can be affected by loading a large number of dependencies and opening many resources at startup. Ensure that your code avoids heavy computations or long-running initialization at startup, and optimize the number of loaded libraries.
Keep your function warm: You can use CRON triggers at certain intervals to keep your function warm or set the min-scale parameter to one when required.
Increase resources: Adding more vCPU and RAM can significantly reduce your function's cold starts.
Use sandbox v2: We recommend using sandbox v2 (advanced settings) to reduce cold starts.
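To illustrate the first tip, here is a minimal Python sketch in which expensive initialization runs once at module scope, when the instance starts, instead of on every invocation (`load_model` is a hypothetical heavy setup step, not a Scaleway API):

```python
import time

def load_model():
    """Hypothetical expensive initialization (dependencies, connections, data)."""
    time.sleep(0.1)          # stand-in for slow setup work
    return {"ready": True}

# Runs once at import time, when the instance starts, not on every request.
MODEL = load_model()

def handle(event, context):
    """Handler invoked per request: reuses the already-initialized MODEL."""
    return {"statusCode": 200, "body": {"model_ready": MODEL["ready"]}}

print(handle({}, {}))  # → {'statusCode': 200, 'body': {'model_ready': True}}
```

Only the first request to a fresh instance pays the `load_model` cost; subsequent requests reuse the warm module state.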
Refer to our dedicated page about Serverless Functions limitations and configuration restrictions for more information.
Serverless Functions enables you to deploy functions using popular languages: Go, Node, Python, PHP, Rust…
Refer to our dedicated page about the Serverless Functions Runtimes Lifecycle for more information.
Some Serverless runtimes (e.g. Go, Rust) will compile your code in order to make your function executable.
Compilation can fail if errors are present in the code, such as syntax errors or missing libraries.
Build errors are sent to the Observability platform on Scaleway Cockpit.
In the Serverless Functions Logs dashboard, you can then read your build output and check whether errors occurred during compilation.
Serverless Functions does not yet support Private Networks. However, you can use the Scaleway IP ranges defined at https://www.scaleway.com/en/peering/ on Managed Databases and other products that allow IP filtering.
Serverless Functions covers a wide range of use cases, so several ways to deploy functions are available.
Scaleway provides libraries to run your functions locally, for debugging, profiling, and testing purposes. Refer to the dedicated documentation for more information.
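Because a Python function is ultimately just a handler, one simple complementary approach is to invoke it directly in a local script before deploying. The sketch below is generic, not an example of a specific Scaleway library; adapt the event shape to your runtime:

```python
import json

def handle(event, context):
    """A minimal handler that echoes a query parameter back as JSON."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello {name}"}),
    }

# Local smoke test: invoke the handler with a fabricated event, no deployment needed.
if __name__ == "__main__":
    response = handle({"queryStringParameters": {"name": "dev"}}, None)
    assert response["statusCode"] == 200
    print(response["body"])  # → {"message": "Hello dev"}
```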
Check out our serverless-examples repository for real-world projects.
There are no constraints when changing a function runtime, you simply need to choose the runtime version you want. Upgrading a runtime is highly recommended in case of deprecation, and for runtimes that have reached end-of-support or end-of-life. See the functions runtimes lifecycle documentation for more information.
Scaleway Serverless Functions does not currently support Scaleway VPC or Private Networks, though this feature is under development.
To add network restrictions on your resource, consult the list of prefixes used at Scaleway. Note that Serverless resources do not have dedicated or predictable IP addresses.
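For illustration, here is a standard-library sketch of the kind of prefix check that such IP filtering performs; the CIDR below is a documentation-only example range, not an actual Scaleway prefix:

```python
import ipaddress

# Hypothetical allow-list built from published prefixes (example value only).
ALLOWED_PREFIXES = [ipaddress.ip_network("198.51.100.0/24")]

def is_allowed(ip: str) -> bool:
    """True if the IP belongs to one of the allow-listed prefixes."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_PREFIXES)

print(is_allowed("198.51.100.7"))  # → True
print(is_allowed("192.0.2.1"))     # → False
```

Since Serverless resources have no dedicated IPs, filtering has to allow whole published ranges rather than a single address.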
Scaleway Serverless Functions do not currently support attaching Block Storage. These functions are designed to be stateless, meaning they do not retain data between invocations. For persistent storage, we recommend using external solutions like Scaleway Object Storage.
Currently, a new function instance will always start after each deployment, even if there is no traffic and the minimum scale is set to 0. This behavior is not configurable at this time.
Serverless resources are stateless by default; local storage is ephemeral.
For some use cases, such as saving analysis results or exporting data, it can be important to persist data. Serverless resources can be connected to other resources from the Scaleway ecosystem for this purpose:
Explore all Scaleway products in the console and select the right product for your use case.
Some products are not listed, but can help with specific use cases: for example, Secret Manager can help you store information that requires versioning.
You cannot use Serverless Functions with Edge Services because there are no native integrations between the two products yet.
By design, it is not possible to guarantee static IPs on Serverless compute resources.