Managed Inference
Effortlessly deploy AI models on a sovereign infrastructure, and manage and scale inference with full data privacy. Start now with a simple interface for creating dedicated endpoints.

Quickstart
Learn how to create, connect to, and delete a Managed Inference endpoint in a few steps.
View Quickstart

Concepts
Core concepts that give you a better understanding of Scaleway Managed Inference.
View Concepts

How-tos
Check our guides about creating and managing Managed Inference endpoints.
View How-tos

Additional content
Guides to help you choose a Managed Inference endpoint, understand pricing, and set up advanced configuration.
View additional content

Managed Inference API
Learn how to create and manage your Scaleway Managed Inference endpoints through the API.
Go to Managed Inference API
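If you prefer to script these steps rather than use the console, the minimal sketch below lists existing deployments over the HTTP API with Python and requests. It is an illustration under assumptions, not the reference client: the route and response fields shown should be checked against the Managed Inference API reference, and SCW_SECRET_KEY stands in for your own IAM secret key.

```python
import os

import requests

# Minimal sketch: list Managed Inference deployments over the HTTP API.
# The exact path is an assumption; check it against the Managed Inference API reference.
API_URL = "https://api.scaleway.com/inference/v1/regions/fr-par/deployments"

# Scaleway APIs authenticate with an IAM secret key sent in the X-Auth-Token header.
headers = {"X-Auth-Token": os.environ["SCW_SECRET_KEY"]}

response = requests.get(API_URL, headers=headers, timeout=30)
response.raise_for_status()

# Field names are illustrative; inspect the JSON payload for the actual schema.
for deployment in response.json().get("deployments", []):
    print(deployment.get("id"), deployment.get("name"), deployment.get("status"))
```

The same authentication pattern applies to the create, update, and delete calls documented in the API reference.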
Devstral Small 2505 is now available for deployment on Managed Inference.
Devstral is a fine-tune of Mistral Small 3.1, optimized for agentic software engineering tasks. Refer to the official documentation for more information, and try Devstral within your IDE.
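Once a Devstral endpoint is deployed, one way to try it from code, or from an IDE plugin that speaks the OpenAI protocol, is to point an OpenAI-compatible client at the endpoint. The sketch below assumes your deployment exposes an OpenAI-compatible chat completions route; the endpoint URL and model identifier are placeholders to replace with the values shown for your deployment.

```python
import os

from openai import OpenAI

# Sketch of querying a Managed Inference deployment through an OpenAI-compatible client.
# Replace base_url with the endpoint URL shown for your deployment in the console,
# and the model name with the identifier listed for that deployment.
client = OpenAI(
    base_url="https://<your-deployment-endpoint>/v1",
    api_key=os.environ["SCW_SECRET_KEY"],  # IAM API key authorized for the endpoint
)

response = client.chat.completions.create(
    model="devstral-small-2505",  # placeholder model identifier
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a linked list."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```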
Deploy the latest AI models from our model catalog to benefit from guaranteed performance and strong security within your VPC.
Custom models can now be deployed on Managed Inference.
Try it now by providing the Hugging Face URL of a compatible model.
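For scripted deployments, the sketch below shows what registering a custom model from its Hugging Face URL could look like over the HTTP API. Treat it strictly as a sketch: the route and every request field are assumptions to confirm against the Managed Inference API reference before use.

```python
import os

import requests

# Sketch only: the route and all field names below are assumptions to verify
# against the Managed Inference API reference.
API_URL = "https://api.scaleway.com/inference/v1/regions/fr-par/models"
headers = {"X-Auth-Token": os.environ["SCW_SECRET_KEY"]}

payload = {
    "project_id": os.environ["SCW_PROJECT_ID"],
    "name": "my-custom-model",  # illustrative name
    "source": {
        # The custom model is referenced by its Hugging Face URL.
        "url": "https://huggingface.co/<organization>/<model>",
        "secret": os.environ.get("HF_TOKEN", ""),  # only needed for gated or private models
    },
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```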
Visit our Help Center and find the answers to your most frequently asked questions.
Visit Help Center