Quickstart
Learn how to create, connect to, and delete a Managed Inference endpoint in a few steps.
View Quickstart

Effortlessly deploy AI models on sovereign infrastructure, and manage and scale inference with full data privacy. Start now with a simple interface for creating dedicated endpoints.

Concepts
Core concepts that give you a better understanding of Scaleway Managed Inference.
View Concepts

How-tos
Check our guides about creating and managing Managed Inference endpoints.
View How-tos

Additional content
Guides to help you choose a Managed Inference endpoint, understand pricing, and handle advanced configuration.
View additional content

API
Learn how to create and manage your Scaleway Managed Inference endpoints through the API.
Go to Managed Inference API

H100-SXM-2, H100-SXM-4, and H100-SXM-8 nodes are now available in Managed Inference.
Try them now to benefit from improved token generation speed (up to 2x on most models).
Devstral Small 2505 is now available for deployment on Managed Inference.
Devstral is a fine-tune of Mistral Small 3.1, optimized for agentic software engineering tasks. Refer to the official documentation for more information and try Devstral within your IDE.
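As an illustration of what connecting to a deployed model such as Devstral can look like, here is a minimal sketch that builds a request against an endpoint's OpenAI-compatible chat completions route. The endpoint URL, model identifier, and API key below are placeholders, not values from this page; substitute the ones shown in your endpoint's details.

```python
# Minimal sketch: querying a Managed Inference endpoint over an
# OpenAI-compatible chat completions route, using only the standard library.
# ENDPOINT_URL, MODEL, and API_KEY are placeholders -- replace them with
# the values from your own endpoint before sending anything.
import json
import urllib.request

ENDPOINT_URL = "https://<your-endpoint-id>.example/v1/chat/completions"  # placeholder
MODEL = "devstral-small-2505"  # placeholder model identifier
API_KEY = "<your-api-key>"  # placeholder


def build_request(prompt: str) -> urllib.request.Request:
    """Build the HTTP request; sending it requires a live endpoint."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return urllib.request.Request(
        ENDPOINT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_request("Write a short docstring for a retry helper.")
# With a live endpoint, send it with urllib.request.urlopen(req)
# and parse the JSON response body.
```

The request is only constructed here, not sent, so the sketch stays runnable without credentials; any OpenAI-compatible client library can replace the manual `urllib` call.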
Deploy the latest AI models from our model catalog to benefit from guaranteed performance and strong security within your VPC.
Visit our Help Center and find the answers to your most frequent questions.
Visit Help Center