Integrating Scaleway Generative APIs with popular AI tools

Reviewed on 18 February 2025 | Published on 18 February 2025

Scaleway’s Generative APIs are designed to provide easy access to the latest AI models and techniques. Our APIs are built on top of a robust infrastructure that ensures scalability, reliability, and security. With our APIs, you can integrate AI capabilities into your applications, such as text generation, image classification, and more.

Comparison of AI tools and libraries

The following table compares AI tools and libraries supported by Scaleway’s Generative APIs:

| Tool/Library | Description | Use cases | Integration effort |
|---|---|---|---|
| OpenAI | Popular AI library for natural language processing | Text generation, language translation, text summarization | Low |
| LangChain | Library for building AI applications | Inference, embeddings, document indexing and retrieval | Medium |
| LlamaIndex | Library for indexing and retrieving documents using AI models | Document indexing and retrieval, question answering | Medium |
| Continue Dev | AI-powered coding assistance | Code completion, code review | Low |
| Transformers (Hugging Face) | Library for pre-trained models for natural language processing | Text generation, language translation, text summarization | Medium |
| cURL/Python | Direct API clients for custom integrations | Custom applications, data processing | High |
Note

The integration effort is subjective and may vary depending on the specific use case and requirements.

OpenAI-compatible libraries

Scaleway Generative APIs follow OpenAI’s API structure, making integration straightforward. To get started, you’ll need to install the OpenAI library and set up your API key.

Configuration

To use the OpenAI library with Scaleway’s Generative APIs, you’ll need to set the API key and base URL in your OpenAI-compatible client:

from openai import OpenAI

# Configure an OpenAI-compatible client against Scaleway's endpoint
client = OpenAI(
    api_key="<API secret key>",
    base_url="https://api.scaleway.ai/v1",
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Tell me a joke about AI"}],
)
print(response.choices[0].message.content)
Tip

Make sure to replace <API secret key> with your actual API key.

Using OpenAI for text generation

To generate text, call the chat.completions.create method on the client configured above:

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Tell me a joke about AI"}],
)
print(response.choices[0].message.content)
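
The same endpoint also supports streaming. Below is a minimal sketch reusing the client configured above; it assumes streaming is available for the model you use:

# Stream the completion token by token instead of waiting for the full response
stream = client.chat.completions.create(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Tell me a joke about AI"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)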

LangChain (RAG & LLM applications)

LangChain is a popular library for building AI applications. Scaleway’s Generative APIs support LangChain for both inference and embeddings.

Configuration

To use LangChain with Scaleway’s Generative APIs, you’ll need to install the required dependencies:

pip install langchain langchain_openai langchain_postgres psycopg2

Next, set up the API connection:

from langchain_openai import OpenAIEmbeddings, ChatOpenAI
import os
os.environ["OPENAI_API_KEY"] = "<API secret key>"
os.environ["OPENAI_API_BASE"] = "https://api.scaleway.ai/v1"
llm = ChatOpenAI(model="llama-3.1-8b-instruct")
embeddings = OpenAIEmbeddings(model="bge-multilingual-gemma2")
Tip

Make sure to replace <API secret key> with your actual API key.

Using LangChain for inference

To run inference with LangChain, call the invoke method on the ChatOpenAI object:

response = llm.invoke("What is the capital of France?")
print(response.content)
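
You can also compose the model with a prompt template using LangChain’s expression language. A minimal sketch; the prompt wording and variable name are illustrative:

from langchain_core.prompts import ChatPromptTemplate

# Build a simple prompt -> model chain and invoke it with a templated question
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant."),
    ("human", "{question}"),
])
chain = prompt | llm
print(chain.invoke({"question": "What is the capital of France?"}).content)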

Using LangChain for embeddings

To compute embeddings with LangChain, call the embed_query method on the OpenAIEmbeddings object:

embeddings = OpenAIEmbeddings(
    model="bge-multilingual-gemma2",
    check_embedding_ctx_length=False,  # send raw text to the backend; assumed necessary for non-OpenAI endpoints
)
text = "This is an example sentence."
embedding = embeddings.embed_query(text)
print(embedding)
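
To embed several texts in one call, use the embed_documents method, which returns one vector per input string (the example sentences are illustrative):

# Embed a batch of documents; the result is a list of embedding vectors
docs = ["First example sentence.", "Second example sentence."]
vectors = embeddings.embed_documents(docs)
print(len(vectors), len(vectors[0]))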

LlamaIndex (document indexing & retrieval)

LlamaIndex is a library for indexing and retrieving documents using AI models. Scaleway’s Generative APIs support LlamaIndex for document indexing and retrieval.

Configuration

To use LlamaIndex with Scaleway’s Generative APIs, you’ll need to install the required dependencies:

pip install llama-index

Next, set up the embedding model:

from llama_index.embeddings.openai import OpenAIEmbedding

embed_model = OpenAIEmbedding(
    api_key="<API secret key>",
    api_base="https://api.scaleway.ai/v1",
    model="bge-multilingual-gemma2",
)
Tip

Make sure to replace <API secret key> with your actual API key.

Indexing documents

To index documents with LlamaIndex, load them with SimpleDirectoryReader and build a VectorStoreIndex from them:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, embed_model=embed_model)
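
The index can also be persisted to disk and reloaded in a later session. A minimal sketch; the storage directory name is arbitrary:

from llama_index.core import StorageContext, load_index_from_storage

# Save the index locally, then load it back with the same embedding model
index.storage_context.persist(persist_dir="./index_storage")
storage_context = StorageContext.from_defaults(persist_dir="./index_storage")
index = load_index_from_storage(storage_context, embed_model=embed_model)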

Retrieving documents

To retrieve documents with LlamaIndex, create a query engine from the index and call its query method:

query_engine = index.as_query_engine()
response = query_engine.query("Summarize this document")
print(response)
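
The query step also needs an LLM to synthesize the final answer. Below is a minimal sketch pointing it at a Scaleway-hosted model; it assumes the llama-index-llms-openai-like package is installed and mirrors the embedding configuration above:

from llama_index.llms.openai_like import OpenAILike

# Point the LLM wrapper at Scaleway's OpenAI-compatible endpoint
llm = OpenAILike(
    model="llama-3.1-8b-instruct",
    api_base="https://api.scaleway.ai/v1",
    api_key="<API secret key>",
    is_chat_model=True,
)
query_engine = index.as_query_engine(llm=llm)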

Continue Dev (AI coding assistance)

Continue Dev is an AI coding assistant that runs as an IDE extension. Scaleway’s Generative APIs support Continue Dev for code completion and more.

Tip

Refer to our dedicated documentation for:

  • Integrating Continue Dev with Visual Studio Code
  • Integrating Continue Dev with IntelliJ IDEA

Configuration

To use Continue Dev with Scaleway’s Generative APIs, add Scaleway as a provider in your Continue configuration file (config.json in the .continue directory):

{
  "models": [
    {
      "title": "Qwen2.5-Coder-32B-Instruct",
      "provider": "scaleway",
      "model": "qwen2.5-coder-32b-instruct",
      "apiKey": "<API secret key>"
    }
  ],
  "embeddingsProvider": {
    "provider": "scaleway",
    "model": "bge-multilingual-gemma2",
    "apiKey": "<API secret key>"
  }
}
Tip

Make sure to replace <API secret key> with your actual API key.

Transformers (Hugging Face integration)

Hugging Face’s transformers library provides a range of pre-trained, open-weights models for natural language processing, and several of these models are served by Scaleway’s Generative APIs behind an OpenAI-compatible endpoint. Because the transformers pipeline runs models locally rather than calling a remote API, the hosted models are reached through Hugging Face’s huggingface_hub client instead.

Configuration

To use Hugging Face tooling with Scaleway’s Generative APIs, install the huggingface_hub library and point its InferenceClient at the OpenAI-compatible endpoint:

pip install huggingface_hub

Next, set up the client:

from huggingface_hub import InferenceClient

# InferenceClient can target an OpenAI-compatible chat completions endpoint
client = InferenceClient(
    base_url="https://api.scaleway.ai/v1",
    api_key="<API secret key>",
)
Tip

Make sure to replace <API secret key> with your actual API key.

Using Hugging Face for text generation

To generate text, call the chat_completion method on the client:

response = client.chat_completion(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Write a short poem about the ocean"}],
)
print(response.choices[0].message.content)
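
If you want to use the transformers library itself alongside the API, a common pattern is loading a model’s tokenizer locally, for example to estimate prompt length before sending a request. A minimal sketch; the model ID placeholder must be replaced with a real Hugging Face repository (some require accepting a license first):

from transformers import AutoTokenizer

# Count tokens locally before calling the hosted model (model ID is a placeholder)
tokenizer = AutoTokenizer.from_pretrained("<hugging-face-model-id>")
prompt = "Write a short poem about the ocean"
print(len(tokenizer.encode(prompt)), "tokens")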

API clients and custom integrations

You can interact with Scaleway’s Generative APIs directly using any HTTP client.

cURL example

To use cURL with Scaleway’s Generative APIs, you can use the following command:

curl https://api.scaleway.ai/v1/chat/completions \
  -H "Authorization: Bearer <API secret key>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "What is quantum computing?"}]
  }'
Tip

Make sure to replace <API secret key> with your actual API key.

Python example

To use Python with Scaleway’s Generative APIs, you can use the following code:

import requests

headers = {
    "Authorization": "Bearer <API secret key>",
    "Content-Type": "application/json",
}
data = {
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Explain black holes"}],
}
response = requests.post("https://api.scaleway.ai/v1/chat/completions", json=data, headers=headers)
print(response.json()["choices"][0]["message"]["content"])
Tip

Make sure to replace <API secret key> with your actual API key.
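
Additional OpenAI-compatible parameters such as temperature and max_tokens can be sent in the same payload. A minimal sketch reusing the headers defined above; the values are illustrative:

data = {
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Explain black holes"}],
    "temperature": 0.7,  # sampling temperature (illustrative value)
    "max_tokens": 512,   # cap on generated tokens (illustrative value)
}
response = requests.post("https://api.scaleway.ai/v1/chat/completions", json=data, headers=headers)
print(response.json()["choices"][0]["message"]["content"])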
