AI Coding Agents

Run OpenCode agents with any inference provider

OpenCode is an open-source AI coding agent from SST that works with multiple inference providers. With SAM, you can run OpenCode on your cloud VMs using Scaleway, Google Vertex AI, Anthropic, or any OpenAI-compatible API as the backend.

Why use OpenCode with SAM

Provider Flexibility

Use Scaleway, Google Vertex AI, Anthropic, or any OpenAI-compatible inference endpoint, including one you host yourself.

Open Source

OpenCode is fully open source (from SST), so you can audit, modify, and extend the agent to fit your needs.

Lightweight Footprint

Minimal resource requirements mean you can run more OpenCode instances on smaller VMs.

Scaleway Native

First-class integration with Scaleway's inference API — a natural pairing with SAM's Scaleway cloud provider support.


Get started in four steps

Step 1

Configure Provider

Set your inference provider credentials — Scaleway API key, Google Vertex config, or a custom endpoint.

Step 2

Select OpenCode

Choose OpenCode as the agent for your project.

Step 3

Submit a Task

Describe your coding task. SAM provisions a VM and launches OpenCode.

Step 4

Agent Delivers

OpenCode completes the work, and SAM commits and pushes the changes.
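Step 1 comes down to telling OpenCode which backend to talk to. The sketch below shows the general shape of an OpenAI-compatible provider entry; the file name, keys, provider name, and environment-variable placeholder are illustrative assumptions, so check OpenCode's configuration documentation for the exact schema before copying it.

```json
{
  "provider": {
    "my-provider": {
      "options": {
        "baseURL": "https://api.example.com/v1",
        "apiKey": "{env:MY_PROVIDER_API_KEY}"
      },
      "models": {
        "my-model": {}
      }
    }
  }
}
```

Keeping the API key in an environment variable rather than in the file itself means the same config can be committed and reused across VMs.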


What you can build

Self-hosted AI stack

Pair OpenCode with a self-hosted inference endpoint for a fully private AI coding setup — no data leaves your infrastructure.

Scaleway-native development

Use Scaleway for both compute (VMs) and inference (Scaleway AI) — a unified European cloud stack.

Custom model experimentation

Test different models and providers by swapping OpenCode's inference backend without changing your workflow.
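Swapping backends is cheap because every OpenAI-compatible provider accepts the same chat-completions request shape; only the base URL and model name change. A minimal sketch in Python (the provider URLs and model names below are hypothetical placeholders, not verified endpoints):

```python
import json

# Hypothetical backends -- URLs and model names are placeholders,
# not verified endpoints for any real provider.
PROVIDERS = {
    "scaleway": {"base_url": "https://api.scaleway.example/v1", "model": "some-model"},
    "self-hosted": {"base_url": "http://localhost:8000/v1", "model": "my-finetune"},
}

def chat_request(provider: str, prompt: str) -> tuple[str, str]:
    """Build the URL and JSON body of an OpenAI-style chat completion.

    The request shape is identical for every provider; swapping the
    backend changes only the base URL and the model name.
    """
    cfg = PROVIDERS[provider]
    url = cfg["base_url"].rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": cfg["model"],
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request("self-hosted", "Refactor this function")
print(url)  # http://localhost:8000/v1/chat/completions
```

Because the request builder never changes, experiments reduce to editing the `PROVIDERS` table.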


Frequently asked questions

What inference providers work with OpenCode?

OpenCode supports Scaleway, Google Vertex AI, Anthropic, and any OpenAI-compatible endpoint. Configure the provider in your project settings.

Is OpenCode the same as OpenAI Codex?

No. OpenCode is a separate open-source project from SST, unrelated to OpenAI's Codex despite the similar name. It is its own agent and supports multiple AI providers, including but not limited to OpenAI-compatible APIs.

Can I use my own fine-tuned models?

Yes — if your model is served via an OpenAI-compatible API endpoint, OpenCode can use it. Point the inference URL to your custom endpoint.
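In practice, "point the inference URL to your custom endpoint" means every API path resolves against your own base URL. A small sketch, assuming a hypothetical `CUSTOM_BASE_URL` setting (OpenCode's actual configuration key may differ; consult its documentation):

```python
import os
from urllib.parse import urljoin

# Hypothetical variable name -- check OpenCode's documentation for the
# configuration keys it actually reads.
os.environ["CUSTOM_BASE_URL"] = "http://models.internal:8080/v1/"

def endpoint(path: str) -> str:
    """Resolve an API path against the custom inference base URL."""
    base = os.environ["CUSTOM_BASE_URL"]
    return urljoin(base, path.lstrip("/"))

print(endpoint("chat/completions"))
# http://models.internal:8080/v1/chat/completions
```

As long as the server behind that base URL speaks the OpenAI chat-completions protocol, the fine-tuned model behaves like any other provider.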


Start running OpenCode on your infrastructure

Self-host on Cloudflare's free tier. Bring your own cloud. Your agents, your code.