AI Coding Agents

Run OpenAI Codex agents in cloud workspaces

OpenAI Codex is a cloud-native coding agent that writes and executes code in a sandboxed environment. With SAM, you can run Codex on your own infrastructure — keeping your code on your VMs while leveraging OpenAI's latest models.

GPT-5 · GPT-4.1 · o3 · o4-mini

Why use Codex with SAM

Your Infrastructure

Codex runs on your cloud VMs instead of OpenAI's sandbox — giving you full control over compute, data residency, and costs.

OAuth Token Management

SAM manages Codex OAuth token refresh automatically — including concurrent session handling to prevent token race conditions.

Full GPT Model Access

Use the latest GPT models including GPT-5, o3, and o4-mini for different coding tasks.

Devcontainer Integration

Codex works inside your repo's devcontainer — no separate Docker setup needed.


Get started in four steps

Step 1

Connect Your Account

Add your OpenAI API key or Codex OAuth credentials in project settings.
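As a minimal sketch of how a workspace might pick up those credentials (the environment-variable names `OPENAI_API_KEY` and `CODEX_OAUTH_TOKEN` are assumptions for illustration, not a documented SAM contract):

```python
import os

def load_codex_credentials() -> dict:
    """Return whichever credential is configured.

    OPENAI_API_KEY and CODEX_OAUTH_TOKEN are assumed names used
    here for illustration; check your project settings for the
    actual mechanism.
    """
    api_key = os.environ.get("OPENAI_API_KEY")
    oauth_token = os.environ.get("CODEX_OAUTH_TOKEN")
    if api_key:
        return {"method": "api_key", "value": api_key}
    if oauth_token:
        return {"method": "oauth", "value": oauth_token}
    raise RuntimeError("No OpenAI credentials configured")
```

Either method works; the API key takes precedence in this sketch simply because it is checked first.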

Step 2

Choose Codex as Agent

Select OpenAI Codex from the agent dropdown when creating a project or submitting a task.

Step 3

Submit Tasks

Describe your coding task in natural language. SAM provisions a VM and runs Codex.
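A task submission might look roughly like the following sketch. The `TaskRequest` shape and its field names are illustrative assumptions, not SAM's actual API:

```python
from dataclasses import dataclass

@dataclass
class TaskRequest:
    """Illustrative task payload; all field names are assumptions."""
    repo: str
    prompt: str           # natural-language description of the work
    agent: str = "codex"  # select OpenAI Codex as the agent
    model: str = "gpt-5"  # configurable per project

# A natural-language task against a hypothetical repository.
task = TaskRequest(
    repo="github.com/acme/api-server",
    prompt="Add pagination to the /users endpoint",
)
```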

Step 4

Get Results

Codex completes the work, and SAM pushes the changes to a branch and opens a pull request.


What you can build

Code generation at scale

Use Codex to generate boilerplate, API clients, or data models across multiple repositories simultaneously.

Bug fix triage

Submit multiple bug reports as tasks and let Codex investigate and fix each one in parallel.
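The fan-out pattern can be sketched with a thread pool; `submit_task` below is a stand-in for whatever client call actually submits a task to SAM:

```python
from concurrent.futures import ThreadPoolExecutor

def submit_task(bug_report: str) -> str:
    # Stand-in for the real submission call; returns a fake task id.
    return f"task-{abs(hash(bug_report)) % 10000}"

bug_reports = [
    "Login fails with 500 when password contains a quote",
    "CSV export drops the header row",
    "Timezone offset is wrong in the audit log",
]

# Each bug report becomes its own task; SAM runs one Codex
# instance per task, so the investigations proceed in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    task_ids = list(pool.map(submit_task, bug_reports))
```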

Migration assistance

Let Codex handle framework migrations, dependency updates, or language version upgrades across your codebase.


Frequently asked questions

Do I need a ChatGPT Pro subscription?

No. SAM supports both authentication methods: you can use either an OpenAI API key or a ChatGPT Pro/Plus OAuth token.

How does SAM handle Codex token refresh?

SAM includes a centralized refresh proxy that serializes token rotation per user, preventing race conditions when multiple Codex instances run concurrently.

What models can Codex use?

Codex supports GPT-5 variants (5.4, 5.3, 5.2, 5.1), GPT-4.1, o3, and o4-mini. You can configure the model per project.


Start running Codex on your infrastructure

Self-host on Cloudflare's free tier. Bring your own cloud. Your agents, your code.