Claude Opus 4.6 is now available on the DigitalOcean Gradient™ AI Platform via Serverless Inference—giving teams access to Anthropic’s most capable model on a platform built to run inference reliably at scale.
With a 1M-token context window, adaptive reasoning, and advanced agentic coding, Claude Opus 4.6 enables teams to analyze huge datasets, refactor entire codebases, and generate high-quality outputs in a single pass. It's also optimized for everyday knowledge work, including reports, spreadsheets, and presentations. Key use cases include:
Agentic coding & software development: Plan, debug, and iterate across large codebases; perform root cause analysis; handle multilingual coding and cybersecurity tasks.
Knowledge work & research: Analyze financial data, run research, and manage multi-step tasks in documents, spreadsheets, and presentations.
Agentic automation: Coordinate multiple AI agents for parallel, read-heavy, or long-running tasks; summarize large contexts and make adaptive reasoning decisions.
Information retrieval & long-context reasoning: Retrieve hard-to-find details across vast datasets and reason over hundreds of thousands of tokens.
Office productivity: Generate structured reports, spreadsheets, and presentation decks; ingest unstructured data and produce polished outputs in one pass.
Claude Opus 4.6 runs natively inside your existing DigitalOcean environment—alongside your applications, data, networking, and storage—so inference becomes part of your stack, not another system to integrate or operate.
There are no separate model contracts, vendor accounts, or billing surfaces to manage. Usage is billed predictably alongside your other DigitalOcean services, and inference is fully managed by default, so you can start running Opus 4.6 quickly without provisioning or tuning infrastructure.
Safe defaults are built in from the start: Opus 4.6 runs within your DigitalOcean project with security-hardened settings, reducing exposure and operational risk as workloads scale.
The result: you can build, deploy, and scale AI applications with Opus 4.6 using App Platform, Kubernetes, Managed Databases, and storage—all in the same environment, with fewer moving parts and less overhead.
Opus 4.6 is available on DigitalOcean Serverless Inference, so there’s no infrastructure to provision or manage. Authenticate with your model access key and see a response immediately using the curl request below.
curl https://inference.do-ai.run/v1/chat/completions \
  -H "Authorization: Bearer YOUR_MODEL_ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic-claude-opus-4.6",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ],
    "temperature": 0.7,
    "max_tokens": 1000
  }'
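Because the request above follows the OpenAI-compatible chat completions format, you can also call the same endpoint from application code. The sketch below assumes the OpenAI Python SDK works against the Serverless Inference base URL shown in the curl example; the MODEL_ACCESS_KEY environment variable name is an illustrative choice, not a platform requirement.

# Minimal sketch, assuming the Serverless Inference endpoint accepts
# OpenAI-compatible chat completion requests (as the curl example suggests).
import os

from openai import OpenAI

# MODEL_ACCESS_KEY is an illustrative environment variable name; set it to
# the same model access key used in the curl request above.
client = OpenAI(
    base_url="https://inference.do-ai.run/v1",
    api_key=os.environ["MODEL_ACCESS_KEY"],
)

response = client.chat.completions.create(
    model="anthropic-claude-opus-4.6",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    temperature=0.7,
    max_tokens=1000,
)

# Print the assistant's reply from the first choice in the response.
print(response.choices[0].message.content)

If you prefer plain HTTP, any client that can POST JSON with the Authorization header shown above will work the same way.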