Kyma supports the Anthropic Messages API format. If your app uses the Anthropic SDK, just change the base URL and API key.
Python
```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://kymaapi.com",
    api_key="kyma-your-api-key",
)

message = client.messages.create(
    model="llama-3.3-70b",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello! What can you do?"}
    ],
)

print(message.content[0].text)
```
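`message.content` is a list of content blocks rather than a plain string, which is why the example reads `content[0].text`. A minimal sketch of that response shape as a raw dict (the reply text is invented for illustration):

```python
# Mirrors the Messages API response shape: "content" is a list of
# typed blocks. The reply wording here is made up for illustration.
response = {
    "role": "assistant",
    "content": [
        {"type": "text", "text": "Hello! I can help you write and debug code."}
    ],
}

# Join all text blocks rather than assuming there is exactly one.
reply = "".join(
    block["text"] for block in response["content"] if block["type"] == "text"
)
print(reply)
```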
JavaScript / TypeScript
```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "https://kymaapi.com",
  apiKey: "kyma-your-api-key",
});

const message = await client.messages.create({
  model: "qwen-3-32b",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "Write a haiku about coding" },
  ],
});

console.log(message.content[0].text);
```
With system prompt
```python
message = client.messages.create(
    model="llama-3.3-70b",
    max_tokens=1024,
    system="You are a helpful coding assistant.",
    messages=[
        {"role": "user", "content": "Write a Python fibonacci function"}
    ],
)
```
Streaming
```python
with client.messages.stream(
    model="llama-3.3-70b",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Tell me a story"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```
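`text_stream` yields text deltas as they arrive; concatenating them reproduces the complete reply. A sketch with a simulated stream so it runs without an API call (`fake_text_stream` is a stand-in for illustration, not part of the SDK):

```python
def fake_text_stream():
    # Stand-in for stream.text_stream: yields text deltas in order.
    yield from ["Once upon ", "a time, ", "there was a bug."]

chunks = []
for text in fake_text_stream():
    chunks.append(text)  # a real app might print each chunk as it arrives

full_reply = "".join(chunks)
print(full_reply)
```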
Switching from Claude to Kyma
If you’re already using Claude, the only changes needed are:
```python
# Before (Claude)
client = anthropic.Anthropic()  # uses ANTHROPIC_API_KEY

# After (Kyma)
client = anthropic.Anthropic(
    base_url="https://kymaapi.com",
    api_key="kyma-your-key",
)
```
Then change the model name:
| Claude Model | Kyma Equivalent | Notes |
|---|---|---|
| claude-sonnet-4 | llama-3.3-70b | Best all-rounder |
| claude-haiku-3.5 | llama-3.1-8b | Fast, lightweight |
| claude-opus-4 | qwen-3-235b-cerebras | Highest quality |
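The table above can be expressed as a lookup, which is handy when migrating code that selects models by name (`to_kyma_model` is a hypothetical helper, not part of any SDK):

```python
# Claude model name -> closest Kyma equivalent (from the table above).
CLAUDE_TO_KYMA = {
    "claude-sonnet-4": "llama-3.3-70b",       # best all-rounder
    "claude-haiku-3.5": "llama-3.1-8b",       # fast, lightweight
    "claude-opus-4": "qwen-3-235b-cerebras",  # highest quality
}

def to_kyma_model(claude_model: str) -> str:
    # Fall back to the all-rounder for unrecognized names
    # (a design choice for this sketch, not Kyma behavior).
    return CLAUDE_TO_KYMA.get(claude_model, "llama-3.3-70b")

print(to_kyma_model("claude-haiku-3.5"))
```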
Kyma models are open source and free. They are not drop-in identical to Claude in behavior, but they perform well on most everyday tasks.