How it works

Kyma forwards your tools and tool_choice parameters directly to the model. Same format as OpenAI. No changes needed.

Example

from openai import OpenAI

client = OpenAI(
    base_url="https://kymaapi.com/v1",
    api_key="ky-your-api-key"
)

response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"}
                },
                "required": ["location"]
            }
        }
    }],
    tool_choice="auto"
)

message = response.choices[0].message
if message.tool_calls:
    for call in message.tool_calls:
        print(f"Call: {call.function.name}({call.function.arguments})")

The response looks like this:

{
  "choices": [{
    "message": {
      "role": "assistant",
      "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"location\": \"Tokyo\"}"
        }
      }]
    },
    "finish_reason": "tool_calls"
  }]
}
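One detail worth noting about this payload: function.arguments arrives as a JSON-encoded string, not a parsed object, so decode it before use. A minimal sketch, using a sample dict that mirrors the response above rather than a live API call:

```python
import json

# Sample payload mirroring the response shown above (not a live call).
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "tool_calls": [{
                "id": "call_abc123",
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "arguments": "{\"location\": \"Tokyo\"}"
                }
            }]
        },
        "finish_reason": "tool_calls"
    }]
}

for call in response["choices"][0]["message"]["tool_calls"]:
    # `arguments` is a JSON string, not a dict: decode it first.
    args = json.loads(call["function"]["arguments"])
    print(call["function"]["name"], args)  # → get_weather {'location': 'Tokyo'}
```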

Which models support tool calling?

These models support tool calling:

Llama 3.3 70B
Qwen 3 32B
Qwen 3 Coder
Kimi K2.5 (best for agentic use)
GPT-OSS 120B
Gemini 2.5 Flash
Gemini 3 Flash
Gemma 4 31B
DeepSeek V3
DeepSeek R1
MiniMax M2.5
Most active models on Kyma support tool calling. Check the supports_tools field returned by /v1/models for the current list.
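Filtering that endpoint's output down to tool-capable models is straightforward. A sketch, assuming the response uses the OpenAI-style "data" wrapper; the model entries below are made-up sample data, but supports_tools is the field the docs describe:

```python
# Hypothetical /v1/models payload. The "data" wrapper is an OpenAI-API
# convention and the entries are illustrative; supports_tools is the
# per-model flag described above.
models_payload = {
    "data": [
        {"id": "llama-3.3-70b", "supports_tools": True},
        {"id": "kimi-k2.5", "supports_tools": True},
        {"id": "some-embedding-model", "supports_tools": False},
    ]
}

# Keep only models that advertise tool support.
tool_models = [m["id"] for m in models_payload["data"] if m.get("supports_tools")]
print(tool_models)  # → ['llama-3.3-70b', 'kimi-k2.5']
```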

Multi-turn tool use

After receiving a tool call, send the result back:
messages = [
    {"role": "user", "content": "What's the weather in Tokyo?"},
    {"role": "assistant", "tool_calls": [...]},  # from previous response
    {"role": "tool", "tool_call_id": "call_abc123", "content": "22°C, sunny"}
]

response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=messages,
    tools=tools
)
# Model now responds using the weather data
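The dispatch step between the two requests — executing each tool call locally and building the "tool" messages to send back — can be sketched like this. get_weather here is a local stub, and the helper names are illustrative, not part of the Kyma API:

```python
import json

def get_weather(location):
    # Local stub standing in for a real weather lookup.
    return f"22°C, sunny in {location}"

# Map tool names to local functions so dispatch stays data-driven.
TOOLS = {"get_weather": get_weather}

def run_tool_calls(tool_calls):
    """Execute each tool call and build the 'tool' messages to send back."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": fn(**args),
        })
    return results

# Shaped like the tool_calls from the earlier response example.
calls = [{
    "id": "call_abc123",
    "type": "function",
    "function": {"name": "get_weather", "arguments": "{\"location\": \"Tokyo\"}"},
}]
print(run_tool_calls(calls))
```

The resulting list is appended to messages before the follow-up request, exactly as in the snippet above.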

Works with agent frameworks

Kyma works with LangChain, CrewAI, AutoGen, and any framework that uses OpenAI-style tool calling.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://kymaapi.com/v1",
    api_key="ky-your-api-key",
    model="kimi-k2.5"  # best for agentic tasks
)

# Use with LangChain tools, agents, chains as normal