Install

npm install @kyma-api/ai-sdk ai

Setup

import { kyma } from "@kyma-api/ai-sdk";

// Uses KYMA_API_KEY env var automatically
const model = kyma("qwen-3.6-plus");

Or with custom config:

import { createKyma } from "@kyma-api/ai-sdk";

const kyma = createKyma({ apiKey: "ky-your-api-key" });
const model = kyma("deepseek-v3");

Generate Text

import { generateText } from "ai";
import { kyma } from "@kyma-api/ai-sdk";

const { text } = await generateText({
  model: kyma("deepseek-v3"),
  prompt: "Explain quantum computing in one paragraph.",
});
console.log(text);
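
generateText also accepts the AI SDK's standard generation options. A sketch with a system prompt and sampling controls (the parameter values are illustrative, not Kyma defaults):

```typescript
import { generateText } from "ai";
import { kyma } from "@kyma-api/ai-sdk";

const { text, usage } = await generateText({
  model: kyma("deepseek-v3"),
  system: "You are a concise technical writer.",
  prompt: "Explain quantum computing in one paragraph.",
  temperature: 0.3,     // lower = more deterministic output
  maxOutputTokens: 300, // cap the response length
});

console.log(text);
console.log(`Tokens used: ${usage.totalTokens}`);
```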

Stream Text

import { streamText } from "ai";
import { kyma } from "@kyma-api/ai-sdk";

const result = streamText({
  model: kyma("qwen-3.6-plus"),
  prompt: "Write a short story about a robot.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
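
Besides `textStream`, the streamText result exposes promises that resolve once the stream finishes, which is useful when you want both live output and the aggregated result. A sketch using the AI SDK's `result.text` and `result.usage`:

```typescript
import { streamText } from "ai";
import { kyma } from "@kyma-api/ai-sdk";

const result = streamText({
  model: kyma("qwen-3.6-plus"),
  prompt: "Write a short story about a robot.",
});

// Stream chunks to the terminal as they arrive...
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

// ...then read the aggregated result once the stream has completed.
const fullText = await result.text;
const usage = await result.usage;
console.log(`\n\n${usage.totalTokens} tokens, ${fullText.length} characters`);
```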

Chat UI (Next.js)

API Route

// app/api/chat/route.ts
import { streamText, convertToModelMessages, type UIMessage } from "ai";
import { kyma } from "@kyma-api/ai-sdk";

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();
  const result = streamText({
    model: kyma("qwen-3.6-plus"),
    // useChat sends UI messages; convert them to model messages first.
    messages: convertToModelMessages(messages),
  });
  return result.toUIMessageStreamResponse();
}

Client Component

"use client";
import { useChat } from "ai/react";

export default function Chat() {
  const { messages, input, sendMessage, status } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={(e) => { e.preventDefault(); sendMessage({ text: input }); }}>
        <input value={input} placeholder="Type a message..." />
        <button type="submit" disabled={status === "streaming"}>Send</button>
      </form>
    </div>
  );
}
See the Next.js Chatbot Starter for a complete, deployable example.

Model Aliases

Use aliases instead of full model IDs:
kyma("best")         // → qwen-3.6-plus
kyma("code")         // → qwen-3-coder
kyma("fast")         // → qwen-3-32b
kyma("reasoning")    // → deepseek-r1
kyma("vision")       // → gemma-4-31b
kyma("agent")        // → kimi-k2.5
kyma("long-context") // → gemini-2.5-flash

See all aliases.
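
Aliases are resolved by Kyma, not in your code, so they always point at the current recommended model. Purely as an illustration of the table above (the `resolveAlias` helper is hypothetical, not part of the SDK, and the pass-through behavior for unknown names is an assumption):

```typescript
// Hypothetical mirror of the alias table above; real resolution
// happens on Kyma's servers, not client-side.
const ALIASES: Record<string, string> = {
  best: "qwen-3.6-plus",
  code: "qwen-3-coder",
  fast: "qwen-3-32b",
  reasoning: "deepseek-r1",
  vision: "gemma-4-31b",
  agent: "kimi-k2.5",
  "long-context": "gemini-2.5-flash",
};

// Names that aren't aliases pass through unchanged,
// so full model IDs keep working.
function resolveAlias(id: string): string {
  return ALIASES[id] ?? id;
}

console.log(resolveAlias("best")); // "qwen-3.6-plus"
console.log(resolveAlias("deepseek-v3")); // "deepseek-v3"
```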

Models

Any active Kyma model in /v1/models works with the AI SDK:
kyma("deepseek-v3")       // Best value
kyma("deepseek-r1")       // Reasoning
kyma("gemini-2.5-flash")  // 1M context
kyma("qwen-3-32b")        // Fast coding
kyma("llama-3.3-70b")     // All-around