# Function calling

OpenAI-standard tool calling, with identical syntax across every SeaLink model: one code path for Claude / GPT / Qwen / Doubao / Kimi / DeepSeek / GLM.
## Example

**Python**

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["c", "f"]},
            },
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="claude-sonnet-4-6",  # or gpt-4o, qwen3-max, doubao-1-5-pro
    messages=[{"role": "user", "content": "Weather in Bangkok in C?"}],
    tools=tools,
    tool_choice="auto",
)

call = resp.choices[0].message.tool_calls[0]
print(call.function.name, call.function.arguments)
# get_weather {"city": "Bangkok", "unit": "c"}
```
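The response above stops at the model's tool call; your code is responsible for executing it and sending the result back as a `role: "tool"` message before making a second `create` call. A minimal dispatch sketch (the local `get_weather` implementation, its canned result, and the `call_123` id are illustrative, not part of the API):

```python
import json

# Hypothetical local implementation of the get_weather tool declared above.
def get_weather(city, unit="c"):
    # A real app would query a weather service; this returns canned data.
    return {"city": city, "temp_c": 31, "unit": unit}

TOOL_REGISTRY = {"get_weather": get_weather}

def run_tool_call(name, arguments_json, tool_call_id):
    """Dispatch one tool call and wrap its result as a `tool` role message."""
    args = json.loads(arguments_json)  # function.arguments arrives as a JSON string
    result = TOOL_REGISTRY[name](**args)
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": json.dumps(result),
    }

# Shaped like the tool_calls entry from the response above:
msg = run_tool_call("get_weather", '{"city": "Bangkok", "unit": "c"}', "call_123")
print(msg["content"])  # {"city": "Bangkok", "temp_c": 31, "unit": "c"}
```

Append this message (after the assistant message that contains the tool call) to `messages` and call `create` again to get the model's final answer.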
**Node.js**

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.sealink.asia/v1",
  apiKey: process.env.SEALINK_API_KEY,
});

const tools = [{
  type: "function" as const,
  function: {
    name: "search_docs",
    description: "Search internal documentation",
    parameters: {
      type: "object",
      properties: {
        query: { type: "string" },
        top_k: { type: "integer", default: 3 },
      },
      required: ["query"],
    },
  },
}];

const resp = await client.chat.completions.create({
  model: "qwen3-max",
  messages: [{ role: "user", content: "How do I refund a customer?" }],
  tools,
  tool_choice: "auto",
});

console.log(resp.choices[0].message.tool_calls);
```
## Model support matrix
| Model | Level | Note |
|---|---|---|
| claude-sonnet-4-6 | ✅ Full | Native Anthropic tool use |
| claude-haiku-4-5 | ✅ Full | Native |
| claude-opus-4-7 | ✅ Full | Native |
| gpt-4o | ✅ Full | Original spec |
| gpt-4o-mini | ✅ Full | Original spec |
| o3-mini | ✅ Full | Reasoning + tools |
| qwen3-max | ✅ Full | OpenAI-compatible |
| qwen3-plus | ✅ Full | OpenAI-compatible |
| doubao-1-5-pro | ✅ Full | OpenAI-compatible |
| kimi-k2 | ✅ Full | OpenAI-compatible |
| deepseek-v4-flash | ✅ Full | OpenAI-compatible |
| glm-4-6 | ✅ Full | OpenAI-compatible |
| qwen3-turbo | ⚠️ Partial | Single tool reliable; multi-tool flaky |
| doubao-1-5-lite | ⚠️ Partial | Best-effort |
| glm-4-flash | ⚠️ Partial | Best-effort (free tier) |
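With the ⚠️ Partial models, a request that should trigger a tool can come back as plain text instead, so don't index into `tool_calls` unconditionally. A small guard (the message is shown as a plain dict for illustration; with the Python SDK you would read the same fields off the response object):

```python
def extract_action(message: dict):
    """Classify a chat completion message: tool calls or plain text.

    Partial-support models sometimes answer in prose even when a tool
    should have been called, so check tool_calls before using it.
    """
    if message.get("tool_calls"):
        return ("tool", message["tool_calls"])
    return ("text", message.get("content") or "")

# A reply that came back as prose instead of a tool call:
kind, payload = extract_action({"role": "assistant", "content": "It's hot."})
print(kind)  # text
```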
## Tips
- SeaLink translates OpenAI tool syntax into each upstream provider's native format (Anthropic / Google / Alibaba), so your code stays provider-agnostic.
- The more specific your tool and parameter descriptions, the better the model's tool-selection accuracy.
- For high-throughput workloads, prefer claude-haiku-4-5, gpt-4o-mini, or qwen3-turbo.
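The examples above make blocking calls. If you stream instead (`stream=True`), tool-call arguments arrive as string fragments: each chunk's `delta.tool_calls` entry carries an `index` plus a piece of the arguments JSON, and you concatenate per index until the stream ends. A merge sketch (deltas shown as plain dicts for illustration):

```python
def accumulate_tool_calls(deltas):
    """Merge streamed tool-call deltas into complete calls.

    Each delta has an `index` identifying which call it belongs to;
    `function.arguments` fragments are concatenated in arrival order.
    """
    calls = {}
    for d in deltas:
        c = calls.setdefault(d["index"], {"id": "", "name": "", "arguments": ""})
        if d.get("id"):
            c["id"] = d["id"]
        fn = d.get("function", {})
        if fn.get("name"):
            c["name"] = fn["name"]
        c["arguments"] += fn.get("arguments", "")
    return [calls[i] for i in sorted(calls)]

# Fragments shaped like streamed delta.tool_calls entries:
deltas = [
    {"index": 0, "id": "call_1", "function": {"name": "get_weather", "arguments": ""}},
    {"index": 0, "function": {"arguments": '{"city": '}},
    {"index": 0, "function": {"arguments": '"Bangkok"}'}},
]
print(accumulate_tool_calls(deltas))  # one complete call with full JSON arguments
```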