SeaLink

Client Compatibility

The tools, frameworks, and SDKs below are verified against SeaLink. Each entry lists supported endpoints, model capabilities, known limits, and sample code. New tools are added continuously.

OpenAI SDK

official

Full support for chat completions, streaming, and tool calling. The default integration path: point base_url at SeaLink and keep your existing OpenAI SDK code.

Endpoint
/v1/chat/completions, /v1/embeddings, /v1/models
Streaming
SSE streaming supported on all chat models
Tool use
Function calling on GPT, Claude, Qwen, GLM, Kimi
Limits
Vision/file input depends on model; not all models support function calling. Verify per-model capabilities on /models.

Example

from openai import OpenAI

client = OpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)

Anthropic SDK

official

Direct Messages API endpoint. Works with the Anthropic Python/TypeScript SDKs; only the base URL changes.

Endpoint
/anthropic/v1/messages
Streaming
SSE streaming supported
Tool use
Tool use on Claude models (Opus, Sonnet, Haiku)
Limits
Anthropic-specific features (citations, prompt caching) are not yet available. The endpoint serves Claude models only.

Example

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "https://api.sealink.asia/anthropic/v1",
  apiKey: "<your-sealink-key>",
});

const msg = await client.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
console.log(msg.content[0].text);
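Tool use on the Messages API takes a different schema from OpenAI's: each tool is a flat object with a JSON-Schema `input_schema`, rather than a nested `{"type": "function", ...}` wrapper. A sketch of one definition in Python (the tool name and fields are illustrative):

```python
# Anthropic Messages API tool definition. The name, description, and
# schema fields below are illustrative placeholders.
get_weather = {
    "name": "get_weather",
    "description": "Look up current weather for a city",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# Passed via the tools parameter:
# client.messages.create(model=..., max_tokens=1024,
#                        tools=[get_weather], messages=[...])
```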

Cursor

ai-ide

Set SeaLink as your OpenAI-compatible provider in Cursor settings. Use any chat model for inline edits and Copilot-style completion.

Endpoint
/v1/chat/completions (OpenAI-compatible)
Streaming
Streaming supported; live completions in the editor
Tool use
Not required for Cursor — editor handles tool use internally
Limits
Some models may respond more slowly than native IDE models due to network latency. Cursor's Ctrl+K is single-turn; full chat is multi-turn.

Example

# Cursor Settings → Models → OpenAI API Key
# Set:
#   Base URL: https://api.sealink.asia/v1
#   API Key:  <your-sealink-key>
#
# Then select a model from the model picker.

Claude Code

ai-ide

Connect Claude Code to SeaLink as an Anthropic-compatible provider. Use Opus/Sonnet/Haiku for agentic coding sessions.

Endpoint
/anthropic/v1/messages
Streaming
Streaming supported for real-time agent output
Tool use
Full tool use supported (Claude models only)
Limits
Requires a Claude model. Non-Claude models are available via /v1/chat/completions but do not support the Anthropic tool-use protocol.

Example

# .claude/settings.json
{
  "apiKeyHelper": "echo $ANTHROPIC_API_KEY",
  "env": {
    "ANTHROPIC_BASE_URL": "https://api.sealink.asia/anthropic/v1"
  }
}

Dify

low-code

Add SeaLink as an OpenAI-compatible model provider in Dify. All chat models available for Dify workflows, chatbots, and RAG pipelines.

Endpoint
/v1/chat/completions (OpenAI-compatible)
Streaming
Streaming supported in Dify chatbot and workflow nodes
Tool use
Dify manages tool use internally; model function calling not required
Limits
Prompts assembled in Dify must fit the model's context window. Check /models for per-model context limits.

Example

# Dify → Settings → Model Provider → OpenAI-API-compatible
# Add provider:
#   API Base:  https://api.sealink.asia/v1
#   API Key:   <your-sealink-key>
#
# All chat models appear in the model selector.

n8n

low-code

Use SeaLink with n8n's OpenAI node. Any chat model can power automation workflows — set base URL and key in the OpenAI credentials.

Endpoint
/v1/chat/completions (via OpenAI node)
Streaming
Supported in n8n OpenAI node streaming mode
Tool use
n8n handles tools via its own node system; model tool calling optional
Limits
The n8n OpenAI node's feature set varies by version. Some advanced parameters (logprobs, n) may not carry through.

Example

# n8n → Credentials → OpenAI API
# Set:
#   API Base URL:  https://api.sealink.asia/v1
#   API Key:       <your-sealink-key>
#
# Use the "OpenAI" node in any workflow.

Don't see your tool?

SeaLink is compatible with the OpenAI and Anthropic protocols. Most tools that support a custom base URL work out of the box. Tell us what tool you use and we'll verify it with real traffic and update this page.
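For a quick check of an unlisted tool, the OpenAI Python SDK's standard environment variables are often enough, since many tools construct the client with no arguments. A sketch, assuming the tool uses the official SDK's defaults:

```python
import os

# The OpenAI Python SDK (and many tools built on it) read these
# standard environment variables, so a base-URL override often
# needs no code changes at all:
os.environ["OPENAI_BASE_URL"] = "https://api.sealink.asia/v1"
os.environ["OPENAI_API_KEY"] = "<your-sealink-key>"

# Any code that now calls OpenAI() with no arguments talks to SeaLink.
```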