LangChain integration
Point LangChain's ChatOpenAI at SeaLink's OpenAI-compatible endpoint. Works with both the Python and JS/TS LangChain libraries.
One-click LangChain config
Config has the base URL pre-filled. Once you sign in, your real API key is auto-inlined.
Steps
1. Install langchain-openai (Python: `pip install langchain-openai`; Node: `npm i @langchain/openai`).
2. Create a ChatOpenAI instance with SeaLink's base_url and your API key.
3. Use any SeaLink model ID as the model name. Invoke, stream, or batch as usual.
4. For tool-calling agents, set model_kwargs or bind tools — SeaLink passes through OpenAI-compatible tool definitions.
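The pass-through in step 4 works because LangChain serializes bound tools into the standard OpenAI function-calling schema before sending them. A minimal sketch of that JSON shape, which a pass-through gateway forwards unchanged (the tool name and parameters here are illustrative, not part of SeaLink's API):

```python
import json

# Shape of an OpenAI-compatible tool definition, equivalent to what
# LangChain produces when you bind a tool. "get_weather" is a made-up
# example tool, not something SeaLink provides.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# A pass-through gateway only needs this to be well-formed JSON:
print(json.dumps(get_weather_tool["function"]["name"]))
```

Anything LangChain can express in this schema — nested parameters, enums, required fields — travels through unchanged.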
Configuration
Python
```python
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

llm = ChatOpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
    model="claude-sonnet-4-6",
    temperature=0.7,
)

resp = llm.invoke([HumanMessage(content="Explain SeaLink in one sentence.")])
print(resp.content)
```
SeaLink is compatible with any LangChain feature that uses OpenAI-compatible chat models. For streaming, pass `streaming=True`. For tool calling, use `bind_tools()` or `with_structured_output()`. See /docs/compatibility for model-by-model capability support.
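Under the hood, ChatOpenAI posts an OpenAI-style JSON body to the chat completions route beneath the base URL, which is why any OpenAI-compatible gateway works. A dependency-free sketch of the request it would build against SeaLink — the payload is constructed but not sent, and the key is a placeholder:

```python
import json
import urllib.request

BASE_URL = "https://api.sealink.asia/v1"
API_KEY = "<your-sealink-key>"  # placeholder; use your real SeaLink key

# The same parameters you pass to ChatOpenAI end up in this JSON body.
payload = {
    "model": "claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "Explain SeaLink in one sentence."}],
    "temperature": 0.7,
    "stream": False,  # streaming=True in LangChain flips this to True
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# req is ready to send with urllib.request.urlopen(req); not executed here
# because that requires a live key.
print(req.full_url)
```

This is for illustration only — in practice, let ChatOpenAI build and send the request for you.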
Having trouble? See other guides · support@sealink.asia