Integrations

Integrate cadin into your own application. These examples show how to connect the MCP server to your system so your LLM can discover and call all four legal research tools automatically.

Setup

  1. Install an LLM SDK that supports remote MCP servers.

  2. Create an API key in your cadin dashboard.

  3. Include cadin in your tools and start calling.

Usage

Anthropic TypeScript SDK
Install (npm)
npm install @anthropic-ai/sdk
Example (TypeScript)
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();
const response = await client.beta.messages.create({
  model: "claude-sonnet-4-6",
  max_tokens: 4096,
  mcp_servers: [
    {
      type: "url",
      url: "https://mcp.cadin.ai/mcp",
      name: "cadin",
      authorization_token: "YOUR_CADIN_API_KEY",
    },
  ],
  tools: [{ type: "mcp_toolset", mcp_server_name: "cadin" }],
  messages: [
    { role: "user", content: "Find EU regulations about the right to erasure" },
  ],
  betas: ["mcp-client-2025-11-20"],
});

console.log(response);

Authentication

Header          Format           Notes
Authorization   ApiKey sk-...    Recommended for SDK usage
x-api-key       sk-...           Alternative header
Authorization   Bearer eyJ...    OAuth tokens (for browser-based flows)

Do not send API keys as Authorization: Bearer sk-...; the server rejects that format.
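A small helper can keep the two schemes straight. This is a sketch, assuming API keys always start with "sk-" and that anything else is an OAuth token; adjust the check if your keys differ:

```typescript
// Build the Authorization header for the cadin MCP endpoint.
// Assumption: API keys start with "sk-"; other values are OAuth tokens.
function cadinAuthHeaders(key: string): Record<string, string> {
  if (key.startsWith("sk-")) {
    // API keys use the ApiKey scheme -- "Bearer sk-..." is rejected by the server.
    return { Authorization: `ApiKey ${key}` };
  }
  // OAuth tokens (e.g. "eyJ..." JWTs from browser-based flows) use Bearer.
  return { Authorization: `Bearer ${key}` };
}
```

Passing the result as the request headers works the same whether you use raw fetch or an SDK's custom-header option.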

Good to know

Sessions are stored in memory, so a deploy or restart invalidates every session and the SDK must reconnect. The server returns a 400 with invalid_session_id when this happens.
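If you manage the connection yourself, a one-shot retry wrapper handles this transparently. A minimal sketch, assuming the failed request surfaces as an error carrying a status of 400 and the invalid_session_id string in its message; adapt the check to how your SDK reports HTTP errors:

```typescript
// Retry a request once when the server signals a stale session
// (400 + invalid_session_id after a deploy or restart).
// `call` performs one request with the current session;
// `reconnect` establishes a fresh session.
async function withSessionRetry<T>(
  call: () => Promise<T>,
  reconnect: () => Promise<void>,
): Promise<T> {
  try {
    return await call();
  } catch (err: any) {
    const stale =
      err?.status === 400 &&
      String(err?.message ?? "").includes("invalid_session_id");
    if (!stale) throw err; // unrelated failure: propagate as-is
    await reconnect(); // server restarted: open a fresh session
    return await call(); // replay the request exactly once
  }
}
```

Only retry once: if the replay also fails, something other than a restart is wrong.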

Tool responses are plain text formatted for LLM consumption, not JSON. Search results include pagination hints embedded in the text (e.g., "Use offset: 20 to see more").
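Because the hints live inside the text rather than in a structured field, pagination has to be parsed out of the response. A sketch, assuming the hint always takes the exact form "Use offset: N to see more" quoted above:

```typescript
// Extract the next pagination offset from a plain-text search result.
// Assumption: hints follow the format "Use offset: <N> to see more".
function nextOffset(text: string): number | null {
  const match = text.match(/Use offset: (\d+) to see more/);
  return match ? Number(match[1]) : null;
}
```

Returning null when no hint is present gives you a natural loop-termination condition when paging through all results.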

MCP SDKs are also available for Java, C#, Kotlin, and Swift. The connection pattern is the same: create a transport with the URL and the API key header, then connect.
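The pattern those SDKs wrap can be sketched with nothing but fetch: a POST to the server URL carrying the auth header and a JSON-RPC initialize message. This is an illustrative sketch, not the SDK API; the clientInfo values are placeholders and the protocolVersion shown is an assumption you should match to your SDK:

```typescript
// The cross-language connection pattern, reduced to its wire-level parts:
// point an HTTP transport at the server URL, attach the API key header,
// and open the session with a JSON-RPC "initialize" request.
function buildInitializeRequest(url: string, apiKey: string): Request {
  return new Request(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Accept: "application/json, text/event-stream",
      Authorization: `ApiKey ${apiKey}`, // per the Authentication section
    },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "initialize",
      params: {
        protocolVersion: "2025-03-26", // assumption: match your SDK's version
        capabilities: {},
        clientInfo: { name: "my-client", version: "1.0.0" }, // placeholder
      },
    }),
  });
}

// To actually connect:
// const res = await fetch(buildInitializeRequest("https://mcp.cadin.ai/mcp", "YOUR_CADIN_API_KEY"));
```

In any of the SDK languages, the equivalent is constructing the streamable HTTP transport with this URL and header, then calling its connect method.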