MCP Integration

Other AI agents

Cursor, Copilot, Codex, Gemini, and any other tool, with or without native MCP support.

Two modes

For non-Claude agents, larkx supports two integration patterns:

  1. Native MCP: used when the agent supports MCP directly (Cursor, Continue, Cline)
  2. Instruction file + CLI context: the agent reads larkx context output instead of calling tools (Copilot, ChatGPT, Codex, Gemini)

Cursor

Cursor supports MCP natively. Add larkx in Settings → MCP:

```json
{
  "mcpServers": {
    "larkx": {
      "command": "larkx",
      "args": ["mcp"]
    }
  }
}
```

Then run larkx init in your project to create .cursorrules.

GitHub Copilot

Copilot doesn't expose tool-calling for arbitrary MCP servers, so it uses the instruction file pattern. larkx init creates .github/copilot-instructions.md with a directive to run larkx context at the start of any task.
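A rough sketch of what such an instruction file contains (hypothetical content; the exact wording larkx init emits may differ):

```markdown
<!-- .github/copilot-instructions.md (sketch, not the literal generated file) -->
Before starting any task in this repository, run:

    larkx context

Treat its output as the authoritative overview of the project's
structure and conventions before reading or editing files.
```

The point of the pattern is that the directive lives in a file the agent already reads, so no tool-calling support is required.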

OpenAI Codex

Codex reads AGENTS.md. Run larkx init and select OpenAI Codex. The generated AGENTS.md tells Codex to fetch larkx context before opening files.
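The generated directive can be pictured roughly like this (a hypothetical sketch; run larkx init to see the real file):

```markdown
# AGENTS.md (sketch of larkx-generated content)
Before opening or editing any files, run `larkx context` and read its
output to learn the project layout before making changes.
```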

Gemini CLI

Gemini reads GEMINI.md. Same flow: larkx init creates the file with the right instructions.

Universal fallback

For any other tool, write the context output to a file and paste it into your conversation:

```bash
larkx context > project-context.md
# then paste into ChatGPT / Claude.ai / wherever
```

Save tokens with scoping

Use --folder src/auth to limit context to a subtree, or --level 1 for just file paths. Both flags work with any agent that can read text.
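For example, combining the flags above with a redirect (file names are illustrative, and src/auth stands in for whatever subtree you care about):

```bash
# Full context, limited to one subtree
larkx context --folder src/auth > auth-context.md

# Just file paths, project-wide
larkx context --level 1 > file-map.md
```

Scoped output is smaller, so it fits more comfortably into agents with tight context windows.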