Other AI agents
Cursor, Copilot, Codex, Gemini, and any other tool, with or without native MCP support.
Two modes
For non-Claude agents, larkx supports two integration patterns:
- Native MCP: the agent connects to larkx as an MCP server (Cursor, Continue, Cline)
- Instruction file + CLI context: the agent reads larkx context output instead of calling tools (Copilot, ChatGPT, Codex, Gemini)
Cursor
Cursor supports MCP natively. Add larkx in Settings → MCP:
```json
{
  "mcpServers": {
    "larkx": {
      "command": "larkx",
      "args": ["mcp"]
    }
  }
}
```

Then run larkx init in your project to create .cursorrules.
GitHub Copilot
Copilot doesn't expose tool-calling for arbitrary MCP servers, so it uses the instruction file pattern. larkx init creates .github/copilot-instructions.md with a directive to run larkx context at the start of any task.
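The instruction file itself is short. A sketch of what such a file might contain (illustrative only; the exact wording larkx init generates may differ):

```markdown
<!-- .github/copilot-instructions.md (illustrative; actual generated text may differ) -->
Before starting any task in this repository, run:

    larkx context

and treat its output as the authoritative project context before
reading or editing files.
```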
OpenAI Codex
Codex reads AGENTS.md. Run larkx init and select OpenAI Codex. The generated AGENTS.md tells Codex to fetch larkx context before opening files.
Gemini CLI
Gemini reads GEMINI.md. The flow is the same: larkx init creates the file with the right instructions.
Universal fallback
For any other tool, just pipe the context output into your conversation:

```shell
larkx context > project-context.md
# then paste into ChatGPT / Claude.ai / wherever
```

Use --folder src/auth to limit context to a subtree, or --level 1 for just file paths. Both work with any agent that can read text.
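The scoping flags mentioned above compose with redirection. A usage sketch, assuming larkx is on your PATH:

```shell
# Full context for a single subtree only
larkx context --folder src/auth > auth-context.md

# Just the file paths, no file contents
larkx context --level 1 > file-list.md
```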