larkx indexes any project into a compact graph that Claude Code, Cursor, GitHub Copilot, and any MCP-compatible AI client can read in seconds. Reading the graph typically uses around 60-85% fewer tokens than reading source files, with savings up to ~90% on focused tasks.
Run larkx stats on your project for indexed estimates. Four detail levels: paths, symbols, signatures, and AI summaries. The AI picks the cheapest level per task.
Restrict any query to a subtree. Token cost drops roughly in proportion to folder size, often yielding large savings on focused tasks.
One-line summaries per file via Claude Haiku. Cached forever, pennies to generate.
Works with Claude Code, Cursor, Continue, Cline, and any Model Context Protocol client.
BFS from entry points. Auto-detects Next.js routes, NestJS modules, and more.
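A reachability pass like this can be sketched as a breadth-first search over the import graph. The graph shape and file names below are a hypothetical illustration, not larkx's internal format:

```python
from collections import deque

def reachable_files(graph: dict[str, list[str]], entry_points: list[str]) -> set[str]:
    """BFS from the entry points; returns every file reachable via imports."""
    seen = set(entry_points)
    queue = deque(entry_points)
    while queue:
        current = queue.popleft()
        for imported in graph.get(current, []):
            if imported not in seen:
                seen.add(imported)
                queue.append(imported)
    return seen

# Hypothetical import graph; app/page.tsx stands in for a detected Next.js route.
graph = {
    "app/page.tsx": ["lib/api.ts", "components/nav.tsx"],
    "lib/api.ts": ["lib/http.ts"],
    "components/nav.tsx": [],
    "lib/http.ts": [],
    "lib/unused.ts": ["lib/http.ts"],  # never imported from an entry point
}
print(sorted(reachable_files(graph, ["app/page.tsx"])))
# → ['app/page.tsx', 'components/nav.tsx', 'lib/api.ts', 'lib/http.ts']
```

Files outside the reachable set (like lib/unused.ts here) can be ranked lower or skipped when building context.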
SHA-256 file hashing. Reindex is near-instant after the initial build.
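Content-hash change detection is what makes reindexing cheap: only files whose hash differs from the stored index get reprocessed. A minimal sketch of the idea (the index layout here is an assumption):

```python
import hashlib

def file_digest(data: bytes) -> str:
    """SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def changed_files(index: dict[str, str], on_disk: dict[str, bytes]) -> list[str]:
    """Paths whose current content hash differs from the stored index (or are new)."""
    return [path for path, data in on_disk.items()
            if index.get(path) != file_digest(data)]

# Hypothetical stored index from a previous run.
index = {"src/app.py": file_digest(b"print('hello')")}
on_disk = {"src/app.py": b"print('hello')", "src/new.py": b"x = 1"}
print(changed_files(index, on_disk))
# → ['src/new.py']
```

Unchanged files are skipped entirely, which is why a reindex after the initial build touches only what actually changed.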
Browser explorer with Module view and Graph view. Zoom, search, VS Code deep links.
Excludes .env, *.key, *.pem, credentials* by default. Never enters AI context.
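The default exclusion list behaves like ordinary glob matching against file names. A sketch using the patterns above (the matching logic is an assumption, not larkx's implementation):

```python
from fnmatch import fnmatch

DEFAULT_EXCLUDES = [".env", "*.key", "*.pem", "credentials*"]

def is_excluded(filename: str, patterns: list[str] = DEFAULT_EXCLUDES) -> bool:
    """True if the file name matches any default-excluded glob pattern."""
    return any(fnmatch(filename, pattern) for pattern in patterns)

print(is_excluded("server.key"))        # → True
print(is_excluded("credentials.json"))  # → True
print(is_excluded("main.py"))           # → False
```

Excluded files are never indexed, so their contents cannot reach the AI context.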
larkx init asks two questions, sets up MCP, agent files, and hooks.
| Level | Includes | Tokens | Savings |
|---|---|---|---|
| L1 Paths | Just file paths + language | ~1.6K | −99% |
| L2 Symbols | + function & class names + imports | ~16K | −87% |
| L3 Signatures | + full function signatures | ~30K | −75% |
| L4 Summaries | + one-line AI summary per file | ~50K | −58% |
| Baseline | Reading every file directly | ~120K | 0% |
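The savings column is plain arithmetic against the ~120K baseline; a quick check of the figures above:

```python
baseline = 120_000  # ~tokens to read every file directly
levels = {
    "L1 Paths": 1_600,
    "L2 Symbols": 16_000,
    "L3 Signatures": 30_000,
    "L4 Summaries": 50_000,
}
for name, tokens in levels.items():
    savings = round((1 - tokens / baseline) * 100)
    print(f"{name}: -{savings}%")
# → L1 Paths: -99%, L2 Symbols: -87%, L3 Signatures: -75%, L4 Summaries: -58%
```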
Run larkx stats in your project for indexed estimates that better match your codebase.
Install, init, index, done. No accounts. No telemetry. Runs entirely on your machine.
Read the setup guide