AI Summaries
One-sentence descriptions for every file — optional, cached, and token-aware.
Run `larkx index --ai` only when you need them — the wizard asks for confirmation before starting.

What are AI summaries?
AI summaries are short, one-sentence descriptions of what each file does. They are generated by sending the file content to an LLM and asking it to summarize in a single sentence.
Once generated, summaries are cached in .larkx/summaries.json (gitignored). Files that already have a summary are skipped on subsequent runs.
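To check what has been cached so far, you can read the cache file directly; it won't exist until the first summarization run:

```shell
# Inspect the summary cache; the file is absent until the
# first `larkx index --ai` run completes.
cat .larkx/summaries.json 2>/dev/null || echo "no summaries cached yet"
```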
Where summaries appear
Summaries are included in two places:
- `get_project_index` at level 4 — each file line is prefixed with a `// summary` comment line
- `get_file_summary` — the summary is shown at the top of the file output, if available

For example:
```
// Handles JWT validation and password hashing for the authentication layer
src/auth/login.ts[typescript]: validateJWT@6, hashPassword@12 | +../utils/crypto,+../db/users
// Middleware that enforces authentication and rate-limits API routes
src/auth/middleware.ts[typescript]: authMiddleware@9, rateLimiter@19 | +./login
```

Two ways to generate summaries
During larkx init, you choose a provider. You can change it later by editing .larkx/config.json.
| Provider | Requires | Cost |
|---|---|---|
| Local Claude | claude CLI in PATH (Claude Pro/Team subscription) | Uses your existing subscription — no extra charge |
| Anthropic API key | ANTHROPIC_API_KEY env var or key entered during init | Billed per token (claude-haiku-4-5 rates) |
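If you choose the Anthropic provider, the key can come from the environment rather than the config file. A minimal sketch (the key value below is a placeholder, not a real key):

```shell
# Supply the API key via the environment so it never needs
# to be stored in .larkx/config.json.
export ANTHROPIC_API_KEY="sk-ant-placeholder"

# Confirm it is visible to child processes such as larkx.
printenv ANTHROPIC_API_KEY
```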
Generate summaries
After configuring a provider via init, run:
```
larkx index --ai
```

You will see a token-cost warning and a confirmation prompt before anything runs. The command re-indexes first, then summarizes only files that don't have a cached summary yet.
Config reference
The AI provider is stored in .larkx/config.json:
```json
{
  "exclude": ["..."],
  "entryPoints": [],
  "ai": {
    "provider": "local-claude"
  }
}
```

For `"anthropic"`, an optional `"apiKey"` field is written if you entered a key during init (instead of using the env var).
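For illustration, a config using the Anthropic provider with a stored key would then look like this (the key value is a placeholder, and the other fields follow the example above):

```json
{
  "exclude": ["..."],
  "entryPoints": [],
  "ai": {
    "provider": "anthropic",
    "apiKey": "sk-ant-placeholder"
  }
}
```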
Clearing the cache
Delete .larkx/summaries.json to force all files to be re-summarized on the next larkx index --ai run.
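As a shell sketch, clearing the cache is a single delete:

```shell
# Delete the summary cache; every file will be re-summarized on the
# next `larkx index --ai` run. -f avoids an error if the file is absent.
rm -f .larkx/summaries.json
```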