TriSeek is becoming the local context substrate that agent harnesses share. Today that means portability between Claude Code and Codex on the same machine. Tomorrow it means more harness adapters, cross-machine resume, audit, and replay — without giving up local-first or bounded output.
Anchor for the rest of this page. Everything below the next heading is forward-looking and explicitly tagged.
session_* MCP tools (open / status / list / close / handoff / snapshot / snapshot_list / snapshot_get / snapshot_diff / resume).
Snapshots live under ~/.triseek/daemon/snapshots/, with manifest, working set, action log, pinned snippets, and git state.
Pack files (.tcp) for moving snapshots between checkouts of the same repo.
Briefings (triseek brief --mode no-inference).
Full reference: Context Handoff & Portability · v0.4.2 release notes.
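To make the snapshot components concrete, here is a minimal sketch of what a snapshot could carry — the five parts listed above as one JSON document. Every field name here is an illustrative assumption, not TriSeek's actual schema.

```python
import json

# Hypothetical snapshot payload mirroring the parts listed above:
# manifest, working set, action log, pinned snippets, git state.
# All field names are illustrative assumptions, not the real schema.
snapshot = {
    "manifest": {
        "session_id": "abc123",          # made-up identifier
        "repo_root": "/home/me/project",
        "created_at": "2025-01-01T00:00:00Z",
    },
    "working_set": ["src/main.rs", "src/index.rs"],
    "action_log": [{"ts": 1, "tool": "search", "query": "trigram"}],
    "pinned_snippets": [{"path": "src/index.rs", "lines": [10, 42]}],
    "git_state": {"head": "deadbeef", "dirty": False},
}

payload = json.dumps(snapshot, indent=2)
print(sorted(snapshot.keys()))
```

A single self-describing document like this is what makes a snapshot portable: anything that can parse it can rehydrate the session.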
Tagged honestly. Planned is committed. Designing means the shape is being worked out. Considering is driven by user demand, not pre-built. Nothing is tagged "Soon."
Same adapter contract as Claude Code and Codex. Writes the hydration payload to the harness's own project file, with snapshot capture wired into the harness's session lifecycle.
triseek replay <snapshot> reproduces a session deterministically against the same daemon state. triseek audit answers "what did the agent touch" without requiring chat-transcript access.
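Deterministic replay over a JSONL action log can be sketched roughly like this. The log format and handler table are assumptions for illustration, not the daemon's actual wire format.

```python
import json

def replay(log_lines, handlers):
    """Re-run each logged action, in order, through a handler table.
    Determinism comes from replaying the same log against the same
    daemon state; no chat transcript is needed."""
    results = []
    for line in log_lines:
        action = json.loads(line)
        handler = handlers[action["tool"]]
        results.append(handler(action["args"]))
    return results

# Toy handlers standing in for daemon operations (illustrative only).
handlers = {
    "search": lambda args: f"searched:{args['q']}",
    "read": lambda args: f"read:{args['path']}",
}

log = [
    '{"tool": "search", "args": {"q": "frecency"}}',
    '{"tool": "read", "args": {"path": "src/lib.rs"}}',
]

print(replay(log, handlers))
```

An audit view falls out of the same log: filtering it for write-like actions answers "what did the agent touch" without touching the conversation.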
User-controlled sync of pack artifacts across machines. Probably file-based (drop a .tcp in a synced folder); design TBD. No vendor cloud, no telemetry.
Measures dead-end re-exploration rate on a corpus of recorded sessions. Without a real eval, "the briefing is good" is just vibes — we want a number.
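One way to turn "dead-end re-exploration" into a number — an illustrative metric, not the project's actual eval definition — is to count how often a session revisits a path it already explored:

```python
def reexploration_rate(visited_paths):
    """Fraction of visits that re-open a path seen earlier in the session.
    A crude proxy for dead-end re-exploration; the real eval's definition
    is part of the design work described above."""
    seen, repeats = set(), 0
    for path in visited_paths:
        if path in seen:
            repeats += 1
        seen.add(path)
    return repeats / len(visited_paths) if visited_paths else 0.0

session = ["a.rs", "b.rs", "a.rs", "c.rs", "a.rs"]
print(reexploration_rate(session))  # 2 of 5 visits are repeats -> 0.4
```

A briefing that works should push this number down on the recorded corpus relative to a no-briefing baseline.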
Native PreCompact integration for Claude Code and the Codex equivalent, so a snapshot is captured automatically before the harness drops state. The hook contract is the open question, not the daemon work.
The CLI already accepts --mode local-model and --mode cloud-model; today both fall back to the no-inference template. Wiring these to user-configured endpoints (Ollama, Anthropic, OpenAI) is on deck. The shape of the config file is the design question.
Driven by user demand. Each new harness is its own adapter contract; we'll ship them as people ask, not preemptively.
Higher-fidelity working set, richer briefings. Real cost: indexer complexity and language-specific footprints. Worth doing only if the eval harness shows a meaningful quality lift.
Saying no is information. These are out of scope on purpose — not "not yet," but "not the point of this project."
The brand commitments are load-bearing. New features ship within these constraints, not around them.
The daemon makes no network calls. Briefing summarization is opt-in and uses a user-configured endpoint — not a Triseek service.
Every response clips to a budget. Agents can't be drowned by a tool that's "trying to be helpful."
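Bounded output reduces to a hard byte-budget clip. A minimal sketch — the budget value and truncation marker are made up, not TriSeek's actual defaults:

```python
def clip_to_budget(text: str, budget_bytes: int, marker: str = "\n[clipped]") -> str:
    """Hard-cap a response at a byte budget, appending a visible marker
    so the caller knows output was truncated. Budget and marker are
    illustrative, not TriSeek's actual values."""
    raw = text.encode("utf-8")
    if len(raw) <= budget_bytes:
        return text
    keep = budget_bytes - len(marker.encode("utf-8"))
    # errors="ignore" drops a multi-byte character split at the cut point.
    return raw[:keep].decode("utf-8", errors="ignore") + marker

print(clip_to_budget("x" * 100, 32))
```

The key property is that the cap is enforced on every response, so no single tool call can flood the agent's context window.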
The daemon is plain Rust over a trigram index, frecency state, and JSONL action logs. No model lives inside it.
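The trigram technique itself is small enough to sketch. This is the general approach — map every three-character window to the documents containing it, then intersect postings at query time — not TriSeek's Rust implementation:

```python
from collections import defaultdict

def trigrams(s):
    s = s.lower()
    return {s[i:i + 3] for i in range(len(s) - 2)}

class TrigramIndex:
    """Map each trigram to the set of documents containing it. A document
    is a candidate only if it contains every trigram of the query, which
    makes substring search an intersection of small posting sets."""
    def __init__(self):
        self.postings = defaultdict(set)
        self.docs = {}

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for g in trigrams(text):
            self.postings[g].add(doc_id)

    def candidates(self, query):
        grams = trigrams(query)
        if not grams:            # queries under 3 chars match everything
            return set(self.docs)
        sets = [self.postings.get(g, set()) for g in grams]
        return set.intersection(*sets)

idx = TrigramIndex()
idx.add("main.rs", "fn main() { frecency_update(); }")
idx.add("lib.rs", "pub fn search_index() {}")
print(idx.candidates("frecency"))
```

Candidates still need a verification scan (trigram intersection can over-match), which is why a reported backend matters: the same query can be served by the index or by a direct scan.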
State lives under ~/.triseek/, never inside the repo. Sessions and snapshots are scoped to the repo root they were opened against.
Source on GitHub. No closed-source binary, no telemetry, no phone-home, no licensing tier behind features that ship.
Every response says which backend ran (triseek_indexed, triseek_direct_scan, ripgrep_fallback) and whether memo/cache hit. Performance claims come with a benchmark and a methodology link.