What memory means

Zorai memory is not a single file, a vector store, or a convenience summary pasted into the next prompt. It is a layered, daemon-owned memory system that combines identity, durable facts, operator profile, task and thread state, procedural memory, recall systems, provenance, and higher-order learning loops.

Short version: the daemon owns memory, memory is layered, durable writes are curated, and provenance makes those writes auditable instead of magical.

Memory is daemon-owned

The daemon is the authority for memory. Electron, the TUI, the CLI, MCP clients, and chat gateways are all clients of the same memory substrate.

  • Memory survives UI restarts
  • Multiple clients reconnect to the same state
  • Handoffs and subagents share durable context
  • Learning loops operate over persisted history, not only the current prompt

The memory stack

Identity memory

SOUL.md captures core identity, principles, role boundaries, and stable specialization hints.

Durable fact memory

MEMORY.md stores curated project facts, conventions, strategy hints, and stable corrections.

Operator memory

USER.md reflects structured operator profile state, preferences, workflow signals, and onboarding/check-in knowledge.

Runtime memory

Threads, workspace tasks, execution queue entries, goals, reflections, checkpoints, collaboration sessions, and causal traces live in structured daemon-owned persistence.

Recall memory

History search, session search, OneContext recall, summaries, and injected context let different questions use different retrieval surfaces.

Procedural memory

Skills are memory too: installed skills, generated workflows, variants, usage tracking, promotion, deprecation, and merge behavior.

Semantic memory

Workspace structure, conventions, imports, packages, and temporal history form a graph-like memory surface beyond flat transcript recall.

Provenance-backed facts

Durable memory writes carry source, timestamps, fact keys, confirmation/retraction state, and explicit relationships between entries, such as one entry retracting another.

How memory writes work

  1. A durable candidate emerges from a thread, task, goal reflection, operator profile update, or learning pass.
  2. The runtime decides which layer it belongs to.
  3. The write is bounded and normalized rather than dumping raw history.
  4. Contradiction or replacement rules are applied where relevant.
  5. The target memory artifact is updated.
  6. Provenance metadata is persisted alongside the write.
  7. Operator surfaces can later inspect, confirm, retract, or review the result.
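Under assumed names and shapes (none of this is Zorai's real runtime), the seven steps above can be condensed into one routine:

```python
# Illustrative sketch of the durable-write pipeline. Layer names, field
# names, and routing rules are assumptions made for this example.

from datetime import datetime, timezone

MAX_FACT_LEN = 200  # step 3: writes are bounded, never raw history dumps

def route_layer(candidate: dict) -> str:
    """Step 2: decide which layer the candidate belongs to."""
    if candidate["source"] == "operator_profile_update":
        return "operator"
    if candidate["source"] in ("thread", "task", "goal_reflection", "learning_pass"):
        return "durable_facts"
    return "runtime"

def write_durable(store: dict, provenance: list, candidate: dict) -> None:
    # Step 1 happened upstream: a candidate emerged from a thread, task,
    # goal reflection, operator update, or learning pass.
    layer = route_layer(candidate)                               # step 2
    text = " ".join(candidate["text"].split())[:MAX_FACT_LEN]    # step 3: bound + normalize
    key = candidate["fact_key"]
    replaced = store.get(layer, {}).get(key)                     # step 4: replacement rule
    store.setdefault(layer, {})[key] = text                      # step 5: update the artifact
    provenance.append({                                          # step 6: provenance metadata
        "fact_key": key,
        "layer": layer,
        "source": candidate["source"],
        "replaced": replaced,
        "written_at": datetime.now(timezone.utc).isoformat(),
        "confirmed": False,                                      # step 7: operator surfaces
        "retracted": False,                                      #         review this later
    })
```

A second write with the same fact key records what it replaced, which is the seed of the contradiction/replacement behavior described in step 4.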

Provenance, confirmation, and retraction

Durable memory in Zorai is not just append-only prose. The provenance-backed layer tracks target file, write mode, source kind, source scope, extracted fact keys, timestamps, confirmation state, retraction state, and explicit relationships between entries.

Why this matters: Zorai can answer where a fact came from, whether it is still trusted, and what later operation invalidated it.

Compaction and recall

Memory and compaction are tied together. Before older context falls out of the active window, Zorai can preserve durable signal in a longer-lived layer. That means compaction should reduce noise, not cause amnesia.

  • Important constraints should migrate into durable layers before they disappear from the active prompt.
  • LLM-backed compaction and memory curation are complementary.
  • Long-lived threads benefit from stronger compaction because preservation quality affects later recall quality.
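The "reduce noise, not cause amnesia" rule can be shown in a few lines. This is a deliberately naive sketch with invented names; real compaction is LLM-backed summarization, but the invariant is the same: durable signal migrates out of the overflow before the overflow is dropped.

```python
# Naive compaction sketch: preserve durable entries before truncating.
# WINDOW and the "durable_key" flag are assumptions for illustration.

WINDOW = 4  # number of recent messages kept in the active context

def compact(active: list[dict], durable: dict) -> list[dict]:
    """Drop messages beyond the window, but first migrate anything
    flagged durable into the longer-lived store."""
    overflow = active[:-WINDOW] if len(active) > WINDOW else []
    for msg in overflow:
        if msg.get("durable_key"):               # a constraint worth keeping
            durable[msg["durable_key"]] = msg["text"]
    return active[-WINDOW:]
```

Everything in the overflow disappears from the active prompt either way; the only question compaction answers is whether the important parts were written somewhere durable first.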

What belongs in memory

Good memory material

Stable project facts, operator preferences, conventions, recurring corrections, durable strategy hints, and reusable workflow knowledge.

Bad memory material

Transient task progress, noisy run output, raw transcripts, approval state, and short-lived operational residue that can be rediscovered cheaply.
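The two lists above amount to a curation predicate. A minimal sketch, with category names invented to mirror the prose rather than taken from Zorai:

```python
# Illustrative curation filter: durable signal in, operational residue out.

GOOD_KINDS = {"project_fact", "operator_preference", "convention",
              "recurring_correction", "strategy_hint", "workflow_knowledge"}

BAD_KINDS = {"task_progress", "run_output", "raw_transcript",
             "approval_state", "operational_residue"}

def is_memory_worthy(kind: str) -> bool:
    if kind in GOOD_KINDS:
        return True
    if kind in BAD_KINDS:
        return False
    # Unclassified candidates should be reviewed, not silently persisted.
    raise ValueError(f"unclassified candidate kind: {kind}")
```

Failing loudly on unclassified kinds reflects the curated-write stance: a durable layer that accepts everything by default degrades into the raw-transcript dump the prose warns against.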

Why the memory model matters

If Zorai memory were only three markdown files, the system would collapse into manual note-keeping. The real model is stronger: curated files, daemon-owned runtime state, procedural memory through skills, semantic workspace memory, and provenance-backed durable facts working together.

For the trust and governance side of that model, see Security and Governance.