Memory Types
Engram models human memory with three complementary stores, each optimised for a different kind of knowledge. Every memory belongs to exactly one type, which influences how it is scored during recall and how long it is retained.
● Episodic Memory
Episodic memories are timestamped events — the raw history of what happened, when, and with whom. They capture the narrative thread of a project or relationship.
Examples:
- A user asked how to configure environment variables on 2025-03-12
- The team decided to use Turborepo on 2025-02-28
- A bug in the auth middleware was discovered and fixed
- The v0.1.0 release was published to GitHub
Schema
```jsonc
{
  "type": "episodic",
  "content": "User asked about TypeScript strict mode configuration",
  "importance": 0.7,
  "source": "claude-code",      // optional — which tool created it
  "concept": null,              // null for episodic (label is auto-generated)
  "createdAt": "2025-03-21T..."
}
```

Recall behaviour
Episodic memories are ranked by a combination of recency and semantic similarity. A highly similar but old memory scores lower than a moderately similar recent one. This mirrors how humans recall recent events more easily.
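A minimal sketch of this recency-weighted ranking. The blend weights (70/30) and the seven-day half-life are illustrative assumptions, not Engram's actual parameters:

```typescript
interface EpisodicMemory {
  content: string;
  createdAt: number; // ms since epoch
}

// Assumed curve: exponential decay with a 7-day half-life.
const HALF_LIFE_MS = 7 * 24 * 3600 * 1000;

function recencyFactor(createdAt: number, now = Date.now()): number {
  return Math.pow(0.5, (now - createdAt) / HALF_LIFE_MS);
}

// Assumed blend: 70% semantic similarity, 30% recency.
function episodicScore(similarity: number, mem: EpisodicMemory): number {
  return 0.7 * similarity + 0.3 * recencyFactor(mem.createdAt);
}
```

Under these assumed weights, a month-old memory with similarity 0.9 scores below an hour-old memory with similarity 0.75, which is the behaviour the paragraph above describes.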
● Semantic Memory
Semantic memories are timeless facts — stable knowledge about the world, a codebase, a person, or a project. They are the "what you know" store, independent of when you learned it.
Examples:
- The project uses pnpm workspaces with Turborepo
- The API is built with Fastify, not Express
- Alice is the lead engineer on the authentication service
- Engram stores 384-dimensional embeddings using all-MiniLM-L6-v2
Schema
```jsonc
{
  "type": "semantic",
  "content": "The project uses pnpm workspaces managed by Turborepo",
  "concept": "Monorepo Architecture", // human-readable label (shown in graph)
  "importance": 0.85,
  "source": "add_knowledge"
}
```

Knowledge graph
Semantic memories form a knowledge graph. Engram automatically detects overlapping topics between memories and creates edges — so recalling "Turborepo" can also surface memories about "pnpm" and "monorepo structure" even if the query didn't mention them.
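One way to picture the edge-creation step. This sketch detects topic overlap by shared keywords; Engram's actual implementation may compare embeddings instead, so treat `tokens`, `detectEdges`, and the `minShared` threshold as hypothetical:

```typescript
interface SemanticMemory {
  id: string;
  content: string;
  concept: string;
}

// Hypothetical tokenizer: lowercase words longer than 3 characters.
function tokens(text: string): Set<string> {
  return new Set(text.toLowerCase().split(/\W+/).filter((t) => t.length > 3));
}

// Create an edge between two memories when they share enough tokens.
function detectEdges(memories: SemanticMemory[], minShared = 1): [string, string][] {
  const edges: [string, string][] = [];
  for (let i = 0; i < memories.length; i++) {
    for (let j = i + 1; j < memories.length; j++) {
      const a = tokens(memories[i].content);
      const shared = [...tokens(memories[j].content)].filter((t) => a.has(t));
      if (shared.length >= minShared) edges.push([memories[i].id, memories[j].id]);
    }
  }
  return edges;
}
```

With overlap-based edges in place, a recall for "Turborepo" can walk one hop out and surface the connected pnpm and monorepo memories.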
● Procedural Memory
Procedural memories are trigger → action patterns — learned behaviours and workflows that should be applied when certain conditions are met.
Examples:
- When schema changes → run `drizzle-kit generate` then `drizzle-kit migrate`
- When a port conflict occurs → `fuser -k PORT/tcp` then restart
- When writing a new feature → always write tests first
- When embedding a document → tokenize → ONNX inference → FP16 compress → store
Schema
```jsonc
{
  "type": "procedural",
  "content": "When schema changes → run drizzle-kit generate then drizzle-kit migrate",
  "concept": "Migration pattern", // short label for the skill/pattern
  "importance": 0.80,
  "source": "claude-code"
}
```

How procedural recall works
When the AI sends a query about how to do something, Engram matches procedural memories using intent detection. The recall pipeline embeds the query, performs HNSW vector search, then scores procedural results with an additional pattern-match bonus for trigger-like queries.
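A sketch of that scoring step. The trigger-word list and the flat 0.15 bonus are assumptions made for illustration; Engram's real intent detection is not specified here:

```typescript
// Assumed heuristic: queries containing "how-to"-style words are
// treated as trigger-like and earn a pattern-match bonus.
const TRIGGER_WORDS = ["how", "when", "steps", "fix", "run", "workflow"];

function looksLikeTrigger(query: string): boolean {
  const q = query.toLowerCase();
  return TRIGGER_WORDS.some((w) => q.includes(w));
}

// Base score comes from HNSW vector similarity; trigger-like queries
// get an additional flat bonus (0.15 is an illustrative value).
function proceduralScore(similarity: number, query: string): number {
  return looksLikeTrigger(query) ? similarity + 0.15 : similarity;
}
```

The effect is that "how do I run migrations?" outranks a purely descriptive query at equal vector similarity, which biases procedural memories toward actionable questions.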
Importance scores
Every memory has an importance value between 0.0 and 1.0. It influences:
- Recall ranking — higher importance memories rank higher when similarity is equal
- Decay rate — high importance memories decay more slowly under the forgetting curve
- Pruning — when `maxMemories` is reached, low-importance memories are pruned first
- Visual size — in the 3D dashboard, neuron size corresponds to importance
| Range | Effective Half-Life | Guideline |
|---|---|---|
| 0.9 – 1.0 | ~28 days | Critical facts — protected from auto-decay if semantic |
| 0.7 – 0.9 | ~14 days | Important knowledge used frequently |
| 0.5 – 0.7 | ~7 days | Useful context, referenced occasionally |
| 0.3 – 0.5 | ~3 days | Low-value or highly specific information |
| 0.0 – 0.3 | ~1 day | Ephemeral — will decay quickly and be archived first |
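The pruning rule from the list above can be sketched in a few lines. The `Memory` shape and function name are illustrative, not Engram's API:

```typescript
interface Memory {
  id: string;
  importance: number; // 0.0 – 1.0
}

// Hypothetical pruning pass: once the store exceeds maxMemories,
// keep the highest-importance entries and drop the rest first.
function prune(memories: Memory[], maxMemories: number): Memory[] {
  if (memories.length <= maxMemories) return memories;
  return [...memories]
    .sort((a, b) => b.importance - a.importance)
    .slice(0, maxMemories);
}
```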
Decay mechanics
Engram uses an Ebbinghaus forgetting curve to compute a retention score for each memory: importance × recencyFactor × accessFactor. When this score drops below the archive threshold (default 0.05), the memory is soft-deleted.
Between sweeps, importance itself is progressively reduced at 1% per day without access, with a floor of 0.05. Memories that are accessed frequently resist decay — each recall boosts importance by 2%.
A memory can be marked "pinned" or "protected" to exempt it from decay entirely. Semantic memories with importance ≥ 0.8 are also protected by default.
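Putting the decay mechanics together in one sketch. The text defines retention as importance × recencyFactor × accessFactor, the 0.05 archive threshold, the 1%/day decay with a 0.05 floor, and the 2% recall boost; the exact shapes of `recencyFactor` and `accessFactor` below are assumptions:

```typescript
interface DecayState {
  importance: number;     // 0.0 – 1.0
  daysSinceAccess: number;
  accessCount: number;
  pinned?: boolean;       // pinned/protected memories skip decay entirely
}

const ARCHIVE_THRESHOLD = 0.05; // default from the text
const IMPORTANCE_FLOOR = 0.05;

// retention = importance × recencyFactor × accessFactor (factor shapes assumed)
function retention(m: DecayState): number {
  const recencyFactor = Math.exp(-m.daysSinceAccess / 14);          // assumed
  const accessFactor = Math.min(1, 0.5 + 0.1 * m.accessCount);      // assumed
  return m.importance * recencyFactor * accessFactor;
}

// Between sweeps: importance loses 1% per day without access, floored at 0.05.
function dailyDecay(importance: number): number {
  return Math.max(IMPORTANCE_FLOOR, importance * 0.99);
}

// Each recall boosts importance by 2% (capped at 1.0).
function recallBoost(importance: number): number {
  return Math.min(1, importance * 1.02);
}

function shouldArchive(m: DecayState): boolean {
  if (m.pinned) return false;
  return retention(m) < ARCHIVE_THRESHOLD;
}
```

Under these assumed factors, a stale low-importance memory falls below the threshold and is soft-deleted, while a frequently recalled or pinned memory survives indefinitely.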