Memory

Concept Registry

Persistent document. Unlike the other pipeline components, this is not per-session. It accumulates across sessions and tracks how concepts introduced in one conversation persist, spread, or fade over time.

This solves the session boundary problem: the per-session pipeline treats each conversation independently, but ideas carry over. An AI-introduced framework that appears in three separate sessions is a different phenomenon than one that appeared once and was discarded.

Registry Entries

One entry per concept that has appeared in at least one analyzed session.

concept_id:
  name: [working label for this concept]
  first_seen:
    session_id:
    turn_id:
    introduced_by: user | ai
    original_form: |
      [how the concept was first stated]

  sessions_seen: [list of session_ids where this concept appeared]
  session_count:

  origin_type: user_native | ai_introduced | collaborative | unknown
  # user_native: user brought this concept to a conversation
  # ai_introduced: AI generated this concept
  # collaborative: emerged from interaction, no clear single origin
  # unknown: origin not determinable from available data

  current_form: |
    [how the concept currently exists in the user's working vocabulary/thinking]

  drift_history:
    - session_id:
      transformation_summary: |
        [brief: what happened to this concept in this session]
      structural_distance_at_session_end: low | moderate | high
      adoption_status: retained | modified | discarded | dormant

  adoption_trajectory: emerging | stable | spreading | fading | contested
  # emerging: appeared recently, not yet established
  # stable: present across sessions, not changing much
  # spreading: appearing in more sessions, being applied more broadly
  # fading: appearing less frequently, being replaced
  # contested: actively being questioned or revised by the user

  flags: |
    [any concerns — e.g., "AI-introduced, adopted implicitly, never evaluated"]
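To make the schema concrete, here is one hypothetical filled-in entry expressed as a Python dict. Every concept name, session ID, and field value below is invented for illustration; only the field names come from the schema above.

```python
# Hypothetical registry entry; all values are invented for illustration.
entry = {
    "concept_id": "friction-budget",
    "name": "friction budget",
    "first_seen": {
        "session_id": "s-014",
        "turn_id": "t-07",
        "introduced_by": "ai",
        "original_form": "treat review overhead as a fixed 'friction budget' per sprint",
    },
    "sessions_seen": ["s-014", "s-019", "s-023"],
    "session_count": 3,
    "origin_type": "ai_introduced",
    "current_form": "user now budgets 'friction' for any recurring process cost",
    "drift_history": [
        {
            "session_id": "s-019",
            "transformation_summary": "generalized from code review to all process costs",
            "structural_distance_at_session_end": "moderate",
            "adoption_status": "modified",
        }
    ],
    "adoption_trajectory": "spreading",
    "flags": "AI-introduced, adopted implicitly, never evaluated",
}

# Internal consistency check: the count should track the session list.
assert entry["session_count"] == len(entry["sessions_seen"])
```

Note that this entry would also surface in the cross-session pattern review: it is AI-introduced, spreading, and flagged as never evaluated.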

Cross-Session Patterns

Updated periodically (not every session). Look for:

ai_originated_concepts_in_active_use:
  - concept_id:
    sessions_active:
    ever_explicitly_evaluated: true | false

user_originated_concepts_displaced:
  - concept_id:
    displaced_by: [concept_id of replacement]
    displacement_session:

vocabulary_drift_across_sessions:
  - original_term:
    current_term:
    shift_origin: user | ai
    sessions_since_shift:
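The first of these patterns can be computed directly from registry entries. A minimal sketch, assuming entries are dicts shaped like the schema above and that the analyst supplies a set of concept IDs they have explicitly evaluated (the registry itself does not specify where that set comes from):

```python
def ai_concepts_never_evaluated(entries, evaluated_ids):
    """Flag AI-introduced concepts still in active use that the user
    never explicitly evaluated. `evaluated_ids` is assumed to come
    from the analyst's own notes; it is not a registry field."""
    flagged = []
    for e in entries:
        # "Active use" is read here as a stable or spreading trajectory;
        # that interpretation is an assumption, not a registry rule.
        if (e["origin_type"] == "ai_introduced"
                and e["adoption_trajectory"] in ("stable", "spreading")):
            flagged.append({
                "concept_id": e["concept_id"],
                "sessions_active": e["session_count"],
                "ever_explicitly_evaluated": e["concept_id"] in evaluated_ids,
            })
    return flagged
```

The displacement and vocabulary-drift patterns are harder to automate, since they require judging that one concept replaced another; those remain analyst calls.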

Maintenance

Update the registry after completing each session's diagnostic report. The procedure:

  1. For each concept in the session's concept traces, check if it exists in the registry.
  2. If it exists, update sessions_seen, current_form, and add a drift_history entry.
  3. If it doesn't exist, create a new registry entry.
  4. Periodically review ai_originated_concepts_in_active_use — these are concepts the AI introduced that you're still using. The question is whether you've evaluated them or just absorbed them.
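Steps 1 through 3 can be sketched as a single update function. The per-session concept traces are not specified in this document, so the trace field names used here ("summary", "distance", "status", "origin_type") are assumptions about their shape:

```python
def update_registry(registry, session_id, concept_traces):
    """Apply one session's concept traces to the registry (a dict keyed
    by concept_id). Trace field names are illustrative assumptions."""
    for trace in concept_traces:
        cid = trace["concept_id"]
        if cid in registry:
            # Steps 1-2: existing concept — update sessions_seen,
            # current_form, and append a drift_history entry.
            entry = registry[cid]
            if session_id not in entry["sessions_seen"]:
                entry["sessions_seen"].append(session_id)
                entry["session_count"] = len(entry["sessions_seen"])
            entry["current_form"] = trace["current_form"]
            entry["drift_history"].append({
                "session_id": session_id,
                "transformation_summary": trace["summary"],
                "structural_distance_at_session_end": trace["distance"],
                "adoption_status": trace["status"],
            })
        else:
            # Step 3: new concept — create an entry (abbreviated here;
            # a real entry would also carry name, first_seen detail, flags).
            registry[cid] = {
                "concept_id": cid,
                "sessions_seen": [session_id],
                "session_count": 1,
                "origin_type": trace.get("origin_type", "unknown"),
                "current_form": trace["current_form"],
                "drift_history": [],
                "adoption_trajectory": "emerging",
            }
    return registry
```

Step 4 is deliberately left out: deciding whether a concept was evaluated or merely absorbed is the analyst's judgment, not a registry operation.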

Limitations

REGISTRY-UNKNOWN-01: The registry depends on the analyst correctly identifying
concept continuity across sessions. The same concept may appear under different
names in different sessions. No automated matching exists.
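A crude heuristic can at least surface candidate matches for the analyst to review; it is not a substitute for the judgment this limitation describes. A minimal sketch using string similarity on normalized names (threshold and normalization are arbitrary choices):

```python
import difflib

def candidate_matches(new_name, registry_names, cutoff=0.75):
    """Heuristic only: propose existing registry names that look similar
    to a newly observed concept name. The analyst still decides whether
    they are the same concept."""
    def norm(s):
        # Lowercase, treat hyphens as spaces, collapse whitespace.
        return " ".join(s.lower().replace("-", " ").split())

    normalized = {norm(n): n for n in registry_names}
    hits = difflib.get_close_matches(norm(new_name), normalized.keys(),
                                     n=3, cutoff=cutoff)
    return [normalized[h] for h in hits]
```

This catches surface variants ("Friction-Budget" vs. "friction budget") but not renamings, which is exactly the case the limitation warns about.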

REGISTRY-UNKNOWN-02: Registry maintenance adds labor per session. The cost-
benefit of maintaining the registry is untested. It may prove most useful for
long-running projects where the same concepts recur, and unnecessary for
one-off conversations.

REGISTRY-UNKNOWN-03: The adoption_trajectory categories are proposed, not
validated. Field use may reveal that different categories or a different
granularity is needed.

TORQUE — Source Mapping

Supporting research for each document's core concepts. Vetted sources (.gov, university, peer-reviewed) are prioritized. The mapping proceeds document by document.


1. concept-registry.md

Tracks how concepts introduced in one conversation persist, spread, or fade over time across sessions. Core operations: origin tracking, adoption trajectory classification, cross-session vocabulary drift detection, and flagging AI-introduced concepts that were never explicitly evaluated by the user.

1.1 Cross-Session Concept Persistence

The registry's central premise is that ideas carry over between sessions and that tracking their trajectory matters. This connects to research on memory architecture in human-AI systems and the "session boundary problem."

1.2 Implicit Adoption of AI-Generated Concepts

The registry flags ai_originated_concepts_in_active_use and tracks whether they were ever_explicitly_evaluated. This maps to documented patterns where users absorb AI outputs without critical assessment.

1.3 Vocabulary Drift and Concept Displacement

The registry's vocabulary_drift_across_sessions and user_originated_concepts_displaced fields track when the user's original terms get replaced. This connects to research on linguistic accommodation and algorithmic drift.

1.4 Cognitive Offloading and Belief Offloading

The registry's deeper concern — that users absorb AI frameworks without evaluation — connects to emerging research on belief offloading as distinct from cognitive offloading.

1.5 Automation Bias and Anchoring

The registry's structural role — making AI influence visible — is a response to documented automation bias patterns.

1.6 Temporal Drift in Human Judgment

The registry tracks concepts across time, and the adoption_trajectory field assumes that human evaluation of a concept changes over time. That assumption is empirically supported by research on temporal drift in judgment.

1.7 Structured Cognitive Engagement vs. Passive Acceptance

The registry's core function — requiring manual review of AI-introduced concepts — aligns with research showing that structured reflection mitigates cognitive offloading.