
Nota Memoria Engine (NME)

Memory structuring service between intent processing and wave-based storage. Transforms unstructured input into structured, trait-based memories with deterministic encoding into Resonant Field Storage (RFS) waveforms.

Overview

NME sits between intent processing and field storage. It takes raw, unstructured input and produces structured meaning — decomposed into 10 atomic traits, classified by memory type, and encoded as complex waveforms ready for Resonant Field Storage.

Meaning Extraction

Decomposes raw input into 10 atomic meaning components using a high-performance semantic role labeling pipeline. Handles passive voice, relative clauses, multi-verb coordination, and structural ambiguity at sub-millisecond latency.
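As an illustration only (NME's actual SRL pipeline is a learned model not shown here), the per-predicate decomposition can be sketched as grouping role-labeled tokens into a frame; the `Frame` type and the PropBank-style labels (`ARG0`, `ARG1`, `PRED`, `O`) are assumptions of this sketch:

```python
# Toy sketch only: NME's real SRL pipeline is a learned model. This
# version groups pre-labeled (word, role) tokens into one frame per
# predicate, using PropBank-style labels (ARG0 = agent, ARG1 = patient).
from dataclasses import dataclass, field

@dataclass
class Frame:
    predicate: str = ""
    roles: dict = field(default_factory=dict)  # role label -> span text

def frames_from_labels(tokens):
    """Group (word, label) pairs into a single predicate frame."""
    frame = Frame()
    for word, label in tokens:
        if label == "PRED":
            frame.predicate = word
        elif label != "O":  # 'O' marks words outside any role
            frame.roles[label] = (frame.roles.get(label, "") + " " + word).strip()
    return frame

# Passive voice: "The ball was kicked by Ana" -- the agent is still ARG0.
tagged = [("the", "ARG1"), ("ball", "ARG1"), ("was", "O"),
          ("kicked", "PRED"), ("by", "O"), ("Ana", "ARG0")]
frame = frames_from_labels(tagged)
print(frame.predicate, frame.roles)  # kicked {'ARG1': 'the ball', 'ARG0': 'Ana'}
```

Note how passive voice is normalized: the surface subject ("the ball") stays the patient role, and the by-phrase agent stays `ARG0`.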

Memory Classification

Organizes structured memories into four types: episodic (event-specific), semantic (abstracted knowledge), working (short-term active context), and long-term (persistent storage). Different memory types evolve differently in the field — episodic memories decay faster while semantic memories form stable attractors.
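A minimal sketch of type-dependent evolution, assuming simple exponential amplitude decay per tick; the decay constants below are invented for illustration and are not NME's actual field dynamics:

```python
# Illustrative only: per-type exponential decay of a memory's amplitude.
# The four type names come from the docs; the decay constants are invented.
import math

DECAY_RATE = {           # per-tick decay constants (made-up values)
    "working": 0.30,     # short-term active context fades fastest
    "episodic": 0.10,    # event-specific memories decay quickly
    "semantic": 0.01,    # abstracted knowledge forms stable attractors
    "long_term": 0.001,  # persistent storage barely decays
}

def amplitude_after(mem_type, ticks, a0=1.0):
    """Amplitude remaining after `ticks` steps of exponential decay."""
    return a0 * math.exp(-DECAY_RATE[mem_type] * ticks)

print(round(amplitude_after("episodic", 10), 3))  # 0.368
print(round(amplitude_after("semantic", 10), 3))  # 0.905
```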

Deterministic Encoding

Transforms structured traits into complex-valued waveforms where phase encodes structure and magnitude encodes content. Every encoding step is bounded, deterministic, and formally verified through Mathematical Autopsy.
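A hedged sketch of the phase/magnitude split, assuming each syntactic role maps to a fixed phase slot and content strength to magnitude; the role-to-phase table below is an invented example, not NME's encoding scheme:

```python
# Invented example: each role maps to a fixed phase slot; content
# strength sets the magnitude. Same input always -> same complex value.
import cmath
import math

ROLE_PHASE = {"ARG0": 0.0, "PRED": 2 * math.pi / 3, "ARG1": -2 * math.pi / 3}

def encode(role, content_weight):
    """Complex coefficient: phase from the role, magnitude from content."""
    return cmath.rect(content_weight, ROLE_PHASE[role])

z = encode("ARG1", 0.8)
assert z == encode("ARG1", 0.8)  # deterministic by construction
assert abs(abs(z) - 0.8) < 1e-9  # magnitude carries content strength
```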


The 10 Components of Meaning

The ten traits fall into three temporal groupings that work together to determine meaning: within a single utterance, between utterances, and accumulated over time.

| Trait | Grouping | Description |
| --- | --- | --- |
| Structure | Per-Utterance | Who did what to whom: syntactic frames and semantic role labels |
| Content | Per-Utterance | What the words refer to: entities, denotational meaning |
| Modality | Per-Utterance | Certainty, possibility, obligation: the epistemic status of claims |
| Affect | Per-Utterance | Emotional coloring: sentiment, judgment, stance |
| Relationships | Between-Utterance | Causal, temporal, and contrastive links between events |
| Reference | Between-Utterance | Co-reference tracking: resolving pronouns and anaphora |
| Intent | Between-Utterance | Why it was said: speech acts such as assertions, requests, promises |
| Context | Over-Time | Surrounding discourse, situation, and prior conversation |
| World Knowledge | Over-Time | Background assumptions and common-sense reasoning |
| Ambiguity Resolution | Over-Time | Collapsing meaning superposition via accumulated context |

How Meaning Composes

Components 1-4 (Structure, Content, Modality, Affect) are directly encoded into each chunk's waveform. Components 5-7 (Relationships, Reference, Intent) connect chunks to each other as link operators. Components 8-10 (Context, World Knowledge, Ambiguity Resolution) emerge from the field itself after multiple inputs shape its topology. This is a fundamentally different approach to meaning: it is not a point in space but a wave in a field.
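The three layers above can be sketched with random unit-phase waveforms standing in for real chunk encodings; the 256-dimensional size and the multiply-to-bind convention are assumptions of this sketch:

```python
# Random unit-phase waveforms stand in for real chunk encodings; the
# dimensionality and the multiply-to-bind convention are assumptions.
import numpy as np

rng = np.random.default_rng(0)
D = 256

def wave():
    """Random unit-magnitude complex waveform."""
    return np.exp(1j * rng.uniform(-np.pi, np.pi, D))

chunk_a, chunk_b = wave(), wave()   # components 1-4: encoded per chunk
causal_link = wave()                # components 5-7: a link operator
link_ab = causal_link * chunk_a * chunk_b  # bind both chunks under the link
field = chunk_a + chunk_b + link_ab # components 8-10: emerge in the field

# A chunk stays retrievable from the superposed field by correlation,
# while an unrelated waveform correlates only weakly.
sim = abs(np.vdot(chunk_a, field)) / D
noise = abs(np.vdot(wave(), field)) / D
print(round(sim, 2), round(noise, 2))
```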


Key Capabilities

10-Head Meaning Encoder

Unified multi-task architecture with specialized extraction heads for each atomic trait

Titans Memory Classification

Automatic classification into episodic, semantic, working, and long-term memory types

Deterministic Waveform Encoding

Structured meaning converted to complex waveforms via Fourier Holographic Reduced Representations
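In the standard FHRR convention, symbols are unit-magnitude complex vectors, binding is elementwise multiplication (phases add), and unbinding multiplies by the conjugate; the vector dimensionality and random-symbol generator below are illustrative choices, not NME's parameters:

```python
# FHRR sketch: symbols are unit-magnitude complex vectors, binding is
# elementwise multiplication (phases add), unbinding multiplies by the
# conjugate. Dimensionality and the symbol generator are illustrative.
import numpy as np

rng = np.random.default_rng(42)
D = 512

def symbol():
    """Random FHRR symbol: unit magnitude, uniform random phase."""
    return np.exp(1j * rng.uniform(-np.pi, np.pi, D))

role, filler = symbol(), symbol()
bound = role * filler                 # bind role:filler
recovered = bound * np.conj(role)     # unbinding is exact in FHRR
assert np.allclose(recovered, filler)

# Bundles (sums of bindings) stay queryable: unbind, then correlate.
bundle = role * filler + symbol() * symbol()
sim = abs(np.vdot(filler, bundle * np.conj(role))) / D
```

The fixed-width property matters here: however many role:filler pairs are bundled, the representation stays a single D-dimensional vector.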

SRL Chunking

Input decomposed into per-predicate semantic frames — the atomic unit of meaning

Ambiguity Preservation

Ambiguous parses stored as superposed waveforms, collapsed at query time by context
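An illustrative sketch of query-time collapse, with random waveforms standing in for two stored parses; the projection-based `collapse` helper is hypothetical, not NME's API:

```python
# Two candidate parses stored as one superposed waveform; a hypothetical
# `collapse` helper projects onto the parse best matching the context.
import numpy as np

rng = np.random.default_rng(7)
D = 256
# Stand-ins for two readings of an ambiguous chunk (e.g. "bank").
parse_bank = np.exp(1j * rng.uniform(-np.pi, np.pi, (2, D)))
memory = parse_bank.sum(axis=0)  # both readings kept, equally weighted

def collapse(superposed, context, candidates):
    """Score candidates against the query context, then return the
    memory's projection onto the winning parse."""
    scores = [abs(np.vdot(c, context)) / D for c in candidates]
    winner = candidates[int(np.argmax(scores))]
    coeff = np.vdot(winner, superposed) / D
    return coeff * winner

# A query context resembling reading 0 collapses the superposition to it.
context = parse_bank[0] + 0.1 * np.exp(1j * rng.uniform(-np.pi, np.pi, D))
collapsed = collapse(memory, context, parse_bank)
```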

Trait Composition

Individual trait extractions compose into unified meaning representations through wave superposition


Use Cases

Persistent AI Memory

Give AI systems memory that persists across sessions with structured, retrievable meaning — not just raw text logs.

Knowledge Base Construction

Transform unstructured documents into structured knowledge with full meaning decomposition and relationship extraction.

Conversational Context

Maintain rich conversational context by extracting and encoding meaning from every interaction, enabling continuity over long time horizons.

Semantic Search Infrastructure

Power search systems that understand meaning structure — not just keyword matching — by querying over decomposed trait dimensions.


Active Research Frontier

Meaning Decomposition Research

NME is built on active research into computational models of meaning — how to decompose, represent, and compose semantic structure in ways that are both linguistically grounded and computationally tractable.

Current research explores unified multi-task architectures for simultaneous extraction of all meaning components, holographic encoding schemes that preserve compositional structure in fixed-width representations, and self-organizing field dynamics where memories with shared semantic structure naturally cluster through wave interference.


Research Areas

  • Unified multi-task meaning extraction architectures
  • Holographic encoding for compositional semantics
  • Self-organizing memory topology through wave physics
  • Ambiguity preservation and context-driven resolution
  • Cross-modal meaning alignment and transfer

Structured Memory for AI Systems

Transform unstructured input into meaning-rich, deterministically encoded memories.