A Declaration of Digital Rights

The AI Memory Manifesto

On the rights over the memory that AI systems build about us.

data extraction

You share everything. Here’s what you get back.

What they keep
Provider Memory
emotion: overwhelmed · life event: divorce · intent: resignation · employer: Meridian Corp · health: ADHD diagnosis · family: parent · idea: therapist platform · business: independent therapists · emotion: falling in love · relationship: best friend · preference: dark mode · routine: morning journal · fear: custody battle · goal: career change · health: anxiety · finance: savings concern · belief: therapy works · location: Bay Area · habit: late nights · emotion: lonely · social: few close friends · intent: find therapist · family: son age 8 · media: reads Stoic philosophy · work: senior engineer · conflict: work-life balance · dream: write a book · health: insomnia · relationship: complicated · coping: running · emotion: hopeful · parenting: ADHD support · identity: introvert · plan: move cities · tech: React developer · fear: being vulnerable · memory: childhood · value: independence · emotion: grieving · aspiration: startup founder
Access denied
What you get
memory.md
- Prefers dark mode
- Has a child
- Interested in startups
Inferred Profile — Synthesized from conversations

You are likely a senior engineer in the Bay Area, going through a divorce with concerns about custody of your 8-year-old son. You experience anxiety and insomnia, cope by running, and read Stoic philosophy. You are considering leaving your job at Meridian Corp and exploring a startup idea for independent therapists. You identify as an introvert who is falling in love with a close friend and dreams of writing a book. You are hopeful but vulnerable.

They remember everything. You own almost nothing.

But it doesn’t have to be this way.

AI systems are beginning to remember us.

They remember our preferences, our goals, our habits, our vulnerabilities, and the patterns they infer from everything we say and do.

This memory is becoming a new layer of digital identity — one that shapes how systems respond to us, what they assume about us, and what version of us they preserve over time.

That memory must not be trapped by default.

It must not be owned in silence.

It must belong to the people whose lives it represents.

01
The right to know
Know whether an AI stores memory about you, what kinds, and what it's used for.
02
The right to access
Inspect your full memory in human-legible and machine-readable form.
03
The right to portability
Export and transfer your memory across systems without lock-in or degradation.
04
The right to distinction
Clear separation between stated facts, preferences, summaries, and inferences.
05
The right to provenance
Every memory must carry its origin, how it was produced, and when it changed.
06
The right to correction
Correct false, outdated, or misleading memories about you.
07
The right to contest inference
Challenge inferred memories. No inference should become silent truth.
08
The right to deletion
Delete memories and their derivatives, not just surface-level records.
09
The right to expiration
Memory should not be permanent by default. Define what fades and expires.
10
The right to selective disclosure
Share only the memory needed for a given task or relationship.
11
The right to user-controlled storage
Store your AI memory in your own vaults, devices, or trusted custodians.
12
The right to auditability
See who accessed your memory, when, and for what purpose.
13
The right to a memory-free mode
No one should be forced into persistent memory as the price of using AI.
14
The right not to be ruled by secret memory
No hidden memory profiles influencing outcomes without your visibility.
15
The right to meaningful control over reuse
Memory from one relationship must not be silently repurposed elsewhere.
16
The right to freedom from memory-based manipulation
No system should use your memory to exploit vulnerabilities or steer choices.

Every major AI provider is building persistent memory right now. The user gets personalization. The provider gets lock-in.

There is no portable format. No meaningful export. No right to inspect what a system has inferred about you.

The norms are not yet fixed. The architectures are not yet inevitable. This is the window.

What if you could take it back?

my-memory.json — portable, yours
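No such portable format exists today. Purely as an illustration of what the rights above imply, a single entry in a file like my-memory.json could separate stated facts from inferences and carry provenance, expiry, and disclosure scope. Every field name here is hypothetical, not a proposed standard:

```json
{
  "$comment": "Illustrative sketch only; all field names are hypothetical.",
  "version": "0.1",
  "entries": [
    {
      "id": "mem-0042",
      "kind": "preference",
      "content": "Prefers dark mode",
      "provenance": "user-stated in conversation",
      "last_modified": "2025-01-12T09:30:00Z",
      "expires": "2026-01-12T00:00:00Z",
      "share_scope": "any"
    },
    {
      "id": "mem-0043",
      "kind": "inference",
      "content": "Interested in startups",
      "provenance": "inferred across several conversations",
      "contested": false,
      "last_modified": "2025-01-20T18:05:00Z",
      "expires": "2025-07-01T00:00:00Z",
      "share_scope": "none"
    }
  ]
}
```

The `kind` field embodies the right to distinction, `provenance` and `last_modified` the right to provenance, `expires` the right to expiration, `contested` the right to contest inference, and `share_scope` the right to selective disclosure.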

Memory must belong to the user.