The compliance challenge
An AI agent that remembers a user is, by legal definition under GDPR, processing personal data. The moment you store "User Alice prefers async Python and works in healthcare IT in Lyon," you hold a record about an identifiable individual. That triggers the full GDPR framework.
Most developers building agents in 2024–2026 reach for a managed memory API — Mem0, Zep, or similar — because running your own vector database is legitimate engineering overhead. The problem is that most of these services are US-hosted, which creates a data transfer compliance problem on top of the data processing one.
This article covers both: what GDPR specifically requires for agent memory, and how to implement it in a way that holds up to a DPA audit.
What GDPR requires for AI agent memory
Art. 5 — Data minimization and purpose limitation
Article 5 requires that personal data be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed" (the data minimization principle) and collected "for specified, explicit and legitimate purposes."
What this means for agent memory:
- You cannot store everything a user says and call it "memory." Each stored memory needs a legitimate purpose.
- LLM-based auto-extraction approaches (like Mem0's default mode) are harder to defend under Art. 5 because the extraction logic is opaque — you may end up storing sensitive data you didn't intend to store.
- Explicit `remember()` calls where your application decides what to store are more defensible: you can document exactly what categories of data are stored and why.
Practical implementation:
# Defensible: explicit storage with documented purpose
kv.remember(
    agent_id="support-agent",
    content="User's preferred contact language: French",  # purpose: localization
    user_id="user-123",
)
# Harder to defend under Art. 5: storing raw conversation turns verbatim
# Auto-extraction may inadvertently capture Art. 9 special category data
# (health, political opinions, etc.)
Art. 17 — Right to erasure (right to be forgotten)
Users have the right to request deletion of all personal data you hold about them. For agent memory, this means: when a user asks to be forgotten, every stored memory associated with their identity must be deleted, and the deletion must be verifiable.
What this requires technically:
- A `user_id` attached to every stored memory (so you can find them all)
- A deletion endpoint that removes all records for that `user_id`
- A response that confirms how many records were deleted (for your compliance log)
import httpx
from datetime import datetime, timezone

def handle_erasure_request(user_id: str, api_key: str) -> dict:
    """
    GDPR Art. 17 — handle a user's right to erasure request.
    Call this when a user submits a deletion request through your app.
    Log the response for your compliance records.
    """
    # httpx's delete() shortcut accepts no request body, so use request()
    response = httpx.request(
        "DELETE",
        "https://api.kronvex.io/api/v1/gdpr/erasure",
        headers={"X-API-Key": api_key},
        json={"user_id": user_id},
    )
    response.raise_for_status()
    result = response.json()
    # result = {"deleted_memories": 42, "user_id": "user-123", "status": "erased"}

    # Log for your compliance audit trail (compliance_log stands in for
    # your app's audit logger — any append-only store works)
    compliance_log.write({
        "event": "gdpr_erasure",
        "user_id": user_id,
        "deleted_records": result["deleted_memories"],
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "confirmed": result["status"] == "erased",
    })
    return result
Art. 20 — Right to data portability
Users have the right to receive their personal data "in a structured, commonly used and machine-readable format" and to transmit it to another controller. For agent memory, this means you must be able to export all memories for a given user.
import httpx

def handle_portability_request(user_id: str, api_key: str) -> dict:
    """
    GDPR Art. 20 — export all user data in structured format.
    Return this JSON to the user or transmit to another controller.
    """
    response = httpx.get(
        "https://api.kronvex.io/api/v1/gdpr/export",
        headers={"X-API-Key": api_key},
        params={"user_id": user_id},
    )
    response.raise_for_status()
    export_data = response.json()
    # export_data = {
    #     "user_id": "user-123",
    #     "exported_at": "2026-04-06T10:00:00Z",
    #     "memories": [
    #         {"id": "mem-abc", "content": "...", "created_at": "...", "agent_id": "..."},
    #         ...
    #     ]
    # }
    return export_data
Art. 28 — Data Processing Agreement
Before any third-party service processes personal data on your behalf, you must have a Data Processing Agreement in place. This is non-negotiable under GDPR. For every memory API you use, you need to check:
- Do they publish or provide a DPA?
- Where is the data hosted (and does that create a third-country transfer issue)?
- Do they specify sub-processors?
- Do they offer deletion and portability mechanisms?
Compliance gap: If you cannot answer all four questions affirmatively for your memory provider, you have an Art. 28 compliance gap that should be resolved before going to production with personal data.
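The four questions above can be captured as a due-diligence record in code, so the assessment lives in your repository next to the integration it covers. This is an illustrative sketch — the class and field names are my own, not a standard:

```python
from dataclasses import dataclass

@dataclass
class MemoryProviderAssessment:
    """Art. 28 due-diligence record for a memory API vendor (illustrative)."""
    provider: str
    has_dpa: bool                # Do they publish or provide a DPA?
    eu_hosted: bool              # EU-hosted, no third-country transfer?
    subprocessors_listed: bool   # Sub-processors specified in the DPA?
    erasure_and_export: bool     # Art. 17 / Art. 20 mechanisms offered?

    def compliant(self) -> bool:
        # All four questions must be answered affirmatively
        return all([self.has_dpa, self.eu_hosted,
                    self.subprocessors_listed, self.erasure_and_export])

assessment = MemoryProviderAssessment(
    provider="example-memory-api",
    has_dpa=True,
    eu_hosted=False,
    subprocessors_listed=True,
    erasure_and_export=True,
)
print(assessment.compliant())  # False — the hosting gap blocks production use
```

A single `False` field is enough to flag the Art. 28 gap before personal data flows.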
Why US-hosted memory APIs create problems
The Schrems II problem
In July 2020, the CJEU invalidated the EU-US Privacy Shield framework (Schrems II, C-311/18), finding that US surveillance law does not provide "essentially equivalent" protection to EU data protection rights. The ruling specifically cited Section 702 FISA and Executive Order 12333 as incompatible with GDPR.
Standard Contractual Clauses (SCCs) remain a valid transfer mechanism, but only with a Transfer Impact Assessment (TIA) showing that the SCCs provide effective protection in practice. For US cloud services subject to the CLOUD Act and FISA Section 702, TIAs are increasingly difficult to sign off on — especially after several EU DPAs (CNIL, Garante, BfDI) have found that SCCs are insufficient without technical supplementary measures.
What this means for memory APIs: If your memory API is US-hosted, you have an international data transfer to analyze. If you're a solo developer selling to consumers, this may not matter today. If you're a European SaaS company with enterprise clients, your procurement process will encounter this.
The CLOUD Act risk
The US Clarifying Lawful Overseas Use of Data (CLOUD) Act allows US law enforcement to compel US companies to produce data stored overseas. A US-incorporated company running servers in Germany can still be compelled to disclose data under the CLOUD Act. This is not hypothetical — it's been the subject of DPA guidance in multiple EU member states.
Critical distinction: "EU region" ≠ "EU-safe" if the company is US-incorporated. You need data hosted by an EU-incorporated entity under EU jurisdiction.
Practical advice
The safest path for regulated EU use cases:
- Use a memory provider that is EU-incorporated and EU-hosted (not just an EU region of a US company)
- Obtain a signed DPA before going to production
- Implement Art. 17 and Art. 20 endpoints in your application layer
- Document your lawful basis for processing (usually legitimate interests or contract performance)
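The last point — documenting lawful basis — can be enforced at the storage boundary. A minimal sketch, assuming a hypothetical purpose registry (category names, purposes, and bases below are illustrative; yours come from your own Art. 30 records of processing):

```python
# Hypothetical registry mapping memory categories to documented purposes.
# All keys and values here are examples, not a prescribed schema.
MEMORY_PURPOSES = {
    "contact_language": {"purpose": "localization", "lawful_basis": "contract"},
    "product_context": {"purpose": "support quality", "lawful_basis": "legitimate_interests"},
}

def assert_documented_purpose(category: str) -> dict:
    """Refuse to store memory in a category with no documented purpose (Art. 5)."""
    if category not in MEMORY_PURPOSES:
        raise ValueError(f"no documented purpose for memory category '{category}'")
    return MEMORY_PURPOSES[category]
```

Calling this before every store operation turns your Art. 5 documentation into a runtime guarantee: undocumented categories fail loudly instead of silently accumulating.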
Technical implementation with Kronvex
Kronvex runs on Supabase Frankfurt (EU), with Art. 17 and Art. 20 endpoints built in. Here's a complete GDPR-compliant implementation pattern.
Store memories with user_id scoping
from kronvex import KronvexClient
from datetime import datetime

kv = KronvexClient(api_key="kv-your-key", base_url="https://api.kronvex.io")
AGENT_ID = "your-agent"

def store_user_memory(user_id: str, content: str, session_id: str | None = None):
    """
    Store a memory with user_id for GDPR traceability.
    Only store what has a documented purpose.
    """
    kv.remember(
        agent_id=AGENT_ID,
        content=content,
        user_id=user_id,
        session_id=session_id,
    )
Recall with user isolation
def get_relevant_context(user_id: str, query: str) -> str:
    """
    Recall memories scoped to this user only.
    No cross-user data leakage by default.
    """
    context = kv.inject_context(
        agent_id=AGENT_ID,
        query=query,
        user_id=user_id,
    )
    return context.get("context", "")
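Recalled context typically feeds the agent's system prompt. A small sketch of that wiring — the prompt wording is illustrative, and the function takes the context string directly so it works with any recall call:

```python
def build_system_prompt(context: str) -> str:
    """Compose the agent system prompt from recalled user context (wording illustrative)."""
    base = "You are a support agent."
    if context:
        return base + "\n\nKnown user context:\n" + context
    return base

# Typical call: build_system_prompt(get_relevant_context(user_id, query))
print(build_system_prompt("User's preferred contact language: French"))
```

Keeping the empty-context branch explicit means a memory outage or a brand-new user degrades to a plain prompt rather than an error.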
Art. 17 erasure handler
import httpx
import logging
from datetime import datetime, timezone

logger = logging.getLogger(__name__)

def process_erasure_request(user_id: str) -> bool:
    """Call when user exercises Art. 17 right to erasure."""
    try:
        # httpx's delete() shortcut accepts no request body, so use request()
        response = httpx.request(
            "DELETE",
            "https://api.kronvex.io/api/v1/gdpr/erasure",
            headers={"X-API-Key": "kv-your-key"},
            json={"user_id": user_id},
            timeout=30,
        )
        response.raise_for_status()
        result = response.json()
        logger.info(
            "GDPR erasure completed",
            extra={
                "user_id": user_id,
                "deleted_memories": result["deleted_memories"],
                "timestamp": datetime.now(timezone.utc).isoformat(),
            },
        )
        return True
    except httpx.HTTPError as e:
        logger.error(f"GDPR erasure failed for {user_id}: {e}")
        raise
Art. 20 portability handler
def export_user_memories(user_id: str) -> dict:
    """Return structured export of all user memories."""
    response = httpx.get(
        "https://api.kronvex.io/api/v1/gdpr/export",
        headers={"X-API-Key": "kv-your-key"},
        params={"user_id": user_id},
    )
    response.raise_for_status()
    return response.json()
Sector-specific notes
Healthcare (HDS in France and equivalent national rules)
Health data is an Art. 9 special category. If your agent processes or stores any health-related information — symptoms, diagnoses, medications, appointments — you need explicit consent as the lawful basis (not legitimate interests), and your memory provider must either be HDS-certified (France) or meet equivalent requirements.
Key considerations:
- Never store raw health conversations verbatim. Extract only the minimum necessary.
- Your DPA must explicitly cover special category data processing.
- Encryption at rest and in transit is a baseline requirement, not optional.
- In France, hosting personal health data requires HDS certification. Cloud services operating outside this framework are not usable for health data, regardless of GDPR compliance.
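The "extract only the minimum necessary" point can be backed by a pre-storage screen. The sketch below is a deliberately naive illustration — a keyword list is nowhere near sufficient for real Art. 9 classification, and the terms shown are examples, not an exhaustive set:

```python
# Naive illustration only: production systems need a proper special-category
# classifier, not a keyword list. Terms are examples.
SPECIAL_CATEGORY_TERMS = {"diagnosis", "medication", "symptom", "prescription"}

def screen_before_store(content: str) -> bool:
    """Return True if content looks safe to store without explicit consent."""
    lowered = content.lower()
    return not any(term in lowered for term in SPECIAL_CATEGORY_TERMS)

print(screen_before_store("Preferred contact language: French"))  # True
print(screen_before_store("User mentioned a new medication"))     # False
```

The useful property is the default: anything that trips the screen is dropped (or routed to a consent flow) rather than silently persisted.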
Fintech (DORA)
The Digital Operational Resilience Act (DORA) applies from 17 January 2025 to EU financial entities. Key implications for AI agent memory:
- ICT service providers used in critical functions must appear in your register of ICT third-party providers.
- Contractual arrangements must include specific provisions about data location, audit rights, and exit plans.
- You must be able to demonstrate operational resilience, including what happens if your memory provider has an outage.
DORA practical advice: For production fintech deployments, ensure your memory provider can provide SLA documentation and accepts DORA-compliant contractual clauses in their DPA.
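On the "what happens if your memory provider has an outage" point, the agent itself should degrade gracefully. A minimal sketch of that pattern — `recall_fn` stands in for whatever recall call your stack uses:

```python
def recall_with_fallback(recall_fn, *args, **kwargs) -> str:
    """
    Operational-resilience sketch: if the memory provider is unreachable,
    the agent continues without memory context instead of failing outright.
    In production, narrow the except clause to your client's error types
    (e.g. httpx.HTTPError, httpx.TimeoutException).
    """
    try:
        return recall_fn(*args, **kwargs)
    except Exception:
        return ""  # empty context: the agent still answers this turn, just without memory
```

Pairing this with a logged metric on fallback frequency gives you the evidence DORA asks for: you can show what the dependency failure looks like and that the service survives it.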
Summary: compliance checklist
Before going to production with agent memory:
- Documented lawful basis for storing each category of memory data
- A `user_id` attached to every stored memory (Art. 17 prerequisite)
- Signed DPA with your memory provider
- Confirmed data residency: EU-hosted, EU-incorporated entity
- Art. 17 erasure endpoint implemented and tested
- Art. 20 portability endpoint implemented and tested
- Deletion requests logged for compliance audit trail
- No special category data (Art. 9) stored without explicit consent
- For healthcare: HDS certification confirmed or alternative arrangement documented
- For fintech: memory provider in ICT third-party register, DORA-compliant contract terms
Get a demo key
Kronvex provides EU-hosted agent memory with GDPR endpoints built in. Demo key — no credit card:
curl -X POST https://api.kronvex.io/auth/demo
# Returns: {"api_key": "kv-demo-..."}
DPA available on request. Full documentation: https://kronvex.io/docs