A novel agentic memory system that enables LLM agents to dynamically organize and evolve their memories.
Large Language Model (LLM) agents have demonstrated remarkable capabilities in handling complex real-world tasks through external tool usage. However, to effectively leverage historical experiences, they require sophisticated memory systems. Traditional memory systems, while providing basic storage and retrieval functionality, often lack advanced memory organization capabilities.
Our project introduces an innovative Agentic Memory system that revolutionizes how LLM agents manage and utilize their memories:
Comparison between traditional memory system (top) and our proposed agentic memory (bottom). Our system enables dynamic memory operations and flexible agent-memory interactions.
Note: This repository provides a memory system to facilitate agent construction. If you want to reproduce the results presented in our paper, please refer to: https://github.com/WujiangXu/AgenticMemory
For more details, please refer to our paper: A-MEM: Agentic Memory for LLM Agents
- Dynamic memory organization based on Zettelkasten principles
- Intelligent indexing and linking of memories via ChromaDB
- Comprehensive note generation with structured attributes
- Interconnected knowledge networks
- Continuous memory evolution and refinement
- Agent-driven decision making for adaptive memory management
The framework of our Agentic Memory system showing the dynamic interaction between LLM agents and memory components.
When a new memory is added to the system:
- LLM Analysis: Automatically analyzes content to generate keywords, context, and tags (if not provided)
- Enhanced Embedding: Creates vector embeddings using both content and generated metadata for superior retrieval
- Semantic Storage: Stores memories in ChromaDB with rich semantic information
- Relationship Analysis: Analyzes historical memories for relevant connections using enhanced embeddings
- Dynamic Linking: Establishes meaningful links based on content and metadata similarities
- Memory Evolution: Enables continuous memory evolution and updates through intelligent analysis
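The steps above can be sketched in miniature. The toy sketch below is only an illustration of the pipeline's shape, not the library's API: the real system calls an LLM for analysis and stores ChromaDB vector embeddings, whereas here "analysis" is a simple word filter and the "embedding" is just a token set. All names (`Note`, `analyze`, `add_note`) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Note:
    content: str
    keywords: list
    context: str
    tags: list
    links: list = field(default_factory=list)

def analyze(content):
    # Stand-in for the LLM analysis step: derive keywords, context, and tags.
    words = [w.strip(".,").lower() for w in content.split()]
    keywords = [w for w in words if len(w) > 6]
    return keywords, "auto-generated context", ["auto-tag"]

def add_note(store, content):
    keywords, context, tags = analyze(content)                # 1. LLM analysis
    note = Note(content, keywords, context, tags)
    embedding = set(keywords) | set(content.lower().split())  # 2. toy "embedding": a token set
    note_id = len(store)
    for other_id, (other, other_emb) in store.items():        # 4. relationship analysis
        if embedding & other_emb:                             # 5. dynamic linking
            note.links.append(other_id)
            other.links.append(note_id)                       # 6. existing memories evolve too
    store[note_id] = (note, embedding)                        # 3. semantic storage
    return note_id

store = {}
a = add_note(store, "Neural networks process complex datasets")
b = add_note(store, "Convolutional networks classify images")
print(store[a][0].links, store[b][0].links)  # the two notes link to each other
```

Note that linking runs in both directions: adding the second note also updates the first, which is the essence of the memory-evolution step.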
Empirical experiments on six foundation models demonstrate superior performance over existing state-of-the-art (SOTA) baselines.
- Clone the repository:
```bash
git clone https://github.com/agiresearch/A-mem.git
cd A-mem
```
- Install dependencies. First, create and activate a virtual environment (recommended):
```bash
python -m venv .venv
source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate
```
Then install the package:
```bash
pip install .
```
For development, install it in editable mode instead:
```bash
pip install -e .
```
- Usage Examples
Here's how to use the Agentic Memory system for basic operations:
```python
from agentic_memory.memory_system import AgenticMemorySystem

# Initialize the memory system
memory_system = AgenticMemorySystem(
    model_name='all-MiniLM-L6-v2',  # Embedding model for ChromaDB
    llm_backend="openai",           # LLM backend (openai/ollama)
    llm_model="gpt-4o-mini"         # LLM model name
)

# Add memories with automatic LLM analysis
# Simple addition - the LLM automatically generates keywords, context, and tags
memory_id1 = memory_system.add_note(
    "Machine learning algorithms use neural networks to process complex datasets and identify patterns."
)

# Check the automatically generated metadata
memory = memory_system.read(memory_id1)
print(f"Content: {memory.content}")
print(f"Auto-generated Keywords: {memory.keywords}")  # e.g., ['machine learning', 'neural networks', 'datasets']
print(f"Auto-generated Context: {memory.context}")    # e.g., "Discussion about ML algorithms and data processing"
print(f"Auto-generated Tags: {memory.tags}")          # e.g., ['artificial intelligence', 'data science', 'technology']

# Partial metadata provision - the LLM fills in missing attributes
memory_id2 = memory_system.add_note(
    content="Python is excellent for data science applications",
    keywords=["Python", "programming"]  # Provide keywords; the LLM generates context and tags
)

# Manual metadata provision - no LLM analysis needed
memory_id3 = memory_system.add_note(
    content="Project meeting notes for Q1 review",
    keywords=["meeting", "project", "review"],
    context="Business project management discussion",
    tags=["business", "project", "meeting"],
    timestamp="202503021500"  # YYYYMMDDHHmm format
)

# Enhanced retrieval with metadata
# The system uses generated metadata for better semantic search
results = memory_system.search("artificial intelligence data processing", k=3)
for result in results:
    print(f"ID: {result['id']}")
    print(f"Content: {result['content']}")
    print(f"Keywords: {result['keywords']}")
    print(f"Tags: {result['tags']}")
    print(f"Relevance Score: {result.get('score', 'N/A')}")
    print("---")

# Alternative search method
results = memory_system.search_agentic("neural networks", k=5)
for result in results:
    print(f"ID: {result['id']}")
    print(f"Content: {result['content'][:100]}...")
    print(f"Tags: {result['tags']}")
    print("---")

# Update memories
memory_system.update(memory_id1, content="Updated: Deep learning neural networks for pattern recognition")

# Delete memories
memory_system.delete(memory_id3)

# Memory evolution
# The system automatically evolves memories by:
# 1. Using the LLM to analyze content and generate semantic metadata
# 2. Finding relationships using enhanced ChromaDB embeddings (content + metadata)
# 3. Updating tags, context, and connections based on related memories
# 4. Creating semantic links between memories
# This happens automatically when adding or updating memories!
```
- Intelligent LLM Analysis
  - Automatic keyword extraction from content
  - Context generation based on semantic understanding
  - Smart tag assignment for categorization
  - Seamless integration with OpenAI and Ollama backends
- Enhanced ChromaDB Vector Storage
  - Embedding generation using content + metadata for superior semantic search
  - Fast similarity search leveraging both content and generated attributes
  - Automatic metadata serialization and handling
  - Persistent memory storage with rich semantic information
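The "content + metadata" embedding idea can be illustrated with a toy document builder: before embedding, the note's content is combined with its generated attributes so the vector reflects both. The concatenation format below is an assumption for illustration, not the library's actual serialization.

```python
def build_embedding_document(content, keywords, context, tags):
    # Combine the note's content with its metadata into one string;
    # embedding this string captures both the text and its semantics.
    return "\n".join([
        content,
        "keywords: " + ", ".join(keywords),
        "context: " + context,
        "tags: " + ", ".join(tags),
    ])

doc = build_embedding_document(
    "Python is excellent for data science applications",
    keywords=["Python", "programming"],
    context="Programming languages for data work",
    tags=["programming", "data science"],
)
print(doc)
```

A query mentioning "programming" can now match this note even though the word never appears in the raw content — the metadata carries it into the embedding.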
- Memory Evolution
  - Automatically analyzes content relationships using LLM-generated metadata
  - Updates tags and context based on related memories
  - Creates semantic connections between memories
  - Dynamic memory organization with improved accuracy
- Flexible Metadata Management
  - Auto-generation when not provided (keywords, context, tags)
  - Manual override support for custom metadata
  - Partial metadata completion (LLM fills missing attributes)
  - Timestamp tracking and retrieval count monitoring
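Timestamps use the compact YYYYMMDDHHmm format shown in the usage example; converting to and from Python's `datetime` takes one format string (the helper names here are illustrative):

```python
from datetime import datetime

TS_FORMAT = "%Y%m%d%H%M"  # YYYYMMDDHHmm, as used by add_note's timestamp argument

def to_timestamp(dt):
    return dt.strftime(TS_FORMAT)

def from_timestamp(ts):
    return datetime.strptime(ts, TS_FORMAT)

dt = from_timestamp("202503021500")
print(dt)                # 2025-03-02 15:00:00
print(to_timestamp(dt))  # 202503021500
```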
- Multiple LLM Backends
  - OpenAI (GPT-4, GPT-4o-mini, GPT-3.5)
  - Ollama (for local deployment)
  - Configurable model selection for analysis and evolution
- Memory Creation:
  - Provide clear, descriptive content for better LLM analysis
  - Let the system auto-generate metadata for optimal semantic richness
  - Use partial metadata provision when you want to control specific attributes
  - Provide manual metadata only when you need precise control
- Memory Retrieval:
  - Leverage semantic search with natural language queries
  - Use specific domain terminology that matches generated keywords
  - Adjust the `k` parameter based on the number of results needed (typically 3-10)
  - Take advantage of enhanced retrieval using both content and metadata
- Memory Evolution:
  - Allow automatic evolution to maximize memory organization
  - Review LLM-generated metadata periodically for accuracy
  - Use consistent domain-specific terminology for better clustering
  - Monitor memory connections to understand knowledge relationships
- LLM Integration:
  - Ensure API keys are properly configured for your chosen backend
  - Use gpt-4o-mini for cost-effective analysis or gpt-4 for higher quality
  - Consider Ollama for local deployment and privacy requirements
  - Monitor LLM usage for cost management
- Error Handling:
  - Always check return values and handle None responses
  - Handle potential KeyError for non-existent memories
  - Use try-except blocks for LLM operations (network/API failures)
  - Implement fallback behavior when LLM analysis fails
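The error-handling advice above might look like the following in practice. `MemorySystem` here is a minimal stand-in for `AgenticMemorySystem` (so the sketch runs without a backend), and the `safe_read` helper and its fallback behavior are illustrative, not part of the library.

```python
class MemorySystem:
    # Minimal stand-in: read() returns None for unknown IDs rather than
    # raising KeyError, mirroring the "handle None responses" advice.
    def __init__(self):
        self._notes = {}

    def add_note(self, content):
        note_id = str(len(self._notes))
        self._notes[note_id] = content
        return note_id

    def read(self, note_id):
        return self._notes.get(note_id)

def safe_read(system, note_id, fallback="<missing>"):
    try:
        note = system.read(note_id)
    except Exception:  # a real backend can fail on network/API errors
        return fallback
    return note if note is not None else fallback

system = MemorySystem()
nid = system.add_note("Quarterly planning notes")
print(safe_read(system, nid))           # the stored content
print(safe_read(system, "no-such-id"))  # the fallback value
```

The same try-except-plus-fallback pattern applies to `add_note` and `search` calls, where the underlying LLM request is the most likely failure point.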
If you use this code in your research, please cite our work:
```bibtex
@article{xu2025mem,
  title={A-MEM: Agentic Memory for LLM Agents},
  author={Xu, Wujiang and Liang, Zujie and Mei, Kai and Gao, Hang and Tan, Juntao and Zhang, Yongfeng},
  journal={arXiv preprint arXiv:2502.12110},
  year={2025}
}
```
This project is licensed under the MIT License. See LICENSE for details.