A powerful, modular Python tool to automatically back up your Notion workspace with pluggable storage backends and notification systems.
- Export entire Notion workspace - Complete backup of all your data
- Pluggable storage backends - Local, rclone (cloud storage), with more coming
- Rich notifications - Apprise integration for 70+ notification services
- Highly configurable - Customize every aspect via environment variables
- Detailed logging - Comprehensive logging for debugging and monitoring
- Secure - Token-based authentication with Notion
- Async/await support - Fast, concurrent operations
- Automatic cleanup - Keep only the most recent backups
- Multiple export formats - Markdown or HTML output
- Retry logic - Robust error handling with exponential backoff
- Smart notification management - Automatically mark export notifications as read
- Export recovery - Optional Redis integration to recover from rare Notion notification failures
The tool is built with a modular, pluggable architecture:
src/notion_backup/
├── core/              # Core backup logic
│   ├── client.py      # Notion API client
│   └── backup.py      # Main backup orchestrator
├── storage/           # Storage backends
│   ├── local.py       # Local file storage
│   └── rclone.py      # Rclone cloud storage
├── notifiers/         # Notification backends
│   └── apprise.py     # Apprise notifications
├── config/            # Configuration management
├── utils/             # Utility functions
└── redis_client.py    # Optional Redis client for export recovery
# Clone the repository
git clone https://github.com/nikhilbadyal/notion.git
cd notion-backup
# Install dependencies
pip install -r requirements.txt
Copy the example environment file:
cp .env.example .env
- Log in to Notion
- Open the browser developer console → Network tab
- Use "Quick Find" and search for something
- Find the `search` request and copy:
  - `spaceId` → `NOTION_SPACE_ID`
  - `token_v2` cookie → `NOTION_TOKEN_V2`
  - `file_token` cookie → `NOTION_FILE_TOKEN`
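If you prefer to fetch the space ID programmatically, here is a rough sketch using Notion's unofficial `getSpaces` endpoint (an assumption about the private v3 API this tool relies on; the response format is undocumented and may change):

```python
# Hypothetical helper: list workspace (space) IDs visible to a token_v2 cookie.
# Uses Notion's unofficial /api/v3/getSpaces endpoint; field names may change
# without notice since this API is not publicly documented.
import requests

def list_space_ids(token_v2: str) -> list[str]:
    resp = requests.post(
        "https://www.notion.so/api/v3/getSpaces",
        cookies={"token_v2": token_v2},
        json={},
        timeout=30,
    )
    resp.raise_for_status()
    space_ids: list[str] = []
    for user_data in resp.json().values():
        space_ids.extend(user_data.get("space", {}).keys())
    return space_ids

if __name__ == "__main__":
    print(list_space_ids("your_token_v2_here"))
```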
Edit your `.env` file with your credentials:
# Required
NOTION_SPACE_ID=your_actual_space_id_here
NOTION_TOKEN_V2=your_actual_token_v2_here
NOTION_FILE_TOKEN=your_file_token
# Optional - defaults shown
STORAGE_BACKEND=local
EXPORT_TYPE=markdown
LOCAL_PATH=./downloads
# Simple backup
python main.py backup
# With debug logging
python main.py --debug backup
# List available backups
python main.py list
# Cleanup old backups (keep 5 most recent)
python main.py cleanup --keep 5
This project is fully containerized, allowing you to run the backup tool in a consistent and isolated environment.
- Configure Environment: Create a `.env` file by copying `.env.example` and filling in your Notion credentials and other settings.
- Build and Run: Use `docker-compose` to build the image and run the backup service.

# Build the Docker image
docker-compose build

# Run a one-off backup
docker-compose run --rm notion-backup
To run backups on a schedule, you can use a standard cron job on your host machine to execute the `docker-compose run` command.
Example Cron Job (daily at 2 AM):
0 2 * * * cd /path/to/notion-backup && /usr/local/bin/docker-compose run --rm notion-backup
Store backups on local filesystem:
STORAGE_BACKEND=local
LOCAL_PATH=./downloads
MAX_BACKUPS=10
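Conceptually, `MAX_BACKUPS` cleanup just keeps the newest archives and deletes the rest. A minimal sketch of the idea (assuming `.zip` archives sorted by modification time, which may differ from the actual backend):

```python
# Rough sketch of MAX_BACKUPS-style cleanup: keep the newest N archives in a
# directory and delete the rest. Illustrative only; the real storage backend
# may use different file patterns or metadata.
from pathlib import Path

def cleanup_old_backups(directory: str, keep: int) -> list[Path]:
    backups = sorted(
        Path(directory).glob("*.zip"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,  # newest first
    )
    removed = []
    for old in backups[keep:]:
        old.unlink()
        removed.append(old)
    return removed
```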
Store backups on any cloud provider supported by rclone:
STORAGE_BACKEND=rclone
RCLONE_REMOTE=mycloud
RCLONE_PATH=notion-backups
RCLONE_CONFIG_PATH=/path/to/rclone.conf
KEEP_LOCAL_BACKUP=true
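An rclone backend typically shells out to the `rclone` binary. A hedged sketch of what the upload step might look like (the exact flags and paths the tool uses are assumptions):

```python
# Illustrative upload via the rclone CLI. Assumes `rclone` is on PATH and the
# remote is already configured; this is a sketch, not the tool's exact call.
import subprocess

def upload_with_rclone(local_file: str, remote: str, remote_path: str,
                       config_path: str | None = None) -> None:
    cmd = ["rclone", "copy", local_file, f"{remote}:{remote_path}"]
    if config_path:
        cmd += ["--config", config_path]
    subprocess.run(cmd, check=True)

# Example (hypothetical paths):
# upload_with_rclone("./downloads/export.zip", "mycloud", "notion-backups")
```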
Get notified about backup status via 70+ services using Apprise:
ENABLE_NOTIFICATIONS=true
NOTIFICATION_LEVEL=all
APPRISE_URLS=discord://webhook_id/webhook_token,mailto://user:pass@smtp.gmail.com?to=you@gmail.com
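Apprise fans a single message out to every configured URL; a minimal standalone example of the same idea:

```python
# Minimal Apprise usage: one notify() call fans out to all added URLs.
import apprise

notifier = apprise.Apprise()
notifier.add("discord://webhook_id/webhook_token")
notifier.add("mailto://user:pass@smtp.gmail.com?to=you@gmail.com")

notifier.notify(
    title="Notion Backup",
    body="Backup completed successfully.",
)
```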
Popular notification services:
- Discord: `discord://webhook_id/webhook_token`
- Slack: `slack://TokenA/TokenB/TokenC/Channel`
- Email: `mailto://user:pass@domain.com?to=recipient@domain.com`
- Telegram: `tgram://bottoken/ChatID`
- Microsoft Teams: `msteams://TokenA/TokenB/TokenC/`
- PushBullet: `pbul://accesstoken`
See the full list of supported services in the Apprise documentation.
| Variable | Description |
|---|---|
| `NOTION_SPACE_ID` | Your Notion workspace ID |
| `NOTION_TOKEN_V2` | Notion authentication token |
| `NOTION_FILE_TOKEN` | Notion file download token |
| Variable | Default | Options | Description |
|---|---|---|---|
| `EXPORT_TYPE` | `markdown` | `markdown`, `html` | Export format |
| `FLATTEN_EXPORT_FILETREE` | `false` | `true`, `false` | Flatten nested pages |
| `EXPORT_COMMENTS` | `true` | `true`, `false` | Include comments |
| `TIME_ZONE` | `UTC` | Any valid TZ name | Timezone for export |
| Variable | Default | Description |
|---|---|---|
| `STORAGE_BACKEND` | `local` | Storage backend (`local`, `rclone`) |
| `LOCAL_PATH` | `./downloads` | Local storage directory |
| `MAX_BACKUPS` | `None` | Max local backups to keep |
| Variable | Default | Description |
|---|---|---|
| `RCLONE_REMOTE` | `None` | Rclone remote name |
| `RCLONE_PATH` | `notion-backups` | Path on remote |
| `RCLONE_CONFIG_PATH` | `None` | Rclone config file path |
| `KEEP_LOCAL_BACKUP` | `true` | Keep local copy after upload |
| Variable | Default | Description |
|---|---|---|
| `ENABLE_NOTIFICATIONS` | `false` | Enable notifications |
| `NOTIFICATION_LEVEL` | `all` | Notification level (`success`, `error`, `all`, `none`) |
| `APPRISE_URLS` | `""` | Comma-separated notification URLs |
| `NOTIFICATION_TITLE` | `Notion Backup` | Notification title prefix |
| Variable | Default | Description |
|---|---|---|
| `MAX_RETRIES` | `3` | Max retry attempts |
| `RETRY_DELAY` | `5` | Delay between retries (seconds) |
| `DOWNLOAD_TIMEOUT` | `300` | Download timeout (seconds) |
| `MARK_NOTIFICATIONS_AS_READ` | `true` | Mark export notifications as read after download |
| `ARCHIVE_NOTIFICATION` | `false` | Archive export notification after upload |
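The retry settings above drive exponential backoff. Conceptually, the download loop behaves like this sketch (the tool's actual async implementation may differ):

```python
# Conceptual retry loop: MAX_RETRIES attempts with exponential backoff starting
# at RETRY_DELAY seconds. Sketch only; the tool's async implementation may differ.
import time

def with_retries(func, max_retries: int = 3, retry_delay: float = 5.0):
    for attempt in range(1, max_retries + 1):
        try:
            return func()
        except Exception as exc:
            if attempt == max_retries:
                raise
            wait = retry_delay * (2 ** (attempt - 1))  # 5s, 10s, 20s, ...
            print(f"Attempt {attempt} failed ({exc}); retrying in {wait:.0f}s")
            time.sleep(wait)
```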
This feature uses Redis to recover from rare cases where a Notion export succeeds but the completion notification isn't received. When enabled, failed notifications are queued in Redis and processed on subsequent backup runs.
| Variable | Default | Description |
|---|---|---|
| `REDIS_HOST` | `None` | Redis server hostname (enables feature) |
| `REDIS_PORT` | `6379` | Redis port |
| `REDIS_DB` | `0` | Redis database number (0-15) |
| `REDIS_USERNAME` | `None` | Redis username (for ACL, Redis 6+) |
| `REDIS_PASSWORD` | `None` | Redis password (optional) |
| `REDIS_SSL` | `false` | Enable SSL/TLS for Redis connection |
| `REDIS_SSL_CA_CERTS` | `None` | Path to Redis CA certificate file |
| `REDIS_SSL_CERT_REQS` | `"required"` | SSL certificate requirements (`required`, `optional`, `none`) |
Note: Redis is completely optional. If `REDIS_HOST` is not set, the feature is disabled and backups proceed normally. Install Redis via your package manager or Docker for production use. For TLS-enabled Redis servers (common in cloud providers), set `REDIS_SSL=true` and provide certificate details if required.
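Conceptually, the recovery feature persists pending export tasks in Redis so a later run can retry them. A hedged sketch with `redis-py` (the key name and payload shape are illustrative, not the tool's actual schema):

```python
# Illustrative recovery queue: push an export task ID when its completion
# notification never arrives, then drain the queue on the next run.
# The key name and payload shape are assumptions, not the tool's real schema.
import redis

r = redis.Redis(host="localhost", port=6379, db=0, ssl=False)

def remember_pending_export(task_id: str) -> None:
    r.rpush("notion_backup:pending_exports", task_id)

def drain_pending_exports() -> list[str]:
    pending = []
    while (task_id := r.lpop("notion_backup:pending_exports")) is not None:
        pending.append(task_id.decode())
    return pending
```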
# Run backup (default command)
python main.py
python main.py backup
# List available backups
python main.py list
# Clean up old backups
python main.py cleanup --keep 5
# Test configuration
python main.py test
# Enable debug logging
python main.py --debug backup
- Create a new class inheriting from `AbstractStorage`
- Implement the required methods: `store`, `list_backups`, `cleanup_old_backups`, `test_connection`
- Register it in `BackupManager._create_storage_backend()` (see the sketch below)
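A rough skeleton of such a backend; the import path and exact method signatures of `AbstractStorage` are assumptions and may differ from the real base class:

```python
# Hypothetical storage backend sketch. The method names come from the list
# above; the import path and signatures are assumptions about this project.
from pathlib import Path
from src.notion_backup.storage.base import AbstractStorage  # assumed module path

class SFTPStorage(AbstractStorage):
    def store(self, file_path: Path) -> str:
        ...  # upload file_path to the SFTP server, return the remote location

    def list_backups(self) -> list[str]:
        ...  # return remote backup names, newest first

    def cleanup_old_backups(self, keep: int) -> int:
        ...  # delete all but the `keep` newest backups, return count removed

    def test_connection(self) -> bool:
        ...  # verify the server is reachable and credentials work
```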
- Create a new class inheriting from `AbstractNotifier`
- Implement the required methods: `send_notification`, `test_connection`
- Register it in `BackupManager._create_notifier()` (see the sketch below)
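And similarly for a notifier (again, the import path and signatures are assumptions):

```python
# Hypothetical notifier sketch; AbstractNotifier's exact signatures are assumed.
from src.notion_backup.notifiers.base import AbstractNotifier  # assumed module path

class WebhookNotifier(AbstractNotifier):
    def send_notification(self, title: str, message: str, success: bool) -> bool:
        ...  # POST the message to an internal webhook, return True on success

    def test_connection(self) -> bool:
        ...  # confirm the webhook endpoint is reachable
```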
# .env
NOTION_SPACE_ID=your_space_id
NOTION_TOKEN_V2=your_token
NOTION_FILE_TOKEN=your_file_token
STORAGE_BACKEND=local
LOCAL_PATH=./backups
MAX_BACKUPS=7
ENABLE_NOTIFICATIONS=true
APPRISE_URLS=mailto://user:pass@gmail.com?to=you@gmail.com
MARK_NOTIFICATIONS_AS_READ=true
# .env
NOTION_SPACE_ID=your_space_id
NOTION_TOKEN_V2=your_token
NOTION_FILE_TOKEN=your_file_token
STORAGE_BACKEND=rclone
RCLONE_REMOTE=gdrive
RCLONE_PATH=backups/notion
KEEP_LOCAL_BACKUP=false
ENABLE_NOTIFICATIONS=true
APPRISE_URLS=discord://webhook_id/webhook_token
NOTIFICATION_LEVEL=all
# .env
NOTION_SPACE_ID=your_space_id
NOTION_TOKEN_V2=your_token
NOTION_FILE_TOKEN=your_file_token
STORAGE_BACKEND=rclone
RCLONE_REMOTE=s3backup
RCLONE_PATH=company-notion-backups
RCLONE_ADDITIONAL_ARGS=--transfers=8,--checkers=16
ENABLE_NOTIFICATIONS=true
APPRISE_URLS=slack://TokenA/TokenB/TokenC/general,mailto://backup@company.com?to=it@company.com
NOTIFICATION_LEVEL=all
MAX_RETRIES=5
RETRY_DELAY=10
# Daily backup at 2 AM
0 2 * * * cd /path/to/notion-backup && python main.py backup
# Weekly cleanup (keep 30 backups)
0 3 * * 0 cd /path/to/notion-backup && python main.py cleanup --keep 30
Create a task that runs:
Program: python
Arguments: /path/to/notion-backup/main.py backup
Start in: /path/to/notion-backup
- Ensure all required environment variables are set
- Check `.env` file syntax
- For local: Check directory permissions
- For rclone: Test the rclone config with `rclone lsd remote:`
- Token may have expired - get new tokens from browser
- Check network connectivity
- Verify space ID is correct
- Test notification URLs independently
- Check URL format in Apprise documentation
Enable debug logging for detailed troubleshooting:
python main.py --debug backup
# Test all connections
python main.py test
# Test specific components
python -c "from src.notion_backup.config import Settings; s=Settings(); print('Config loaded successfully')"
- Store the `.env` file securely and never commit it to version control
- `NOTION_TOKEN_V2` provides full workspace access
- Tokens expire after ~1 year or on logout
- Use environment variables in production instead of `.env` files
- Regularly rotate tokens and review access
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request