SuperSync
A Python utility for interacting with the Supernote Cloud API, including authentication, file synchronization, upload, download, listing, deletion, and two-way sync capabilities.
Features
- Authentication: Login to Supernote Cloud with email/password and optional email verification
- Token Management: JWT token validation and persistent storage
- API Access: Query user info, storage capacity, and file listings
- File Download: Download files from Supernote Cloud to local directory
- File Upload: Upload local files to Supernote Cloud
- Folder Management: Create folders on Supernote Cloud
- File Listing: Recursively list files and directories with tree view and optional metadata
- File Deletion: Delete files or directories from Supernote Cloud with safety confirmations
- One-Way Sync: Sync files from Supernote Cloud to a local directory (`sync_down`)
- Two-Way Sync: Bidirectional sync with conflict handling and state tracking (`sync`)
- Smart Transfer: Only uploads/downloads files when needed (checks MD5 and modification time)
- Conflict Resolution: Handles conflicts by keeping both versions when both local and remote are modified
- State Tracking: SQLite-based sync state distinguishes new files from deleted files
- Deletion Handling: Safely handles remote deletions by moving local files to trash
- Concurrent Sync Protection: File locking prevents multiple sync operations from running simultaneously
- Sync History: Comprehensive logging of all sync operations for debugging and audit trails
- Sync Status: View sync state, tracked files, and statistics for any sync directory
- Trash Management: List, empty, and restore files from the local trash directory
Installation
Prerequisites
- Python 3.10 or higher
- `uv` (recommended) or `pip` for package management
Setup
1. Clone or download this repository
2. Create a virtual environment (recommended):
uv venv  # or: python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
3. Install dependencies:
uv pip install -e .  # or: pip install -e .
4. Copy `.env.example` to `.env` and configure:
cp .env.example .env
# Edit .env with your values
Configuration
Create a .env file in the project root with the following variables:
Required Variables
- `SUPERNOTE_API_BASE`: API base URL (default: `https://cloud.supernote.com/api`)
- `SUPERNOTE_TOKEN`: Access token obtained from login (see below)
- `SUPERNOTE_SYNC_DIR`: Local directory path for syncing files
Optional Variables
- `SUPERNOTE_USERNAME`: Your Supernote Cloud email (avoids prompting)
- `SUPERNOTE_PASSWORD`: Your Supernote Cloud password (avoids prompting)
- `SUPERNOTE_COUNTRY_CODE`: Country code for login (default: `1`)
- `SUPERNOTE_DEBUG`: Set to `1` to enable verbose debug output
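For reference, a filled-in `.env` might look like the following (placeholder values only; substitute your own):

```shell
# Required
SUPERNOTE_API_BASE=https://cloud.supernote.com/api
SUPERNOTE_TOKEN=paste-token-from-login-here
SUPERNOTE_SYNC_DIR=/home/you/Supernote

# Optional
SUPERNOTE_USERNAME=you@example.com
# SUPERNOTE_PASSWORD=...   # storing the password is not recommended
SUPERNOTE_COUNTRY_CODE=1
SUPERNOTE_DEBUG=0
```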
Usage
1. Login and Obtain Token
First, authenticate and obtain an access token:
python login.py
This will:
- Check if you already have a valid token
- Prompt for credentials if needed (or use values from `.env`)
- Handle email verification if required
- Save the token to `.env` for future use
Note: The token is stored in .env and will be reused automatically if still valid.
2. Test API Connection
Test that your token works and view your account info:
python supersync.py
This will:
- Validate your token
- Display your user information
- Show storage capacity
- List files and folders in your cloud storage
3. Command-Line Interface
The cli.py script provides a convenient command-line interface for common operations. After installing the package, you can use it directly:
python cli.py <command> [options]
Or, if installed as a package, use the supersync command:
supersync <command> [options]
Download Command
Download files from Supernote Cloud to a local directory (one-way sync: cloud → local):
# Download entire cloud to default directory (SUPERNOTE_SYNC_DIR)
python cli.py download
# Download entire cloud to custom directory
python cli.py download --directory ./my-notes
# Download specific remote path to custom directory
python cli.py download --path "Document/Notes" --directory ./notes
# Short form
python cli.py download -p "Document/Work" -d ./work
Options:
- `-d, --directory`: Local directory to download files to (default: `SUPERNOTE_SYNC_DIR` env var)
- `-p, --path`: Remote path to download from (e.g., `"Document/Notes"`). If not specified, downloads the entire cloud.
Note: The remote path must exist. If it doesn't exist, the command will fail with an error.
Upload Command
Upload files from a local directory to Supernote Cloud (one-way: local → cloud):
# Upload local directory to remote path (creates directory if needed)
python cli.py upload --path "Document/Work" --directory ./work
# Upload to cloud root
python cli.py upload --directory ./my-files
# Upload with backup enabled
python cli.py upload --path "Document/Notes" --directory ./notes --backup
Options:
- `-d, --directory`: Local directory to upload files from (default: `SUPERNOTE_SYNC_DIR` env var)
- `-p, --path`: Remote path to upload to (e.g., `"Document/Notes"`). If not specified, uploads to the cloud root. Remote directories will be created if they don't exist.
- `--backup`: Backup remote files before overwriting with local versions
When to use upload vs sync:
- Use `upload` when you have many local files that don't exist remotely and want to upload them safely, with no risk of files being deleted or moved to trash.
- Use `sync` when you want bidirectional synchronization and are okay with handling deletions (files deleted on remote will be moved to local trash).
Important: The upload command never deletes or moves files to trash, making it safer for initial uploads of new content.
Sync Command
Two-way sync between Supernote Cloud and local directory (bidirectional sync):
# Sync entire cloud with default directory
python cli.py sync
# Sync entire cloud with custom directory
python cli.py sync --directory ./my-notes
# Sync specific remote path with local directory
python cli.py sync --path "Document/Work" --directory ./work
# Sync with backup enabled (backs up remote files before overwriting)
python cli.py sync --path "Document/Work" --directory ./work --backup
# Sync without handling deletions (don't move deleted files to trash)
python cli.py sync --path "Document/Work" --directory ./work --no-handle-deletions
Options:
- `-d, --directory`: Local directory to sync with (default: `SUPERNOTE_SYNC_DIR` env var)
- `-p, --path`: Remote path to sync (e.g., `"Document/Notes"`). If not specified, syncs the entire cloud. Remote directories will be created if they don't exist.
- `--backup`: Backup remote files before overwriting with local versions
- `--no-handle-deletions`: Don't move local files to trash when they're deleted on remote
Sync Behavior:
- Upload functionality: The `sync` command can also act as an upload. If you specify a remote path that doesn't exist (e.g., `"Document/Work"`), it will:
  - Create the remote directory path automatically
  - Upload all files from your local directory to that remote path
  - Continue syncing bidirectionally on future runs
State-Aware Sync: The sync function uses a SQLite database (.supersync_state.db) to track synchronization history. This allows it to intelligently distinguish between:
- New local files (never synced before) → uploaded to remote
- Remotely deleted files (previously synced but deleted on remote) → local copy deleted (not re-uploaded)
This solves the critical problem where sync couldn't tell the difference between a new file and a file that was deleted remotely.
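The new-vs-deleted decision described above boils down to one state lookup. The sketch below is illustrative only (function and variable names are invented for this example; the real logic lives in `sync_state.py` and `supersync.py`):

```python
def classify_local_only_file(path: str, sync_state: dict) -> str:
    """Decide what to do with a file that exists locally but not remotely.

    sync_state maps sync-relative paths to their last-sync records; a path
    absent from the state has never been synced before.
    """
    if path not in sync_state:
        # Never synced: a new local file that should go up to the cloud.
        return "upload"
    # Previously synced but now gone from remote: it was deleted there,
    # so the local copy is removed (or moved to trash), not re-uploaded.
    return "delete_local"


print(classify_local_only_file("Note/new.note", {}))
print(classify_local_only_file("Note/old.note", {"Note/old.note": {"md5": "abc"}}))
```

Without the state lookup, both cases look identical (a file that exists only locally), which is exactly the ambiguity the database resolves.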
Example: If you have a local ./work directory and run:
python cli.py sync --path "Document/Work" --directory ./work
when Document/Work doesn't exist on the remote, it will:
- Create the `Document/Work` directory structure on Supernote Cloud
- Upload all files from `./work` to `Document/Work`
- Set up bidirectional sync for future changes
Sync Strategy:
The sync function processes files based on their state and current existence:
- Files that exist both locally and remotely:
  - If remote is newer → download from remote
  - If local is newer → upload to remote
  - If both modified → keep both versions (conflict resolution)
  - If MD5 matches → skip (already in sync)
  - Sync state is updated after each operation
- Files that exist only locally:
  - New files (not in sync state) → upload to remote, add to state
  - Deleted remotely (in sync state but not on remote) → delete local copy, mark as deleted in state (moved to trash if `handle_deletions=True`)
- Files that exist only remotely:
  - New remote files (not in sync state) → download to local, add to state
  - Deleted locally (in sync state but not locally) → delete from remote, mark as deleted in state
- Conflicts (both modified):
  - Keep both versions: the newer keeps the original name, the older gets a `_conflict_<timestamp>` suffix
- Cleanup:
  - Orphaned state entries (files in state that no longer exist locally or remotely) are removed
  - Old tombstones (deleted files older than 30 days) are cleaned up
  - Note: Sync history is kept forever by default. Use `supersync history --cleanup` to manually clean old history.
Sync State Tracking:
The sync function uses a SQLite database (.supersync_state.db) in the sync root directory to track file synchronization state. This is the core feature that enables intelligent sync behavior.
What it tracks:
- File paths (relative to sync root)
- Remote file IDs
- Remote modification times
- Last sync action (upload/download/sync)
- MD5 hashes (for verification)
- Deletion timestamps (tombstones)
- Local deletion flags (for files deleted locally but still on remote)
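The tracked fields map naturally onto a SQLite table. The schema below is an assumption made for illustration (the actual layout of `.supersync_state.db` may differ):

```python
import sqlite3

# Illustrative schema mirroring the fields listed above; not the
# actual table definition used by sync_state.py.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sync_state (
        path            TEXT PRIMARY KEY,  -- relative to sync root
        remote_id       INTEGER,
        remote_mtime    INTEGER,           -- remote modification time
        last_action     TEXT,              -- upload / download / sync
        md5             TEXT,              -- for verification
        deleted_at      INTEGER,           -- tombstone timestamp, NULL if live
        deleted_locally INTEGER DEFAULT 0  -- deleted locally, still on remote
    )
""")
conn.execute(
    "INSERT INTO sync_state (path, remote_id, last_action, md5) VALUES (?, ?, ?, ?)",
    ("Note/meeting.note", 987654321, "download", "d41d8cd9"),
)
row = conn.execute(
    "SELECT last_action FROM sync_state WHERE path = ?", ("Note/meeting.note",)
).fetchone()
print(row[0])
```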
How it works:
- First sync: No state exists → all files are treated as new, state is built
- Subsequent syncs: State is used to determine file history
- State updates: After each file operation (upload/download), state is updated
- Cleanup: Orphaned entries and old tombstones are automatically cleaned up
Key benefits:
- Distinguishes new vs deleted: Knows if a local-only file is new (upload) or was deleted remotely (delete local)
- Prevents re-uploading deleted files: Files deleted on remote won't be re-uploaded
- Tracks sync history: Maintains a record of what was synced when
- Comprehensive logging: All sync operations are logged to the `sync_history` table for debugging
- Concurrent protection: File locking prevents multiple syncs from running simultaneously
- Automatic cleanup: Old state entries and tombstones are cleaned up automatically (history is kept forever by default)
State file location: .supersync_state.db in your sync root directory
Resetting Sync State:
If you need to reset sync state (e.g., after major changes or corruption), simply delete the .supersync_state.db file in your sync directory. The next sync will rebuild the state from scratch, treating all files as new.
List Command
Recursively list files and directories from Supernote Cloud in a tree format:
# List root directory
python cli.py list
# List specific remote path
python cli.py list "Note/Thesis"
# List with depth limit (only show first 2 levels)
python cli.py list "Document" --depth 2
# List with metadata (size, modification time, MD5 hash)
python cli.py list "Document" --metadata
# Combine options
python cli.py list "Note/Thesis" --depth 3 --metadata
Options:
- `path` (positional, optional): Remote path to list (e.g., `"Note/Thesis"`). If not specified, lists the root directory.
- `-d, --depth`: Maximum depth to recurse (default: unlimited)
- `-m, --metadata`: Show metadata (size, modification time, MD5 hash) for files
Features:
- Displays files and folders in a tree structure with `├──` and `└──` characters
- Folders are shown first, then files, both sorted alphabetically
- Metadata display includes file size (human-readable), modification time, and MD5 hash (first 8 characters)
- Automatically handles empty directories and errors gracefully
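The "human-readable" sizes mentioned above take only a few lines to produce. This helper is a hypothetical stand-in for whatever formatting `cli.py` actually uses:

```python
def human_size(num_bytes: int) -> str:
    """Format a byte count for display (illustrative; the exact
    formatting used by the list command may differ)."""
    size = float(num_bytes)
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if size < 1024 or unit == "TB":
            # Whole bytes need no decimal point; larger units get one.
            return f"{int(size)} B" if unit == "B" else f"{size:.1f} {unit}"
        size /= 1024


print(human_size(512))
print(human_size(2048))
print(human_size(5_242_880))
```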
Delete Command
Delete a file or directory from Supernote Cloud:
# Delete a file (with confirmation prompt)
python cli.py delete "Note/Thesis/file.note"
# Delete a directory (with confirmation prompt and contents listing)
python cli.py delete "Document/Notes"
# Delete without confirmation prompt
python cli.py delete "Document/Notes" --force
Options:
- `path` (positional, required): Remote path to delete (e.g., `"Note/Thesis/file.note"` or `"Document/Notes"`)
- `-f, --force`: Skip confirmation prompt
Features:
- Safety first: Asks for confirmation by default (defaulting to "N") unless `--force` is used
- Directory preview: For directories, lists all contents recursively before deletion so you can see what will be deleted
- Path validation: Verifies the path exists before attempting deletion
- Error handling: Provides clear error messages if the path is not found or deletion fails
Warning: Deletions are permanent on Supernote Cloud. Use with caution, especially with the --force flag.
History Command
View sync operation history to debug issues and understand what happened during past sync operations:
# View recent sync history (last 100 entries)
python cli.py history
# View history for a specific file
python cli.py history --file "Document/Notes/file.note"
# View only errors from last 7 days
python cli.py history --status error --since 7d
# View only uploads
python cli.py history --action upload
# View sync session summaries
python cli.py history --sessions
# View detailed history with metadata
python cli.py history --verbose
# Clean up history older than 90 days
python cli.py history --cleanup 90d
Options:
- `-d, --directory`: Local sync directory (default: `SUPERNOTE_SYNC_DIR` env var)
- `-f, --file`: Filter by file path (exact match)
- `-a, --action`: Filter by action type (`upload`, `download`, `skip`, `conflict`, `delete`, `delete_remote`)
- `-s, --status`: Filter by status (`success`, `error`, `skipped`)
- `--session`: Filter by sync session ID
- `--since`: Only show entries since this time (e.g., `7d` for 7 days, `2h` for 2 hours, or a Unix timestamp)
- `-l, --limit`: Maximum number of entries to show (default: 100)
- `--sessions`: Show sync session summaries instead of detailed history
- `-v, --verbose`: Show additional details (remote ID, MD5, timestamps)
- `--cleanup`: Clean up history older than the specified age (e.g., `90d` for 90 days). By default, history is kept forever.
Features:
- Comprehensive logging: All sync operations (uploads, downloads, skips, conflicts, deletions, errors) are logged
- Session tracking: Each sync operation gets a unique session ID for grouping related events
- Filtering: Filter by file, action, status, session, or time range
- Permanent retention: History is kept forever by default (no automatic cleanup)
- Manual cleanup: Use `--cleanup` to remove old history entries when needed
- Color-coded output: Success (green) and error (red) status indicators
- Debugging aid: Error messages are preserved for troubleshooting
History Data:
The history tracks:
- Timestamp of each operation
- File path (relative to sync root)
- Action taken (upload, download, skip, conflict, delete, etc.)
- Status (success, error, skipped)
- Error messages (if any)
- Remote file ID and update time
- MD5 hash (when available)
Use Cases:
- Debugging sync issues: See exactly what happened during a sync operation
- Audit trail: Track all file changes over time
- Error analysis: Identify patterns in sync failures
- Performance analysis: Understand sync behavior and timing
Status Command
View sync status and statistics for a local sync folder:
# View sync status summary
python cli.py status
# View status with untracked files list
python cli.py status --show-untracked
# View status with missing files list
python cli.py status --show-missing
# View details for a specific file
python cli.py status --file "Document/Notes/file.note"
# Limit number of files shown in lists
python cli.py status --show-untracked --limit 50
Options:
- `-d, --directory`: Local sync directory (default: `SUPERNOTE_SYNC_DIR` env var)
- `-f, --file`: Show details for a specific file
- `--show-untracked`: Show list of untracked files (exist locally but not in sync state)
- `--show-missing`: Show list of missing files (tracked but not found locally)
- `-l, --limit`: Maximum number of files to show in lists (default: 20)
Status Information:
The status command displays:
- Statistics:
  - Number of tracked files
  - Number of untracked files
  - Number of missing files (tracked but not found locally)
  - Number of tombstones (deleted file markers)
  - Total history entries
  - Last sync time
- File details (when using `--file`):
  - Remote file ID
  - Last sync action
  - MD5 hash
  - Remote update time
  - Last sync time
  - Local deletion status
Use Cases:
- Quick health check: See if sync state is consistent
- Find untracked files: Identify files that haven't been synced yet
- Find missing files: Identify files that are tracked but missing locally
- Debug sync state: Understand what the sync system knows about your files
Trash Commands
Manage files in the local trash directory (.trash):
# List files in trash
python cli.py trash list
# List with limit
python cli.py trash list --limit 50
# Empty trash (with confirmation)
python cli.py trash empty
# Empty trash without confirmation
python cli.py trash empty --force
# Restore a file from trash
python cli.py trash restore file.note
# Restore to specific location
python cli.py trash restore file.note --to "Document/Notes/file.note"
# Restore and overwrite existing file
python cli.py trash restore file.note --to "Document/Notes/file.note" --force
Trash List Command:
Lists all files in the .trash directory with:
- File modification time
- File size (human-readable)
- Total count and size of all trash files
Options:
- `-d, --directory`: Local sync directory (default: `SUPERNOTE_SYNC_DIR` env var)
- `-l, --limit`: Maximum number of files to show (default: 100)
Trash Empty Command:
Permanently deletes all files in the trash directory.
Options:
- `-d, --directory`: Local sync directory (default: `SUPERNOTE_SYNC_DIR` env var)
- `-f, --force`: Skip confirmation prompt
Warning: This permanently deletes files. Use with caution.
Trash Restore Command:
Restores a file from trash to the sync directory.
Options:
- `file` (positional, required): File name in trash to restore
- `-d, --directory`: Local sync directory (default: `SUPERNOTE_SYNC_DIR` env var)
- `--to`: Destination path relative to sync root (default: restore to root with the original name)
- `-f, --force`: Overwrite existing file if it exists
Note: Files in trash are stored with flattened paths (slashes replaced with underscores). If you don't specify `--to`, the file will be restored to the sync root with its flattened name. To restore to the original location, use `--to` to specify the desired path.
Use Cases:
- Recover deleted files: Restore files that were moved to trash during sync
- Clean up disk space: Empty trash to free up storage
- Audit deletions: See what files have been deleted and when
Getting Help
View help for any command:
python cli.py --help
python cli.py download --help
python cli.py upload --help
python cli.py sync --help
python cli.py list --help
python cli.py delete --help
python cli.py history --help
python cli.py status --help
python cli.py trash --help
4. Using the Python API
Basic File Operations
from pathlib import Path
from supersync import (
    get_file_list,
    upload_file,
    get_download_url,
    _download_file,
    create_folder,
    delete_files,
    set_access_token_cookie,
)
from api import API_SESSION, existing_token_is_valid, SUPERNOTE_TOKEN

# Ensure token is valid and cookie is set
# (API_SESSION: the shared requests session, assumed to be exported by api.py)
if not existing_token_is_valid(SUPERNOTE_TOKEN):
    raise SystemExit("Token invalid - run login.py first")
set_access_token_cookie(API_SESSION, SUPERNOTE_TOKEN)
# List files in root directory
root_items = get_file_list(0)
print(root_items)
# List files in a specific directory (by ID)
folder_id = 123456789
folder_items = get_file_list(folder_id)
# Create a folder
new_folder = create_folder(parent_directory_id=0, folder_name="MyFolder")
# Upload a file
local_file = Path("/path/to/local/file.pdf")
upload_result = upload_file(local_file, directory_id=0)
print(f"Upload result: {upload_result}")
# Download a file
file_id = 987654321
download_info = get_download_url(file_id)
url = download_info.get("url")
if url:
    _download_file(url, Path("/path/to/save/file.pdf"), update_time_ms=0)
# Delete files (use with caution!)
delete_files(directory_id=0, file_ids=[file_id])
One-Way Sync (Cloud → Local)
Download files from Supernote Cloud to your local directory:
from supersync import sync_down
from pathlib import Path
# Sync entire cloud to local directory
sync_down(
    local_root=Path("/path/to/local/sync"),
    remote_directory_id=0,  # 0 = root directory
)

# Sync only specific paths (e.g., only "Note" folder)
sync_down(
    local_root=Path("/path/to/local/sync"),
    remote_directory_id=0,
    include_prefixes=["Note"],  # Only sync Note folder and subfolders
)

# Sync multiple specific paths
sync_down(
    local_root=Path("/path/to/local/sync"),
    remote_directory_id=0,
    include_prefixes=["Note/Work", "Document/Important"],
)
Features:
- Only downloads files that don't exist locally or are newer on remote
- Compares MD5 hashes and modification times
- Creates local directory structure automatically
- Never modifies or deletes remote files
Two-Way Sync (Bidirectional)
Sync files bidirectionally with conflict handling:
from supersync import sync
from pathlib import Path
# Basic two-way sync
sync(
    local_root=Path("/path/to/local/sync"),
    remote_directory_id=0,
)

# With options
sync(
    local_root=Path("/path/to/local/sync"),
    remote_directory_id=0,
    include_prefixes=["Note"],         # Only sync specific paths
    backup_remote=True,                # Backup remote files before overwriting with local
    handle_deletions=True,             # Move locally-deleted files to trash
    trash_dir=Path("/path/to/trash"),  # Custom trash directory
)
Sync Strategy (State-Aware):
The sync function uses state tracking to intelligently handle all file scenarios:
- Files that exist both locally and remotely:
  - If remote is newer → download from remote
  - If local is newer → upload to remote
  - If both modified → keep both versions (conflict resolution)
  - If MD5 matches → skip (already in sync)
  - Sync state is updated after each operation
- Files that exist only locally:
  - New files (not in sync state) → upload to remote, add to state
  - Deleted remotely (in sync state but not on remote) → delete local copy, mark as deleted (moved to trash if `handle_deletions=True`)
- Files that exist only remotely:
  - New remote files (not in sync state) → download to local, add to state
  - Deleted locally (in sync state but not locally) → delete from remote, mark as deleted in state
- Cleanup:
  - Orphaned state entries are removed
  - Old tombstones (deleted files older than 30 days) are cleaned up
  - Note: Sync history is kept forever by default. Use `supersync history --cleanup` to manually clean old history.
Conflict Resolution: When both local and remote files have been modified:
- The newer version keeps the original filename
- The older version is saved with a `_conflict_<timestamp>` suffix
- Both versions are preserved
Syncing a Specific Subdirectory
To sync a specific remote subdirectory (e.g., Document/Thesis/technical-manuals) with a local folder:
from pathlib import Path
from supersync import find_remote_directory_by_path, ensure_remote_directory_by_path, sync_down
# Option 1: Find existing directory
remote_folder_id = find_remote_directory_by_path("Document/Thesis/technical-manuals")
if remote_folder_id:
    sync_down(
        local_root=Path("./local-pdfs"),
        remote_directory_id=remote_folder_id,
    )
else:
    print("Folder not found - create it first!")

# Option 2: Ensure directory exists (creates if missing)
remote_folder_id = ensure_remote_directory_by_path("Document/Thesis/technical-manuals")
sync_down(
    local_root=Path("./local-pdfs"),
    remote_directory_id=remote_folder_id,
)
Perfect for automated workflows: Your other project can generate PDFs locally and sync them to a specific remote folder:
from pathlib import Path
from supersync import ensure_remote_directory_by_path, sync

# Your project generates PDFs in ./generated-pdfs/
# Push them to Document/Thesis/technical-manuals on Supernote Cloud.
# Note: two-way sync() is used here because sync_down() only downloads
# from the cloud and never uploads local files.
remote_folder_id = ensure_remote_directory_by_path("Document/Thesis/technical-manuals")
sync(
    local_root=Path("./generated-pdfs"),
    remote_directory_id=remote_folder_id,
)
Individual File Sync
Sync a single file intelligently:
from supersync import sync_file_up
from pathlib import Path
# Upload a file if local is newer
sync_file_up(
    local_path=Path("/path/to/local/file.pdf"),
    directory_id=0,
    remote_filename="file.pdf",  # Optional: different remote name
    backup_remote=True,          # Backup remote before overwriting
)
Project Structure
supersync/
├── api.py # Common API functions (CSRF handling, POST requests, cookie management)
├── login.py # Authentication and token management
├── supersync.py # Main sync script with upload/download/sync functionality
├── sync_state.py # Sync state tracking using SQLite (state management, file locking)
├── cli.py # Command-line interface (download, upload, sync commands)
├── tests/ # Test suite
│ ├── test_sync.py # One-way sync tests
│ ├── test_two_way_sync.py # Two-way sync and state tracking tests
│ ├── test_upload_download.py # Upload/download operation tests
│ ├── test_cli.py # CLI command tests
│ ├── test_sync_state.py # Sync state management tests
│ └── conftest.py # Test fixtures and configuration
├── .env.example # Example environment configuration
├── pyproject.toml # Project dependencies and metadata
└── README.md # This file
API Functions Reference
File Operations
- `get_file_list(directory_id: int = 0, page_size: int = 100) -> Dict[str, Any]`: List files and folders in a directory
- `get_download_url(file_id: int, link_type: int = 0) -> Dict[str, Any]`: Get a temporary download URL for a file
- `upload_file(local_path: Path, directory_id: int, remote_filename: Optional[str] = None, overwrite: bool = False) -> Dict[str, Any]`: Upload a local file to Supernote Cloud; automatically skips if a file with the same MD5 already exists
- `create_folder(parent_directory_id: int, folder_name: str) -> Dict[str, Any]`: Create a new folder in Supernote Cloud
- `delete_files(directory_id: int, file_ids: Sequence[int]) -> Dict[str, Any]`: Delete files from Supernote Cloud (use with extreme caution!)
Sync Operations
- `sync_down(local_root: Optional[Path] = None, remote_directory_id: int = 0, include_prefixes: Optional[Sequence[str]] = None) -> None`: One-way sync, cloud → local
- `sync(local_root: Optional[Path] = None, remote_directory_id: int = 0, include_prefixes: Optional[Sequence[str]] = None, backup_remote: bool = False, handle_deletions: bool = True, trash_dir: Optional[Path] = None) -> None`: Two-way sync with conflict handling and state tracking
  - Uses a SQLite database (`.supersync_state.db`) to track sync history
  - Prevents concurrent sync operations via file locking
  - Distinguishes between new files and remotely deleted files
- `sync_file_up(local_path: Path, directory_id: int, remote_filename: Optional[str] = None, backup_remote: bool = False) -> None`: Upload a single file if local is newer
Helper Functions
- `find_remote_directory_by_path(path: str, start_directory_id: int = 0) -> Optional[int]`: Find a remote directory by its path (e.g., "Document/Thesis/technical-manuals"); returns the directory ID if found, `None` otherwise
- `ensure_remote_directory_by_path(path: str, start_directory_id: int = 0) -> int`: Find or create a remote directory by its path, creating any missing parent directories automatically; returns the directory ID
- `_find_remote_file_by_name(directory_id: int, filename: str) -> Optional[Dict[str, Any]]`: Find a file in a remote directory by name
- `_find_remote_folder_by_name(directory_id: int, folder_name: str) -> Optional[Dict[str, Any]]`: Find a folder in a remote directory by name
Smart Transfer Logic
SuperSync uses intelligent comparison and state tracking to avoid unnecessary transfers:
- MD5 Hash Comparison: Files with identical MD5 hashes are considered identical and skipped
- Modification Time Comparison: Files are only transferred if one version is newer than the other
- State Tracking: Sync state database tracks file history to distinguish new files from deleted files
- Conflict Detection: When both local and remote have been modified, both versions are preserved
This means:
- Re-running sync operations is fast (only changed files are transferred)
- No unnecessary bandwidth usage
- New files are correctly identified and uploaded (not deleted)
- Remotely deleted files are correctly identified and removed locally (not re-uploaded)
- Safe conflict resolution
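Taken together, the rules above amount to a small decision function for a file present on both sides. This is an illustrative sketch against an assumed last-synced baseline, not the actual implementation in `supersync.py`:

```python
def transfer_decision(local_md5: str, remote_md5: str,
                      local_mtime: float, remote_mtime: float,
                      last_synced_mtime: float) -> str:
    """Decide what to do with a file that exists both locally and remotely."""
    if local_md5 == remote_md5:
        return "skip"          # identical content, nothing to transfer
    local_changed = local_mtime > last_synced_mtime
    remote_changed = remote_mtime > last_synced_mtime
    if local_changed and remote_changed:
        return "conflict"      # keep both versions
    if remote_changed:
        return "download"      # remote is newer
    return "upload"            # local is newer


print(transfer_decision("a", "a", 10, 10, 10))
print(transfer_decision("a", "b", 20, 10, 10))
print(transfer_decision("a", "b", 10, 20, 10))
print(transfer_decision("a", "b", 20, 20, 10))
```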
Safety Features
- No Remote Deletions by Default: Sync operations never delete files on the remote unless you explicitly use `delete_files()`
- Trash Directory: Files deleted on remote are moved to a local trash directory (not permanently deleted)
- Backup Before Overwrite: Optional backup of remote files before overwriting with local versions
- Conflict Preservation: Both versions preserved when conflicts occur
- State Tracking: Sync state database prevents accidental deletion of new files and re-uploading of deleted files
- Concurrent Sync Protection: File locking prevents multiple sync operations from running simultaneously (prevents state corruption)
- Token Validation: Automatic token validation before operations
- Sync History: Comprehensive logging of all sync operations for debugging and audit trails (kept forever by default)
- Trash Recovery: Files moved to trash can be restored with the `trash restore` command
Examples
Example 1: Download All Notes
from supersync import sync_down
from pathlib import Path
sync_down(
    local_root=Path("~/Documents/Supernote").expanduser(),  # expanduser() resolves "~"
    include_prefixes=["Note"],
)
Example 2: Two-Way Sync with Backup
from supersync import sync
from pathlib import Path
sync(
    local_root=Path("~/Documents/Supernote").expanduser(),  # expanduser() resolves "~"
    include_prefixes=["Note/Work"],
    backup_remote=True,
    handle_deletions=True,
    trash_dir=Path("~/Documents/Supernote/.trash").expanduser(),
)
Example 3: Upload a Single File
from supersync import upload_file, get_file_list
from pathlib import Path
# Find Document folder
root = get_file_list(0)
doc_folder = next(i for i in root["userFileVOList"] if i["fileName"] == "Document" and i["isFolder"] == "Y")
doc_id = int(doc_folder["id"])
# Upload file
upload_file(Path("local_file.pdf"), doc_id)
Troubleshooting
CSRF Token Errors
If you encounter CSRF token validation errors:
- Ensure your token is valid: `python login.py` will check and refresh it if needed
- Check that `SUPERNOTE_API_BASE` is correct
- Try enabling debug mode: `SUPERNOTE_DEBUG=1 python supersync.py`
Token Expired
If your token expires:
- Run `python login.py` again to get a new token
- The script will automatically save it to `.env`
Upload/Download Failures
- Check that you have sufficient storage capacity
- Verify file permissions on local filesystem
- Ensure network connection is stable
- Check that file size is within limits
Sync Issues
- Files not syncing as expected: Check the sync state database (`.supersync_state.db`) to see what the sync function knows about your files
- New files being deleted: This shouldn't happen anymore with state tracking. If it does, check that the state database exists and is valid
- Files re-uploading after remote deletion: This is prevented by state tracking. If it happens, the file may not be in state - check the state database
- Concurrent sync errors: Only one sync operation can run at a time. Wait for the current sync to finish before starting another
- State database corruption: If the state database becomes corrupted, delete `.supersync_state.db` and the next sync will rebuild it
- If conflicts occur: Check the conflict files in your local directory (files with a `_conflict_<timestamp>` suffix)
- If files are moved to trash unexpectedly: Check if they were deleted on remote - this is expected behavior with `handle_deletions=True`. Use `supersync trash list` to see what's in trash and `supersync trash restore` to recover files.
- Debugging sync issues: Use `supersync history` to see what happened during sync operations. Filter by file, action, or status to narrow down issues.
- Checking sync state: Use `supersync status` to see sync statistics and identify untracked or missing files.
- History taking up space: Use `supersync history --cleanup 90d` to remove history older than 90 days (history is kept forever by default).
Security Notes
- Never commit `.env` - it contains sensitive credentials
- Tokens are stored in plain text in `.env` - protect this file
- Passwords are stored in `.env` if you choose to save them (not recommended)
- Be careful with `delete_files()` - deletions are permanent on the remote
Testing
Run the test suite using pytest:
# Install test dependencies
uv sync --extra test
# or
pip install -e ".[test]"
# Run all tests
pytest
# Run specific test file
pytest tests/test_sync.py
# Run with verbose output
pytest -v
# Run specific test
pytest tests/test_sync.py::test_sync_down_downloads_files
The test suite includes:
- Basic upload/download functionality
- Duplicate detection (MD5/mtime checks)
- Nested directories
- One-way sync (cloud → local)
- Two-way sync with conflict handling
- Sync state tracking (new files vs deleted files)
- Concurrent sync protection
- Deletion handling
- State persistence across sync runs
Note: Tests require a valid SUPERNOTE_TOKEN in your .env file and will use Document/supersync as a test playground folder. All tests use UUID-based unique filenames to avoid conflicts when running tests in parallel or after interruptions.
License
MIT License - see LICENSE file for details
Contributing
Contributions welcome! Please ensure code follows the existing style and includes appropriate tests.
Acknowledgments
- Inspired by the sncloud project for API endpoint discovery
- Uses the Supernote Cloud API (unofficial, reverse-engineered)