Rclone

Before any rclone operation, verify installation and configuration:

# Check if rclone is installed
command -v rclone >/dev/null 2>&1 && echo "rclone installed: $(rclone version | head -1)" || echo "NOT INSTALLED"
# List configured remotes
rclone listremotes 2>/dev/null || echo "NO REMOTES CONFIGURED"
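
The checks above can be wrapped in a small helper that fails fast before any transfer. A sketch (the function name, and the remote name passed to it, are illustrative):

```shell
# Hypothetical preflight helper: succeeds only if rclone is installed
# and the named remote appears in `rclone listremotes`.
rclone_preflight() {
  local remote="$1"
  command -v rclone >/dev/null 2>&1 || { echo "rclone is not installed" >&2; return 1; }
  rclone listremotes 2>/dev/null | grep -qx "${remote}:" \
    || { echo "remote '${remote}' is not configured" >&2; return 1; }
}

# Example: gate a script on a remote named "remote" being ready.
rclone_preflight remote && echo "ready" || echo "fix installation/config first"
```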

If rclone is not installed, guide the user through installation:

# macOS
brew install rclone
# Linux (script install)
curl https://rclone.org/install.sh | sudo bash
# Or via package manager
sudo apt install rclone # Debian/Ubuntu
sudo dnf install rclone # Fedora

Walk the user through interactive configuration:

rclone config

Common provider setup quick reference:

| Provider | Type | Key Settings |
|----------|------|--------------|
| AWS S3 | s3 | access_key_id, secret_access_key, region |
| Cloudflare R2 | s3 | access_key_id, secret_access_key, endpoint (account_id.r2.cloudflarestorage.com) |
| Backblaze B2 | b2 | account (keyID), key (applicationKey) |
| DigitalOcean Spaces | s3 | access_key_id, secret_access_key, endpoint (region.digitaloceanspaces.com) |
| Google Drive | drive | OAuth flow (opens browser) |
| Dropbox | dropbox | OAuth flow (opens browser) |
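
Providers that don't require an OAuth browser flow can also be configured non-interactively with `rclone config create`, mirroring the examples below. A sketch for Backblaze B2 (the remote name and both credential values are placeholders):

```shell
# Hypothetical B2 remote named "b2"; substitute your real key ID and application key.
rclone config create b2 b2 \
    account=YOUR_KEY_ID \
    key=YOUR_APPLICATION_KEY
```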

Example: Configure Cloudflare R2

rclone config create r2 s3 \
provider=Cloudflare \
access_key_id=YOUR_ACCESS_KEY \
secret_access_key=YOUR_SECRET_KEY \
endpoint=ACCOUNT_ID.r2.cloudflarestorage.com \
acl=private

Example: Configure AWS S3

rclone config create aws s3 \
provider=AWS \
access_key_id=YOUR_ACCESS_KEY \
secret_access_key=YOUR_SECRET_KEY \
region=us-east-1
Copy a single file:

rclone copy /path/to/file.mp4 remote:bucket/path/ --progress
Copy a directory:

rclone copy /path/to/folder remote:bucket/folder/ --progress

Sync directory (mirror, deletes removed files)

rclone sync /local/path remote:bucket/path/ --progress
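
Because sync deletes destination files that no longer exist locally, preview the planned deletions before running it for real. A guarded sketch (paths and the remote name "remote" are placeholders):

```shell
# Only attempt the preview when rclone exists and a remote named "remote" is configured.
if command -v rclone >/dev/null 2>&1 && rclone listremotes 2>/dev/null | grep -qx "remote:"; then
  # --dry-run prints planned copies and deletions without changing anything
  rclone sync /local/path remote:bucket/path/ --dry-run -v || echo "dry run failed; fix before syncing" >&2
fi
```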
List remote contents:

rclone ls remote:bucket/
rclone lsd remote:bucket/ # directories only
Preview any operation without transferring:

rclone copy /path remote:bucket/ --dry-run
Useful flags:

| Flag | Purpose |
|------|---------|
| --progress | Show transfer progress |
| --dry-run | Preview without transferring |
| -v | Verbose output |
| --transfers=N | Parallel transfers (default 4) |
| --bwlimit=RATE | Bandwidth limit (e.g., 10M) |
| --checksum | Compare by checksum, not size/time |
| --exclude="*.tmp" | Exclude patterns |
| --include="*.mp4" | Include only matching |
| --min-size=SIZE | Skip files smaller than SIZE |
| --max-size=SIZE | Skip files larger than SIZE |
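
Include/exclude flags compose, but for more than a couple of patterns a filter file is easier to maintain. A sketch using rclone's --filter-from flag (paths, remote name, and patterns are illustrative):

```shell
# Write filter rules: "-" excludes, "+" includes; the first matching rule wins,
# and a trailing "- *" excludes everything not explicitly included.
cat > /tmp/rclone-filters.txt <<'EOF'
- *.tmp
- .DS_Store
+ *.mp4
- *
EOF

# Guarded: only attempt the transfer when rclone and the remote exist.
if command -v rclone >/dev/null 2>&1 && rclone listremotes 2>/dev/null | grep -qx "remote:"; then
  rclone copy /videos remote:bucket/videos/ --filter-from /tmp/rclone-filters.txt --dry-run
fi
```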

For videos and large files, use chunked uploads:

# S3 multipart upload (automatic for >200MB)
rclone copy large_video.mp4 remote:bucket/ --s3-chunk-size=64M --progress
# Resume interrupted transfers
rclone copy /path remote:bucket/ --progress --retries=5
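
S3 multipart uploads are capped at 10,000 parts, so very large files need a proportionally larger --s3-chunk-size. A quick arithmetic sanity check (the file size is a made-up example):

```shell
# With 64 MiB chunks, how many parts would a hypothetical 500 GiB file need?
FILE_SIZE_MB=512000   # 500 GiB expressed in MiB
CHUNK_MB=64
PARTS=$(( (FILE_SIZE_MB + CHUNK_MB - 1) / CHUNK_MB ))   # ceiling division
echo "parts needed: $PARTS"   # → parts needed: 8000

# S3 allows at most 10,000 parts per upload; warn if the chunk size is too small.
if [ "$PARTS" -gt 10000 ]; then
  echo "increase --s3-chunk-size" >&2
fi
```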
Verify transfers:
# Check file exists and matches
rclone check /local/file remote:bucket/file
# Get file info
rclone lsl remote:bucket/path/to/file
Troubleshooting:
# Test connection
rclone lsd remote:
# Debug connection issues
rclone lsd remote: -vv
# Check config
rclone config show remote
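
For unattended backups, the pieces above can be combined into a small wrapper that logs success or failure. A sketch (the function name, paths, and remote are assumptions, not part of rclone):

```shell
# Hypothetical backup wrapper: syncs a source to a remote path and reports status.
backup() {
  local src="$1" dest="$2"
  if ! command -v rclone >/dev/null 2>&1; then
    echo "rclone not installed; skipping backup" >&2
    return 0
  fi
  if rclone sync "$src" "$dest" --retries=5 --log-level INFO; then
    echo "backup OK: $src -> $dest"
  else
    echo "backup FAILED: $src -> $dest" >&2
  fi
}

# Example invocation (would normally run from cron):
backup /local/path remote:bucket/backup/
```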