# rclone File Transfer Skill
## Setup Check (Always Run First)

Before any rclone operation, verify installation and configuration:

```shell
# Check if rclone is installed
command -v rclone >/dev/null 2>&1 && echo "rclone installed: $(rclone version | head -1)" || echo "NOT INSTALLED"

# List configured remotes
rclone listremotes 2>/dev/null || echo "NO REMOTES CONFIGURED"
```

## If rclone is NOT installed
Guide the user to install:

```shell
# macOS
brew install rclone

# Linux (script install)
curl https://rclone.org/install.sh | sudo bash

# Or via package manager
sudo apt install rclone   # Debian/Ubuntu
sudo dnf install rclone   # Fedora
```

## If NO remotes are configured
Walk the user through interactive configuration:

```shell
rclone config
```

Common provider setup quick reference:
| Provider | Type | Key Settings |
|---|---|---|
| AWS S3 | s3 | access_key_id, secret_access_key, region |
| Cloudflare R2 | s3 | access_key_id, secret_access_key, endpoint (account_id.r2.cloudflarestorage.com) |
| Backblaze B2 | b2 | account (keyID), key (applicationKey) |
| DigitalOcean Spaces | s3 | access_key_id, secret_access_key, endpoint (region.digitaloceanspaces.com) |
| Google Drive | drive | OAuth flow (opens browser) |
| Dropbox | dropbox | OAuth flow (opens browser) |
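The key-based providers in the table can also be configured non-interactively with `rclone config create`. For instance, a Backblaze B2 remote; the remote name `b2` and the credential placeholders are illustrative:

```shell
# Create a Backblaze B2 remote named "b2" (name and credentials are placeholders)
rclone config create b2 b2 \
  account=YOUR_KEY_ID \
  key=YOUR_APPLICATION_KEY
```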
**Example: Configure Cloudflare R2**

```shell
rclone config create r2 s3 \
  provider=Cloudflare \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  endpoint=ACCOUNT_ID.r2.cloudflarestorage.com \
  acl=private
```

**Example: Configure AWS S3**

```shell
rclone config create aws s3 \
  provider=AWS \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  region=us-east-1
```

## Common Operations
### Upload single file

```shell
rclone copy /path/to/file.mp4 remote:bucket/path/ --progress
```

### Upload directory

```shell
rclone copy /path/to/folder remote:bucket/folder/ --progress
```

### Sync directory (mirror, deletes removed files)

```shell
rclone sync /local/path remote:bucket/path/ --progress
```

### List remote contents

```shell
rclone ls remote:bucket/
rclone lsd remote:bucket/   # directories only
```

### Check what would be transferred (dry run)

```shell
rclone copy /path remote:bucket/ --dry-run
```

## Useful Flags
Section titled “Useful Flags”| Flag | Purpose |
|---|---|
--progress | Show transfer progress |
--dry-run | Preview without transferring |
-v | Verbose output |
--transfers=N | Parallel transfers (default 4) |
--bwlimit=RATE | Bandwidth limit (e.g., 10M) |
--checksum | Compare by checksum, not size/time |
--exclude="*.tmp" | Exclude patterns |
--include="*.mp4" | Include only matching |
--min-size=SIZE | Skip files smaller than SIZE |
--max-size=SIZE | Skip files larger than SIZE |
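These flags combine freely. A sketch of a filtered, bandwidth-capped upload, previewed first with `--dry-run` (the paths are illustrative):

```shell
# Preview a filtered upload: only .mp4 files over 1 MiB, 8 parallel transfers,
# capped at 10 MiB/s. Drop --dry-run to actually transfer. (Paths are examples.)
rclone copy /media/videos remote:bucket/videos/ \
  --include="*.mp4" \
  --min-size=1M \
  --transfers=8 \
  --bwlimit=10M \
  --dry-run --progress
```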
## Large File Uploads

For videos and other large files, use chunked uploads:

```shell
# S3 multipart upload (automatic above the default 200 MiB upload cutoff)
rclone copy large_video.mp4 remote:bucket/ --s3-chunk-size=64M --progress

# Resume interrupted transfers
rclone copy /path remote:bucket/ --progress --retries=5
```

## Verify Upload
```shell
# Check the file exists and matches the source
rclone check /local/file remote:bucket/file
```
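For a stronger integrity check, compare checksums directly; the paths below are illustrative, and hash support varies by backend:

```shell
# Compare local and remote MD5 checksums (hash support varies by backend)
md5sum /local/file
rclone md5sum remote:bucket/file
```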
```shell
# Get file info (size, modtime, path)
rclone lsl remote:bucket/path/to/file
```

## Troubleshooting
```shell
# Test connection
rclone lsd remote:

# Debug connection issues
rclone lsd remote: -vv

# Check config
rclone config show remote
```