# rclone copy

A guide to the rclone copy command: the safe way to back up files.

copy is the safest rclone command: it only adds files to the destination and never deletes anything.
| Scenario | Use copy | Use sync | Use move |
|---|---|---|---|
| Regular backups | ✅ | ⚠️ Deletes extra files | ❌ |
| First-time transfer | ✅ | ✅ | ⚠️ Removes source |
| Keep source files | ✅ | ✅ | ❌ |
| Mirror directories | ❌ | ✅ | ❌ |
| Free up space | ❌ | ❌ | ✅ |
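Whichever command the table points you to, a `--dry-run` first pass shows what would happen without changing any data. A minimal sketch using hypothetical local paths (rclone treats a plain path as the local filesystem):

```shell
# Hypothetical demo tree: one file only in the source, one only in the destination
mkdir -p /tmp/demo-src /tmp/demo-dst
echo "keep" > /tmp/demo-src/keep.txt
echo "extra" > /tmp/demo-dst/extra.txt

# copy --dry-run reports that it would add keep.txt and nothing else;
# sync --dry-run additionally reports that it would delete extra.txt
rclone copy /tmp/demo-src /tmp/demo-dst --dry-run
rclone sync /tmp/demo-src /tmp/demo-dst --dry-run

# --dry-run changes nothing: extra.txt is still in the destination
ls /tmp/demo-dst
```

If the sync preview lists deletions you did not expect, fall back to copy.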
## Quick start

```shell
# Basic syntax
rclone copy SOURCE DESTINATION

# Local to cloud
rclone copy /home/photos gdrive:photos-backup

# Cloud to local
rclone copy dropbox:documents /home/backup/

# Cloud to cloud
rclone copy gdrive:important s3:backup
```

## Usage
### Filtering Files

#### By Size

```shell
# Only large files
rclone copy /source remote:dest --min-size 100M

# Only small files
rclone copy /source remote:dest --max-size 10M

# Size range
rclone copy /source remote:dest \
  --min-size 1M \
  --max-size 100M
```

#### By Age
```shell
# Files from the last week
rclone copy /source remote:dest --max-age 7d

# Files older than 30 days
rclone copy /source remote:dest --min-age 30d

# Specific date range: files modified during 2023
# (--max-age keeps files newer than the date, --min-age keeps files older)
rclone copy /source remote:dest \
  --max-age 2023-01-01 \
  --min-age 2024-01-01
```

#### By Name Pattern
```shell
# Include patterns
rclone copy /source remote:dest \
  --include "*.jpg" \
  --include "*.png"

# Exclude patterns
rclone copy /source remote:dest \
  --exclude "*.tmp" \
  --exclude "~*"
```
```shell
# Complex filters from a file (rules are applied in order, first match wins)
echo "- *.tmp" > filters.txt
echo "+ *.jpg" >> filters.txt
rclone copy /source remote:dest --filter-from filters.txt
```

### Bandwidth Limiting
```shell
# Limit to 1 MiB/s
rclone copy /source remote:dest --bwlimit 1M

# Time-based limits: 512 KiB/s from 08:00, 10 MiB/s from 18:00, unlimited from 23:00
rclone copy /source remote:dest \
  --bwlimit "08:00,512k 18:00,10M 23:00,off"
```

### Preserve Metadata
```shell
# Keep extra metadata (permissions, ownership where supported) and translate symlinks
rclone copy /source remote:dest \
  --metadata \
  --links
```

Modification times are preserved by default, so no extra flag is needed for them.

### Verification
```shell
# Compare checksums instead of size and modtime when deciding what to transfer
rclone copy /source remote:dest --checksum

# Extra paranoid mode: run all the checks before starting any transfers
rclone copy /source remote:dest \
  --checksum \
  --check-first
```

## Troubleshooting
### "File already exists"

This is not a problem: copy skips files that already exist and are unchanged. Add `--update` if you want newer source files to overwrite older destination copies.
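A sketch of the `--update` behavior with hypothetical local paths:

```shell
# Hypothetical demo: same filename on both sides, destination copy backdated
mkdir -p /tmp/upd-src /tmp/upd-dst
echo "new contents" > /tmp/upd-src/report.txt
echo "old contents" > /tmp/upd-dst/report.txt
touch -t 202001010000 /tmp/upd-dst/report.txt  # make the destination copy older

# With --update, the newer source file replaces the older destination file;
# a destination file newer than the source would be left alone
rclone copy /tmp/upd-src /tmp/upd-dst --update
```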
### Slow Transfers

```shell
# Increase parallelism
--transfers 32 --checkers 16

# For large files on Google Drive
--drive-chunk-size 128M

# For many small files on remotes that support recursive listing
--fast-list
```

### Running Out of Memory
```shell
# Reduce the per-file buffer size
--buffer-size 16M

# Fewer concurrent transfers
--transfers 4
```

### Interrupted Transfers
Just run the same command again. rclone compares size and modification time, skips files that already match at the destination, and picks up where it left off. (Files that were only partially transferred are re-copied from the beginning.)
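A minimal illustration of that idempotence, again with hypothetical local paths:

```shell
# Hypothetical demo: running the same copy twice is harmless
mkdir -p /tmp/resume-src /tmp/resume-dst
echo "payload" > /tmp/resume-src/a.txt

rclone copy /tmp/resume-src /tmp/resume-dst  # first run transfers a.txt
rclone copy /tmp/resume-src /tmp/resume-dst  # second run finds it unchanged and skips it
```

This is exactly what makes rerunning after an interruption safe: completed files cost nothing the second time around.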
Remember: When in doubt, use copy instead of sync! It's always safer to have extra files than to accidentally delete something important.
For more tips, see the tips & tricks guide.