FAQ
Solutions for quota and rate limit errors
How to fix quota reached/exceeded errors
Getting quota errors? Don't panic! Here's how to diagnose and fix them:
Types of quotas
Storage Quota
This means your cloud storage is full:
- "Quota exceeded"
- "Storage limit reached"
- "No space left"
API Rate Limits
This means you're making requests too fast:
- "Rate limit exceeded"
- "User rate limit exceeded"
- "Too many requests"
- "429 Error"
Transfer Quota
This means you've hit a daily transfer limit:
- "Download quota exceeded"
- "Daily limit reached"
- "Bandwidth quota exceeded"
- "Bandwidth limit exceeded"
Quick fixes
For Storage Quota

- Check your usage

```shell
# See how much space you're using
rclone about remote:

# Get the size of a specific folder
rclone size remote:folder
```

- Free up space

```shell
# Find large files
rclone ls remote: --min-size 100M

# Find old files
rclone lsl remote: --max-age 365d

# Remove duplicates (carefully!)
rclone dedupe remote: --dedupe-mode newest
```

- Empty trash

```shell
# Google Drive
rclone cleanup remote:
# Other providers may need the web interface
```
For API Rate Limits

- Slow down transfers

```shell
# Limit transactions per second
rclone copy /local remote: --tpslimit 10

# Keep API calls evenly spaced (no burst above the steady rate)
rclone copy /local remote: --tpslimit 10 --tpslimit-burst 1
```

- Reduce parallel transfers

```shell
# Default is 4; reduce to 1
rclone copy /local remote: --transfers 1
```

- Use bandwidth limits

```shell
# Limit to 1 MB/s
rclone copy /local remote: --bwlimit 1M

# Limit only during work hours
rclone copy /local remote: --bwlimit "08:00,512k 18:00,10M"
```

- Implement retries with backoff

```shell
# Retry failed operations
rclone copy /local remote: \
  --retries 10 \
  --retries-sleep 1s \
  --low-level-retries 10

# Use --fast-list to reduce API calls when listing
rclone copy /local remote: --fast-list
```

- Enable caching

```shell
# Use VFS caching for frequently accessed files
rclone mount remote: /mount/point \
  --vfs-cache-mode full \
  --vfs-cache-max-age 48h
```
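Note that `--retries-sleep` waits a fixed interval between top-level retries. For a true exponential backoff you can wrap the whole command in a small shell function; a minimal sketch (`retry_backoff` is a made-up helper name, not an rclone feature):

```shell
# Sketch: retry any command with exponentially increasing delays
# Usage: retry_backoff <max_attempts> <command> [args...]
retry_backoff() {
  local max=$1; shift
  local attempt=1
  local delay=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $attempt attempts" >&2
      return 1
    fi
    echo "attempt $attempt failed; retrying in ${delay}s" >&2
    sleep "$delay"
    delay=$((delay * 2))        # double the wait each round
    attempt=$((attempt + 1))
  done
}

# Example: retry_backoff 5 rclone copy /local remote: --transfers 1
```

This retries at 1s, 2s, 4s, ... intervals, which plays nicer with rate limiters than hammering at a fixed cadence.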
For Transfer Quota

- Wait it out

  - Google Drive: 750GB upload/day per user
  - OneDrive: 250GB upload/day
  - Dropbox: varies by plan

  Most quotas reset after 24 hours.

- Schedule transfers

```shell
# Run at night when quotas reset
# Linux/Mac cron example (02:00 daily):
0 2 * * * rclone sync /local remote:
```

- Spread across days

```shell
# Copy only files modified in the last day
rclone copy /local remote: --max-age 24h
```

- Split large transfers

```shell
# Transfer in chunks by modification date
rclone copy /photos remote:photos --max-age 30d
rclone copy /photos remote:photos --min-age 30d --max-age 60d
rclone copy /photos remote:photos --min-age 60d
```
Provider limits
Google Drive
- Upload: 750GB per day
- Download: No official limit (but ~10TB/day practical)
- Shared drives: 400,000 files limit
- Rate limit: 1000 queries per 100 seconds
Fix: Use service accounts for higher limits
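Since each account gets its own quota, a service account credential can carry its own allowance. A sketch, assuming a Drive remote named `gdrive:` and a placeholder key path; `--drive-impersonate` only works on Google Workspace domains with domain-wide delegation enabled:

```shell
# Sketch: authenticate the gdrive: remote with a service account key
# (the JSON path and user email below are placeholders)
rclone copy /local gdrive:backup \
  --drive-service-account-file /path/to/service-account.json \
  --drive-impersonate user@example.com
```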
OneDrive
- Upload: 250GB per file, 100GB via web
- Total storage: Varies by plan (5GB free, 1TB with Office 365)
- Path length: 400 characters total
Fix: Upgrade plan or use multiple accounts
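One way to combine multiple accounts is rclone's union backend, which pools several remotes into one logical remote. A sketch, assuming two already-configured remotes named `onedrive1:` and `onedrive2:`:

```shell
# Sketch: pool two OneDrive remotes into one logical remote
rclone config create od-pool union upstreams "onedrive1: onedrive2:"

# New files land on whichever upstream the union policy selects
rclone copy /local od-pool:
```

The union backend's create/search policies are tunable if you want writes to go to the emptiest upstream.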
Dropbox
- Rate limiting: Aggressive for free accounts
- Upload: 350GB per day (Business)
- Bandwidth: Varies by plan
Fix: Use --dropbox-chunk-size 48M
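The chunk-size fix can be combined with request pacing. A sketch, assuming a configured `dropbox:` remote; 48M stays under Dropbox's 150M chunk ceiling, and `--tpslimit 12` is a commonly suggested pace for Dropbox's rate limiter:

```shell
# Sketch: larger chunks mean fewer upload requests; pacing avoids 429s
rclone copy /local dropbox: \
  --dropbox-chunk-size 48M \
  --tpslimit 12
```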
AWS S3
- Requests: 3,500 PUT/COPY/POST/DELETE and 5,500 GET/HEAD per second per prefix
- Storage: Unlimited (pay per use)
- Bandwidth: Unlimited (pay per use)
Fix: Usually not quotas but costs!
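Because S3 bills per request, reducing API calls cuts cost directly. A sketch, assuming a configured `s3:` remote and a bucket named `bucket`:

```shell
# Sketch: --fast-list batches listings into fewer LIST requests
# (trading memory for API calls); --checksum compares MD5s from the
# listing instead of issuing per-object HEAD requests for modtimes
rclone sync /local s3:bucket --fast-list --checksum
```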