Question
DigitalOcean Spaces: cannot download directory with more than 1000 files
My goal is to recursively download a directory from DigitalOcean Spaces to my local machine (nothing fancy).
If I attempt to use the AWS CLI for the job, aws s3 commands download only the first 1000 files from any directory that contains more than 1000 files.
To reproduce:
- Create directory test1 with 1007 files locally
- Upload this directory to Spaces:
aws s3 cp --recursive --endpoint=https://reg1.digitaloceanspaces.com ./test1/ s3://mybucket/test1/
- Upload works as expected (uploads all 1007 files)
- Try to download this directory locally:
aws s3 cp --recursive --endpoint=https://reg1.digitaloceanspaces.com s3://mybucket/test1/ ./download1/
- Downloaded directory contains only 1000 files
Adding the argument --page-size=500 to the aws s3 cp command downloads only the first 500 files, so only the first page is ever fetched.
Reproduced on Linux and macOS, with multiple AWS CLI and Python versions.
Is downloading an entire bucket or directory not possible with the AWS CLI? This seems like a very basic feature that should just work.
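
In case it helps while waiting for an answer: below is a minimal Python/boto3 sketch that paginates the listing manually with list_objects_v2 continuation tokens and downloads each key, instead of relying on aws s3 cp. The bucket, prefix, endpoint, and download1 destination are taken from the example above; credentials are assumed to come from the usual AWS config/environment. Whether this actually retrieves more than 1000 objects depends on the Spaces endpoint honoring continuation tokens, so treat it as a workaround sketch rather than a confirmed fix.

import os
import boto3

# Spaces is S3-compatible, so boto3's S3 client can talk to it via endpoint_url.
client = boto3.client(
    "s3",
    endpoint_url="https://reg1.digitaloceanspaces.com",  # endpoint from the question
)

bucket = "mybucket"
prefix = "test1/"
dest = "./download1"

# list_objects_v2 returns at most 1000 keys per call; keep requesting pages
# and pass the continuation token back until IsTruncated is False.
kwargs = {"Bucket": bucket, "Prefix": prefix}
while True:
    resp = client.list_objects_v2(**kwargs)
    for obj in resp.get("Contents", []):
        key = obj["Key"]
        if key.endswith("/"):  # skip directory placeholder objects
            continue
        local_path = os.path.join(dest, os.path.relpath(key, prefix))
        os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)
        client.download_file(bucket, key, local_path)
    if not resp.get("IsTruncated"):
        break
    kwargs["ContinuationToken"] = resp["NextContinuationToken"]

boto3's built-in paginator (client.get_paginator("list_objects_v2")) does the same token handling internally; the explicit loop just makes it visible whether the endpoint is setting IsTruncated and NextContinuationToken at all, which is useful for narrowing down where the 1000-object cutoff happens.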
Uhh, how can you implement an “S3 compatible” option and call it PRODUCTION when it only lists 1000 files? This is a horribly crippling issue.
Where is someone from DO on this? This should never have rolled out live. This breaks so many usage models.