Hello,
I’m trying to upload a file according to these instructions: https://www.digitalocean.com/community/questions/how-to-upload-an-object-to-digital-ocean-spaces-using-python-boto3-library
This is my code:
```python
#!/usr/bin/env python3
import boto3

boto3.set_stream_logger('')

key = 'KEY'
secret = 'SECRET'
space_name = 'my-space-name'
filename = 'test.7z'

session = boto3.session.Session()
client = session.client('s3',
                        region_name='ams3',
                        endpoint_url='https://ams3.digitaloceanspaces.com',
                        aws_access_key_id=key,
                        aws_secret_access_key=secret)

client.upload_file(filename,    # Path to local file
                   space_name,  # Name of Space
                   filename)    # Name for remote file
```
However, the script never terminates and the upload fails. This is a part of the debug output:
```
PUT
/MYFILENAME
content-md5:7eHKybMEoOqUlWWM5c+5Qg==
host:ams3.digitaloceanspaces.com
x-amz-content-sha256:UNSIGNED-PAYLOAD
x-amz-date:20180104T224246Z
content-md5;host;x-amz-content-sha256;x-amz-date
UNSIGNED-PAYLOAD
2018-01-04 23:42:46,436 botocore.auth [DEBUG] StringToSign:
AWS4-HMAC-SHA256
20180104T224246Z
20180104/us-east-1/s3/aws4_request
63867e94f9f05f2dcbcad9a3c71c57b29b31f3fca17160ccce9214c174eac402
2018-01-04 23:42:46,436 botocore.auth [DEBUG] Signature:
a577db0b4b5d6127cb787be070b26ed368fff7e7c0883588f771cb5ca26d3f19
2018-01-04 23:42:46,436 botocore.hooks [DEBUG] Event request-created.s3.PutObject: calling handler <function signal_transferring at 0x7f6df549a268>
2018-01-04 23:42:46,437 botocore.endpoint [DEBUG] Sending http request: <PreparedRequest [PUT]>
2018-01-04 23:42:46,438 botocore.vendored.requests.packages.urllib3.connectionpool [INFO] Starting new HTTPS connection (3): ams3.digitaloceanspaces.com
2018-01-04 23:42:46,602 botocore.awsrequest [DEBUG] Waiting for 100 Continue response.
2018-01-04 23:42:46,633 botocore.awsrequest [DEBUG] 100 Continue response seen, now sending request body.
```
Eventually the connection times out and a new attempt is made.
I can upload a very small file containing only the text test123; the script still does not terminate, but the file does become visible in the Space. The file I’m actually trying to upload is about 300 KB.
I would greatly appreciate any help you might be able to give. Thank you very much!
Most likely this was some sort of temporary local networking issue or an issue with Spaces itself. If you experience something like this, it’s worth checking status.digitalocean.com. If that is not the case, there are some settings you can change that might help.
When uploading large files with the upload_file() function, boto3 will automatically use a “multipart upload.” This breaks the file into smaller chunks and uploads them separately. This can be very useful to prevent timeouts. By default, this will only happen for files over 8 MB, but this behavior is configurable. If you are still experiencing this problem and it wasn’t just a transient network issue, you might want to try setting a smaller threshold. For example, you could try:
```python
from boto3.s3.transfer import TransferConfig

# Use multipart uploads for files over ~50 KB, in ~50 KB chunks
config = TransferConfig(multipart_threshold=50000,
                        multipart_chunksize=50000)

client.upload_file(filename,
                   space_name,
                   filename,
                   Config=config)
```
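To make the two settings concrete, here is a stdlib-only sketch of the part-planning they control. This is a simplified illustration, not boto3's actual implementation: files at or under the threshold go up as a single request, anything larger is split into chunk-sized parts:

```python
def plan_multipart(size_bytes, threshold, chunksize):
    """Return a list of (offset, length) parts for an upload.

    Files at or under `threshold` bytes are sent as one part;
    larger files are split into parts of at most `chunksize` bytes.
    """
    if size_bytes <= threshold:
        return [(0, size_bytes)]
    return [(offset, min(chunksize, size_bytes - offset))
            for offset in range(0, size_bytes, chunksize)]

# A ~300 KB file with a 50 KB threshold and chunk size becomes 6 parts
parts = plan_multipart(300_000, threshold=50_000, chunksize=50_000)
print(len(parts))  # 6
```

Smaller parts mean each individual request finishes (or fails) faster, which is why lowering the threshold can help with connections that stall mid-transfer.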
See the boto3 documentation for the full set of available config options and their defaults.