Question

Spaces/rclone: unable to list or copy (403 SignatureDoesNotMatch and 400 BucketName)

Posted September 21, 2021
DigitalOcean Spaces

rclone running on Ubuntu 21.04. Spaces is set up, the access credentials have been double-checked as correct, and the ACL is set to private.

cat ~/.config/rclone/rclone.conf
[hab-sync]
type = s3
provider = DigitalOcean
env_auth = false
access_key_id = <correct_access_key>
secret_access_key = <correct_secret>
region = ""
endpoint = sfo3.digitaloceanspaces.com
acl = private
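One detail worth ruling out: rclone config files are plain ini-style, so values are taken literally and quotes are not stripped. With `region = ""`, the two-character string `""` may be fed into the AWS v4 signature calculation, which would produce exactly this kind of SignatureDoesNotMatch error. A sketch of the same config with the region left genuinely empty (this is an assumption about the cause, not a confirmed fix):

```ini
[hab-sync]
type = s3
provider = DigitalOcean
env_auth = false
access_key_id = <correct_access_key>
secret_access_key = <correct_secret>
# leave the value empty (no quote characters), or omit the line entirely
region =
endpoint = sfo3.digitaloceanspaces.com
acl = private
```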

Version:

rclone v1.56.1
- os/version: ubuntu 21.04 (64 bit)
- os/kernel: 5.11.0-25-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.16.8
- go/linking: static
- go/tags: none

Running ls:

rclone ls hab-sync:/
(nothing)
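For S3-type remotes, `rclone ls remote:` recursively lists objects across all buckets, so an empty result can also simply mean no objects were visible. To see the Spaces (buckets) themselves, `rclone lsd` is the usual check:

```shell
# List the Spaces (buckets) visible to these credentials
rclone lsd hab-sync:

# List objects inside one specific Space
rclone ls hab-sync:smartrooms-hab-sync
```

If `lsd` also fails with SignatureDoesNotMatch, the problem is in authentication/signing rather than in the bucket contents.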

Running mkdir:

rclone -vv mkdir hab-sync:5fb225099ba3874e9767b
2021/09/21 13:16:50 DEBUG : rclone: Version "v1.56.1" starting with parameters ["rclone" "-vv" "mkdir" "hab-sync:5fb225099ba3874e9767b"]
2021/09/21 13:16:50 DEBUG : Creating backend with remote "hab-sync:5fb225099ba3874e9767b"
2021/09/21 13:16:50 DEBUG : Using config file from "/home/ac/.config/rclone/rclone.conf"
2021/09/21 13:16:50 DEBUG : S3 bucket 5fb225099ba3874e9767b: Making directory
2021/09/21 13:16:51 ERROR : Attempt 1/3 failed with 1 errors and: SignatureDoesNotMatch: 
    status code: 403, request id: , host id: 
2021/09/21 13:16:51 DEBUG : S3 bucket 5fb225099ba3874e9767b: Making directory
2021/09/21 13:16:52 ERROR : Attempt 2/3 failed with 1 errors and: SignatureDoesNotMatch: 
    status code: 403, request id: , host id: 
2021/09/21 13:16:52 DEBUG : S3 bucket 5fb225099ba3874e9767b: Making directory
2021/09/21 13:16:52 ERROR : Attempt 3/3 failed with 1 errors and: SignatureDoesNotMatch: 
    status code: 403, request id: , host id: 
2021/09/21 13:16:52 DEBUG : 4 go routines active
2021/09/21 13:16:52 Failed to mkdir: SignatureDoesNotMatch: 
    status code: 403, request id: , host id: 
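Two debugging steps that may narrow this down (a sketch, assuming the config-file region value is the suspect): override the region on the command line, and dump the signed request headers to see what rclone actually sends.

```shell
# Override the backend region from the CLI, bypassing the config-file value
rclone -vv --s3-region "" mkdir hab-sync:5fb225099ba3874e9767b

# Dump request/response headers to inspect the signed request
rclone -vv --dump headers lsd hab-sync:
```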


Oddly, attempting to copy eventually works, after first erroring:

rclone copy ./engine hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine
2021/09/21 13:04:23 ERROR : docker-compose.yml: Failed to copy: s3 upload: 400 Bad Request: <?xml version="1.0" encoding="UTF-8"?><Error><Code>InvalidArgument</Code><BucketName>smartrooms-hab-sync</BucketName><RequestId>tx00000000000000282786f-00614a3ac7-b55a2a5-sfo3a</RequestId><HostId>b55a2a5-sfo3a-sfo3-zg01</HostId></Error>

Then

rclone -vv copy ./engine hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine
2021/09/21 13:17:33 DEBUG : rclone: Version "v1.56.1" starting with parameters ["rclone" "-vv" "copy" "./engine" "hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine"]
2021/09/21 13:17:33 DEBUG : Creating backend with remote "./engine"
2021/09/21 13:17:33 DEBUG : Using config file from "/home/ac/.config/rclone/rclone.conf"
2021/09/21 13:17:33 DEBUG : fs cache: renaming cache item "./engine" to be canonical "/srv/openhab/engine"
2021/09/21 13:17:33 DEBUG : Creating backend with remote "hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine"
2021/09/21 13:17:33 DEBUG : S3 bucket smartrooms-hab-sync path 5fb225099ba3874e9767b/engine: Waiting for checks to finish
2021/09/21 13:17:33 DEBUG : S3 bucket smartrooms-hab-sync path 5fb225099ba3874e9767b/engine: Waiting for transfers to finish
2021/09/21 13:17:33 DEBUG : docker-compose.yml: md5 = bb9f232ef81d3af978d5a9c21a0f5ac7 OK
2021/09/21 13:17:33 INFO  : docker-compose.yml: Copied (new)
2021/09/21 13:17:33 DEBUG : prepare.sh: md5 = 3d827dfb47dc50c31989ec17cc73426d OK
2021/09/21 13:17:33 INFO  : prepare.sh: Copied (new)
2021/09/21 13:17:33 INFO  : 
Transferred:            978 / 978 Byte, 100%, 0 Byte/s, ETA -
Transferred:            2 / 2, 100%
Elapsed time:         0.3s

2021/09/21 13:17:33 DEBUG : 9 go routines active


1 answer

Hey @acShark

If you would like to copy files from one bucket to another, you can use the rclone sync command.
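A minimal sketch of a one-way sync, reusing the paths from the question (note that `sync` deletes remote files that no longer exist on the source side, so `--dry-run` first is prudent):

```shell
# Preview what would change, without transferring or deleting anything
rclone sync ./engine hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine --dry-run

# Then run it for real
rclone sync ./engine hab-sync:smartrooms-hab-sync/5fb225099ba3874e9767b/engine
```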

You can also use s3cmd if you want to manage your bucket.

Installation: https://docs.digitalocean.com/products/spaces/resources/s3cmd/

Usage: https://docs.digitalocean.com/products/spaces/resources/s3cmd-usage/
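For reference, typical s3cmd usage against a Space looks like the following (a sketch, assuming `s3cmd --configure` has already been run with the Spaces endpoint and keys; the bucket and paths are taken from the question):

```shell
# List the contents of the Space
s3cmd ls s3://smartrooms-hab-sync

# Upload a single file into a path within the Space
s3cmd put ./engine/docker-compose.yml s3://smartrooms-hab-sync/5fb225099ba3874e9767b/engine/
```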