Question

Has anyone used the aws-sdk to operate on DO Spaces? If so, how do you do it?

I’m trying to wrap my head around this, but no luck so far. The DO documentation states that the Spaces API is designed to be used like S3 in most cases, so I’ve been trying to use the aws-sdk for Node.js to create buckets in Spaces and send files there. Here is what I’m trying:

var AWS = require("aws-sdk");
var fs = require("fs");
var EP = new AWS.Endpoint("kdsuserdata.nyc3.digitaloceanspaces.com");

fs.readFile(file.path, function (err_file, data) {
  if (err_file) throw err_file; // Something went wrong!
  var s3bucket = new AWS.S3({endpoint: EP, params: {Bucket: result._id}}); // MongoDB user id
  s3bucket.createBucket(function () {
    var params = {Key: file.name, Body: file};
    s3bucket.upload(params, function (err_s3, data_s3) {
      fs.unlink(file.path, function (err) {
        if (err) {
          console.error(err);
        }
        console.log('Temp File Delete');
      });
      if (err_s3) throw err_s3;

      return res.json({result: data_s3, err: null});
    });
  });
});

With this I’m getting the error below:

/Users/jh/Documents/dash/node_modules/aws-sdk/lib/request.js:31
            throw err;
            ^

Error: Unsupported body payload object
    at ManagedUpload.self.fillQueue (/Users/jh/Documents/dash/node_modules/aws-sdk/lib/s3/managed_upload.js:90:21)
    at ManagedUpload.send (/Users/jh/Documents/dash/node_modules/aws-sdk/lib/s3/managed_upload.js:199:33)
    at features.constructor.upload (/Users/jh/Documents/dash/node_modules/aws-sdk/lib/services/s3.js:1067:50)
    at Response.<anonymous> (/Users/jh/Documents/dash/server/api/client/client.controller.js:217:22)
    at Request.<anonymous> (/Users/jh/Documents/dash/node_modules/aws-sdk/lib/request.js:364:18)
    at Request.callListeners (/Users/jh/Documents/dash/node_modules/aws-sdk/lib/sequential_executor.js:105:20)

Any help will be appreciated.


I don’t know if I’m late to this, but I played around with it and managed to come up with a solution. Yes, you can use the aws-sdk, and it works. Using your code, I ended up with a new bucket and a file in it.

I used the following code:

var AWS = require("aws-sdk");
var EP = new AWS.Endpoint("nyc3.digitaloceanspaces.com");

// The Spaces access key and secret are picked up through the SDK's default
// credential chain (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment
// variables or the shared credentials file).
var s3bucket = new AWS.S3({endpoint: EP, params: {Bucket: 'testabc'}});
s3bucket.createBucket(function () {
    var params = {Key: 'Test1', Body: 'Hello!'};
    s3bucket.upload(params, function (err_s3, data_s3) {
        if (err_s3) throw err_s3;
        console.log(data_s3);
    });
});

This created a bucket named testabc with a file Test1 in it. As you can see, the mistake in your code is in the EP variable: the endpoint format for an existing bucket is {BUCKET}.{REGION}.digitaloceanspaces.com, but since you are creating a new bucket you should drop the bucket name from the endpoint and use just {REGION}.digitaloceanspaces.com.
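
As a side note, the "Unsupported body payload object" error in your trace most likely comes from Body: file; upload() expects a string, Buffer, or stream, and data (the Buffer returned by fs.readFile) is probably what you meant to send. Here is a rough, untested sketch of your handler with both changes applied (file, result, and res come from your own surrounding code):

var AWS = require("aws-sdk");
var fs = require("fs");
var EP = new AWS.Endpoint("nyc3.digitaloceanspaces.com"); // region only, no bucket name

fs.readFile(file.path, function (err_file, data) {
  if (err_file) throw err_file;
  var s3bucket = new AWS.S3({endpoint: EP, params: {Bucket: result._id}});
  s3bucket.createBucket(function () {
    // Body is the Buffer returned by fs.readFile, not the file descriptor object
    var params = {Key: file.name, Body: data};
    s3bucket.upload(params, function (err_s3, data_s3) {
      fs.unlink(file.path, function (err) {
        if (err) console.error(err);
      });
      if (err_s3) throw err_s3;
      return res.json({result: data_s3, err: null});
    });
  });
});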

Hello! I’m also trying to configure Spaces instead of AWS, but as far as I can tell the authorization does not work; I get a 403 error. How do I sign the header in this case? Do I understand correctly that I need to rewrite everything for aws4 (Signature Version 4) authorization?

AWS.config.accessKeyId = settings.aws.AWS_ACCESS_KEY_ID;
AWS.config.secretAccessKey = settings.aws.AWS_SECRET_ACCESS_KEY;

const bucketName = settings.aws.S3_BUCKET;
const EP = new AWS.Endpoint("nyc3.digitaloceanspaces.com");
const s3bucket = new AWS.S3({ endpoint: EP });

const s3Params = {
    Bucket: bucketName,
    Key: fileName,
    Expires: 60,
    ContentType: file.type,
    ACL: 'public-read',
};

s3bucket.getSignedUrl('putObject', s3Params, (err, data) => {
    console.log(err, data)
    if (err) {
        debug('image loading error', err);
    } else {
        const returnData = {
            requestUrl: data,
            imageUrl: `${settings.cdn.full}${fileName}`,
        };
        res.json(ImageResponse(returnData));
    }
});
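
For reference, the JavaScript SDK can already produce Signature Version 4 requests without rewriting anything. A 403 from Spaces more often means the request is signed with AWS credentials instead of a Spaces key pair generated in the DigitalOcean control panel, or that the eventual PUT does not send the same Content-Type and ACL headers that were signed. A minimal, untested sketch with explicit credentials and v4 signing (settings, fileName, and file.type are borrowed from the snippet above):

const AWS = require('aws-sdk');

const EP = new AWS.Endpoint('nyc3.digitaloceanspaces.com');
const s3bucket = new AWS.S3({
    endpoint: EP,
    accessKeyId: settings.aws.AWS_ACCESS_KEY_ID,         // must be a Spaces access key
    secretAccessKey: settings.aws.AWS_SECRET_ACCESS_KEY, // and its matching secret
    signatureVersion: 'v4',
});

const s3Params = {
    Bucket: settings.aws.S3_BUCKET,
    Key: fileName,
    Expires: 60,
    ContentType: file.type,
    ACL: 'public-read',
};

s3bucket.getSignedUrl('putObject', s3Params, (err, url) => {
    if (err) return console.error('presign error', err);
    // The client must PUT with the same Content-Type and an
    // x-amz-acl: public-read header, otherwise the signature
    // check fails and Spaces answers 403.
    console.log(url);
});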