How do I enable gzip compression on Spaces?

(I’m referring to the DigitalOcean Spaces + CDN Product)

How do I enable gzip compression on web assets that I store in a Space and serve through the CDN? I’ve set the Content-Encoding metadata value to gzip and flushed the cache, but it hasn’t worked.


2022 and still not supported?


Shame on you, DigitalOcean.

It’s 2019 and there is still no gzip? Really?

Your “CDN” is useless; there is no way to serve static files properly. So what is it even for?

I know this is very sub-optimal, but I wrote a simple JS script to gzip my JS files before pushing them to Spaces (using the aws-sdk S3 client; the bucket name, file name, and region endpoint below are placeholders):

const fs = require('fs');
const AWS = require('aws-sdk');
const gzipme = require('gzipme');

const bucketName = 'my-space'; // placeholder: your Space name

gzipme('filename', true); // true means overwrite the file.

const s3 = new AWS.S3({ endpoint: 'nyc3.digitaloceanspaces.com' }); // your Spaces region

const base64data = fs.readFileSync('filename');

s3.upload({
    Bucket: bucketName,
    Key: 'filename',
    Body: base64data,
    ACL: 'public-read',
    ContentType: 'application/javascript;charset=utf-8',
    ContentEncoding: 'gzip', // important
    CacheControl: 'max-age=86400' // optional
}, function (err, data) {
    if (err) throw err;
    console.log('Successfully uploaded file:', data.Location);
});

After doing this for all of my files, I purge CDN cache.

It works for me so far.

This is still an issue. As stated before, the docs mention:

File metadata headers, like Content-Encoding, are not passed through the CDN.

Which means, while Content-Encoding headers are correctly set for the origin URLs, they are not set for the CDN (endpoint URLs).

Hence, you have a tradeoff between serving via CDN or serving with compression, which is very unfortunate.

I am guessing that an intended use case is to have Spaces act as a form of persistent storage for a server (e.g. nginx), which retrieves the files via the CDN and then serves them with compression. Might be far-fetched though :)
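For what it’s worth, that setup is easy to sketch in nginx: proxy the CDN endpoint and let nginx compress responses on the way out. Everything below (hostnames, types) is a hypothetical placeholder, not a tested configuration:

```nginx
# Hypothetical: nginx fronting a Spaces CDN endpoint and
# compressing responses itself. Hostnames are placeholders.
server {
    listen 80;
    server_name static.example.com;

    # Compress text-based assets before sending them to clients.
    gzip on;
    gzip_types application/javascript text/css application/json image/svg+xml;

    location / {
        proxy_pass https://my-space.nyc3.cdn.digitaloceanspaces.com;
        proxy_set_header Host my-space.nyc3.cdn.digitaloceanspaces.com;
    }
}
```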

For (Laravel) PHP you can also use gzencode (see the PHP docs) to gzip-encode images or other files before you send them to your CDN. And this applies to any CDN, whether it is Amazon S3 or DigitalOcean Spaces. Laravel example by Nivesh Saharan:

$disk = config(''); // Returns s3

$settings = $shop->app_settings(); // Returns an array of all the settings

$content = view()->make('scripts.settings', compact('settings'))->__toString(); // Loads a view with all the settings

$gzippedContent = gzencode($content); // gzip compress the content

$fileName = 'settings/' . $shop->domain . '.js';

// Here we upload the gzipped content to S3
$response = \Storage::disk($disk)->put($fileName, $gzippedContent, [
    'visibility'      => 'public',
    'ContentType'     => 'application/javascript',
    'ContentEncoding' => 'gzip',
]);
The example is again a JS file. Most of us want this especially for images, so you would need to change the ContentType and make some other tweaks, but overall it is a very good example.

Yeah… this kinda sucks. Should be a simple setting for us to gzip at the server level.

This is the main blocker keeping us from moving from CloudFront to Spaces to serve our static files.


This has been discussed before, but from what I know, gzip compression is not fully supported on the Spaces CDN. It is most likely disabled to ensure maximum compatibility for end users who pull resources from the CDN, even though most browsers across many devices support gzip.

You can also open a ticket with their support team, and they can fully explain whether gzip compression is switched off, whether some static content is compressed by default, or whether they’re using something else on their end.

Let us know how it goes.


Same for me: compression breaks my JavaScript files.

Hey, did you find any solution? For me, gzip encoding only works on the origin link.