Question

POST request to server doesn't work above a certain size

Posted September 18, 2020
Tags: API, R

I have successfully deployed my R plumber file to a DigitalOcean droplet (ubuntu-s-1vcpu-2gb-fra1-01). The API works as expected, but I have a problem when I try to send large payloads over the network. For example, I can send a vector (which is presumably serialized to JSON internally) of length 100,000, but I can't send a vector of length 200,000. Since vectors longer than 100,000 elements are normal in data science, is it possible to change the server configuration to allow larger requests?


1 answer

Hi @MislavSag,

The POST size limit depends on the web server you are using - Apache, Nginx, or something else entirely. Each of them has a limit you can adjust to increase the maximum allowed size of a POST request. For instance, if you are using Apache, the limit is set via the LimitRequestBody directive, which defaults to 0:

This directive specifies the number of bytes from 0 (meaning unlimited) to 2147483647 (2GB) that are allowed in a request body.
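As a rough sketch (assuming a standard Apache setup on Ubuntu; the file path and server name below are placeholders, adjust them to your own virtual host), you could raise the limit like this:

# /etc/apache2/sites-available/your-api.conf (hypothetical path)
<VirtualHost *:80>
    ServerName api.example.com

    # Allow request bodies of up to 100 MB (the value is in bytes)
    LimitRequestBody 104857600

    # ... the rest of your existing site / proxy configuration ...
</VirtualHost>

After editing, reload Apache (for example with sudo systemctl reload apache2) for the change to take effect.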

If you are using Nginx, you can use the client_max_body_size directive in your Nginx config - nginx.conf.
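As a sketch (the 2000M value is just an example; pick a size that comfortably covers your largest payload), the directive usually goes inside the http block of /etc/nginx/nginx.conf:

http {
    # ... other settings ...

    # Allow request bodies up to 2 GB
    client_max_body_size 2000M;
}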

This directive can also be added in a file under the conf.d folder, so that system updates will not conflict with a modified nginx.conf. In that case, we can save the file as /etc/nginx/conf.d/uploads.conf and it will be included in the server configuration automatically. The file name is not really important here; it just needs to end with .conf. When using a separate file, the wrapping http block must be omitted, because the file is already included inside the http block - it should only contain the upload size directive:

client_max_body_size 2000M;
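Once the directive is in place, test the configuration and reload Nginx so the new limit takes effect (standard commands on Ubuntu):

sudo nginx -t
sudo systemctl reload nginx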

Regards,
KFSys
