Bad gateway error when loading large sklearn model. Django/Ubuntu/Gunicorn/Nginx

Dear community,

I just developed a machine-learning-driven Python 2.7 application which essentially loads a scikit-learn model with joblib and runs predictions on a given dataset. On localhost the application works perfectly, but when I deploy the Django application on DigitalOcean (Ubuntu 18, Gunicorn, Nginx) I get a Bad Gateway error. I found out that the error is raised when the model is loaded with joblib. What I noticed as well is that the application works perfectly when a model of about 8,000 bytes is loaded, but the Bad Gateway error is raised when I try to load a model of about 60,000 bytes.
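For context, the load-and-predict pattern described above looks roughly like the sketch below. It uses the stdlib `pickle` module in place of joblib so it is fully self-contained (`joblib.dump`/`joblib.load` have the same call shape), and the model class and file name are made up for illustration:

```python
import os
import pickle
import tempfile

# Stand-in "model" with a predict() method, mimicking a fitted sklearn estimator.
class TinyModel:
    def predict(self, rows):
        # Trivial rule: sum each feature row.
        return [sum(r) for r in rows]

# Persist the model to disk (joblib.dump(model, path) works the same way).
path = os.path.join(tempfile.gettempdir(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(TinyModel(), f)

# Load it back and predict (joblib.load(path) is the drop-in equivalent).
with open(path, "rb") as f:
    model = pickle.load(f)

print(model.predict([[1, 2], [3, 4]]))  # -> [3, 7]
```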

I have tried resizing the droplet to rule out a lack of computing resources as the cause, but I still get the error!

Unfortunately, I have no clue how to tackle the problem. I hope someone here has experienced the same issue before or knows how it could be analysed and resolved.

I appreciate any help

Kind regards



Hi Marcel,

Sounds like a timeout: loading the model probably takes long enough that the Gunicorn worker is killed, and Nginx then returns the 502 Bad Gateway. You might have to increase Gunicorn's worker timeout and Nginx's proxy timeouts. I am not too sure on the actual issue, but the Gunicorn error log should show what happens when the model is loaded.
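For reference, the timeout knobs would look something like this; the file path, upstream address, and values are assumptions, so adjust them to your setup:

```nginx
# /etc/nginx/sites-available/yourapp  (path and upstream are assumptions)
location / {
    proxy_pass http://127.0.0.1:8000;
    proxy_connect_timeout 75s;
    proxy_read_timeout 300s;   # give the slow model load time to finish
}
```

On the Gunicorn side, starting it with `gunicorn --timeout 300 yourproject.wsgi` raises the worker timeout. If the worker is being killed for memory rather than time, the Gunicorn error log (or `journalctl` if it runs under systemd) should show that instead.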

Let us know if that works. Thanks 😃