Wanting to run a simple script using a Pandas DataFrame, I hit a 413 error, running into the 48 MB restriction on serverless functions. See also a similar question here: Payload too large. The popular packages Pandas and NumPy are around 70 MB by themselves.
I’ve been looking into reducing the size of the packages as suggested in this article: How to shrink NumPy, Pandas… However, setting CFLAGS to compile the packages led to very long deployment times.
Not being familiar with optimizing pip installs, is there a way to install/compile Pandas so it fits within the 48 MB limit? Or can the limit be increased to 200 MB?
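For reference, the kind of build the linked article describes looks roughly like the sketch below. The flags and the target directory name are illustrative, and building from source this way is exactly what makes deployments slow:

```bash
# Build pandas from source with debug symbols stripped, along the lines of the
# "How to shrink NumPy, Pandas…" article. Forcing a source build (--no-binary)
# is what lets CFLAGS take effect, and also what makes the build so slow.
CFLAGS="-g0 -Wl,--strip-all" \
  pip install --no-cache-dir --no-binary :all: --target ./package pandas
```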
We don’t recommend approaches like changing how the non-Python code in packages is compiled to try to reduce the package size, because this is difficult for us to support.
We did hear from many Functions customers that having packages like numpy and pandas included in the Python runtime would be helpful, because it would let them deploy functions without exceeding the 48 MB limit. With our recently released Python 3.11 runtime, we have included these packages as well as a few others.

Also having this issue. The 48 MB limit on serverless functions is absolutely deadly; it basically axes a huge portion of libraries. Having preinstalled packages like numpy is nice, but that doesn’t help if the package you need is simply impossible to fit within the limitation.
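For anyone trying the preinstalled-packages route, here is a minimal sketch of a function that relies on the pandas shipped with the Python 3.11 runtime instead of bundling it in the deployment artifact. It assumes the usual main(args) entry point; the "values" field name is illustrative:

```python
import pandas as pd  # provided by the Python 3.11 runtime, so it is not part of the upload


def main(args):
    # Build a tiny DataFrame from the request arguments ("values" is an illustrative field).
    values = args.get("values", [1, 2, 3])
    df = pd.DataFrame({"values": values})
    # Return a small, JSON-serializable summary in the response body.
    return {
        "body": {
            "count": int(df["values"].count()),
            "mean": float(df["values"].mean()),
        }
    }
```

Because pandas is not included in the deployed dependencies, the uploaded artifact stays small and well under the 48 MB limit.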
AWS Lambda offers a means of using layers to allow up to 250 MB, which I think is a more reasonable limit.
Any updates on this? Being unable to install pandas, numpy, or scipy is quite frustrating.