I deployed a serverless function on DigitalOcean with 1024 MB of memory and a 15-minute timeout.
When I download a 250 MB file into memory, it works fine. But for a file of 300 MB or so, it fails every time with the error: "The function exhausted its memory and was aborted. Logs might be missing."
When I logged memory usage while downloading the 250 MB file, it reported 1 GB of total memory with over 700 MB unused.
**Why can my function not download files of 300 MB or more?**
I would appreciate any help with this. Thank you.
Heya,
The memory limit for a function is currently in the range 128 MB – 1 GB, with a default of 256 MB.
You can review your code to make sure it uses memory efficiently: clean up unnecessary data structures and objects so the runtime can free them, and check for memory leaks. Keep in mind that a single download can consume several times the file's size, since the HTTP client's buffers, the assembled response body, and any intermediate copies (for example, a base64-encoding step) all count against the limit, which is why a 300 MB file can exhaust 1 GB even though the file itself is much smaller.
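One way to see where the memory actually goes, assuming a Python runtime (DigitalOcean Functions also supports other runtimes such as Node.js, Go, and PHP), is the standard-library `tracemalloc` module. This is a minimal sketch, not your function's code; the 5 MB buffer just simulates a large download:

```python
import tracemalloc

tracemalloc.start()

# Simulate holding a large download in memory.
data = bytearray(5 * 1024 * 1024)  # 5 MiB buffer standing in for a file body

current, peak = tracemalloc.get_traced_memory()

# Releasing the only reference lets the allocation be freed.
del data
current_after, _ = tracemalloc.get_traced_memory()

tracemalloc.stop()
```

Comparing `current` before and after the `del` shows whether a buffer is really being released; if `current` stays high after you think you have dropped a reference, something is still holding onto it.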
Another approach: instead of downloading the entire file into memory at once, use a streaming or chunked approach. Download the file in smaller parts and process (or write to disk) each part before fetching the next. This keeps peak memory usage near the chunk size rather than the full file size.
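As a rough sketch of the chunked approach, again assuming a Python runtime: the helper below copies from any file-like source to a sink in fixed-size chunks, so only one chunk is held in memory at a time. In a real function the `source` would be an HTTP response body (e.g. from `urllib.request.urlopen(url)`) and the `sink` a file on disk; here an in-memory buffer stands in for both so the snippet is self-contained:

```python
import io

CHUNK_SIZE = 1024 * 1024  # read 1 MiB at a time

def stream_to_file(source, sink, chunk_size=CHUNK_SIZE):
    """Copy from a file-like source to sink in fixed-size chunks,
    so peak memory stays near chunk_size instead of the file size."""
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:  # empty read signals end of stream
            break
        sink.write(chunk)
        total += len(chunk)
    return total

# Stand-in source (~3 MiB); in a real function this would be the
# HTTP response body returned by urllib.request.urlopen(url).
src = io.BytesIO(b"x" * (3 * CHUNK_SIZE + 17))
dst = io.BytesIO()
copied = stream_to_file(src, dst)
```

With this pattern the memory footprint is bounded by `chunk_size` plus a little overhead, so even a multi-hundred-MB download stays well under the function's limit.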
Regards