Out of memory kill process

I am on the 8GB RAM plan, yet I'm having difficulty reading a 4GB file into a pandas DataFrame in Python. I get an out-of-memory error. I have already allocated a 4GB swap file, but the issue is not solved.

Could you please help?


Wow, reading a single 4GB file directly into Python on an 8GB system? I would extend the swap space to at least 6GB (with mkswap), then ask the kernel to use swap more willingly by raising the swappiness value. Read it here:
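As a rough sketch of the swap steps above (assuming a Linux droplet with root access; `/swapfile2` and the swappiness value of 80 are example choices, not requirements):

```shell
# Create and enable an additional 4 GB swap file (run as root).
fallocate -l 4G /swapfile2   # or: dd if=/dev/zero of=/swapfile2 bs=1M count=4096
chmod 600 /swapfile2
mkswap /swapfile2
swapon /swapfile2

# Make the kernel more willing to swap (the default is usually 60).
# This change lasts until reboot; persist it in /etc/sysctl.conf if needed.
sysctl vm.swappiness=80

# Verify the new swap is active and check overall memory.
swapon --show
free -m
```

Note that swap is far slower than RAM, so this only buys headroom; it won't make a 4GB pandas load fast.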

What is the exact error? What does `free -m` show?
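Beyond adding swap, a common way to avoid the out-of-memory kill entirely is to process the file in chunks, so only a slice of it is in memory at any time. A minimal sketch (the tiny in-memory CSV and column names here are placeholders; with the real 4GB file you would pass its path to `read_csv`):

```python
import io
import pandas as pd

# Stand-in for the real 4 GB file on disk; replace with the file path.
csv_data = io.StringIO(
    "user_id,amount\n"
    "1,10.5\n"
    "2,3.25\n"
    "1,7.0\n"
    "3,1.75\n"
)

total = 0.0
rows = 0
# chunksize makes read_csv return an iterator of DataFrames,
# so only one chunk is resident in memory at a time.
for chunk in pd.read_csv(csv_data, chunksize=2):
    total += chunk["amount"].sum()
    rows += len(chunk)

print(rows, total)  # 4 22.5
```

Specifying narrower dtypes (e.g. `dtype={"user_id": "int32"}`) or reading only the needed columns with `usecols` can also cut pandas memory use substantially.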