Question
Valgrind can't run: out of memory?
I'm trying to run a simple test with Valgrind, but it seems that Valgrind can't even start because of some memory limitation.
Command:
valgrind --leak-check=full --track-origins=yes echo 'test'
Output:
==26260== Valgrind's memory management: out of memory:
==26260== initialiseSector(TC)'s request for 27597024 bytes failed.
==26260== 143327232 bytes have already been allocated.
==26260== Valgrind cannot continue. Sorry.
Only 355 MB of my 490 MB of total RAM is consumed, so I would guess that is not the reason, since valgrind is such a small tool. Also, I run the test inside a Docker container, if that matters.
Any idea how to solve this?
To eliminate some kind of system limit as the culprit, can you look at the output of the following from inside a Docker container:
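For instance, a check along these lines (ulimit here is an illustrative guess, not necessarily the command that was originally meant):

# Show the shell's resource limits inside the container;
# an "unlimited" virtual memory (-v) line rules out a ulimit-based cap.
ulimit -a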
I looked at that actually but my virtual memory looks fine. Not sure if there is some other kind of memory to look at but here it is:
@manossef - What was the '-m' parameter you gave your docker run command?
Hi, I am not sure what you mean. By the way, the command you gave gives me the same output both in the terminal of the server (which is a Docker image) and in my own Docker image. I noticed that this issue is not present on my personal computer; it only happens on the server at Digital Ocean.
On a side note, I was getting the error message mentioned above when trying to run valgrind in my existing image. I also tried running valgrind in a totally new Docker image and I still have problems, but this time I get:
==231== Memcheck, a memory error detector
==231== Copyright (C) 2002-2013, and GNU GPL'd, by Julian Seward et al.
==231== Using Valgrind-3.10.0.SVN and LibVEX; rerun with -h for copyright info
==231== Command: echo test
==231==
Killed
So while before it gave an error, now it doesn’t run at all (in a different image).
Here is a way to reproduce the problem on a server:
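A minimal sketch of that kind of setup (the image, tag, and memory limit below are illustrative):

# start a container with a small memory cap and try valgrind inside it
docker run --rm -m 256m -it ubuntu:14.04 bash
# inside the container:
apt-get update && apt-get install -y valgrind
valgrind --leak-check=full --track-origins=yes echo 'test'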
So it doesn’t even run..
@manossef - When you run docker, you can give it a --memory or -m parameter to set the maximum amount of memory the container may use. For example, to give a container a maximum of 375 MB of memory:
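A sketch of such an invocation (the image name and the rest of the command line are illustrative; only the 375 MB limit comes from the comment above):

# cap the container at 375 MB of RAM; a process that exceeds the limit
# inside the container is killed by the kernel's OOM killer ("Killed")
docker run -m 375m --rm -it ubuntu:14.04 bash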