Memory leak ?!?

Hi,
quite often I encounter the following error message:

Error: Symbol #include is not defined in current scope  (tmpfile):1:
Error: Symbol exception is not defined in current scope  (tmpfile):1:
Syntax Error: #include <exception> (tmpfile):1:
Error: Symbol G__exception is not defined in current scope  (tmpfile):1:
Error: type G__exception not defined FILE:(tmpfile) LINE:1
*** Interpreter error recovered ***

Here https://twiki.cern.ch/twiki/bin/view/Main/ZhuUseROOT this is explained as a memory-leak problem. After reading this, I changed my code to make sure I delete the memory that I allocated before. However, I have now encountered a strange error that I do not understand. The problem occurs in a part of the code that I compile and then load as a library. Inside a subroutine I declare a 3-d array as

Double_t tra[5][100][100];

This causes the error message above. I can fix the problem by allocating the memory dynamically instead:

    // allocate a 5 x 100 x 100 array on the heap instead of the stack
    Double_t*** tra = new Double_t**[5];
    for (int i = 0; i < 5; i++) {
        tra[i] = new Double_t*[100];
        for (int j = 0; j < 100; j++) {
            tra[i][j] = new Double_t[100];
        }
    }

Even if I do not (!) delete the memory allocated this way, there seems to be no memory-leak problem. And I am a bit puzzled why this works while the first variant crashes immediately.
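
For reference, the matching cleanup would look like this (just a sketch mirroring the allocation above - each new[] gets one delete[], in reverse order):

    // free in reverse order of allocation: innermost rows first,
    // then the row-pointer arrays, then the outer pointer array
    for (int i = 0; i < 5; i++) {
        for (int j = 0; j < 100; j++) {
            delete[] tra[i][j];
        }
        delete[] tra[i];
    }
    delete[] tra;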

Try to re-run your macro after doing (once):
root [0] .except

[quote]Hi,
quite often I encounter the following error message:

Error: Symbol #include is not defined in current scope  (tmpfile):1:
Error: Symbol exception is not defined in current scope  (tmpfile):1:
Syntax Error: #include <exception> (tmpfile):1:
Error: Symbol G__exception is not defined in current scope  (tmpfile):1:
Error: type G__exception not defined FILE:(tmpfile) LINE:1
*** Interpreter error recovered ***

[/quote]

Yes; since ROOT in general does not use exceptions, I think in most cases this is a std::bad_alloc exception that was not caught - it happens if you have a memory leak or simply request too huge an amount of memory.

It’s a very bad idea to declare a built-in array with automatic storage duration (a local variable) of such a size (actually, it’s also a bad idea to declare such an array with static storage duration). In the case of a local variable, you will get a stack overflow: en.wikipedia.org/wiki/Stack_overflow.
I do not know, maybe ROOT tries to catch this as a C++ exception - it’s not very important what ROOT says, just fix the problem.
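
If you need the array at all, a safer variant (just a sketch, not from the original post) is to put the data on the heap via std::vector, which also frees itself automatically:

    #include <vector>
    #include "Rtypes.h" // ROOT header that defines Double_t

    void subroutine() {
        // one contiguous heap block of 5 * 100 * 100 doubles, zero-initialized;
        // freed automatically when 'tra' goes out of scope - no delete needed
        std::vector<Double_t> tra(5 * 100 * 100, 0.0);

        // index element (i, j, k) as tra[(i * 100 + j) * 100 + k]
        int i = 2, j = 10, k = 99;
        tra[(i * 100 + j) * 100 + k] = 1.0;
    }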

[quote]

    Double_t*** tra = new Double_t**[5];
    for(int i=0;i<5;i++){
        tra[i] = new Double_t*[100];
        for (int j=0;j<100;j++){
            tra[i][j] = new Double_t[100];
        }
    }

Even if I do not (!) delete the memory allocated this way, there seems to be no memory-leak problem. And I am a bit puzzled why this works while the first variant crashes immediately.[/quote]
If you want to reproduce the error message about the exception, just request a huge amount of memory.
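
Something like this (an illustrative sketch; the exact behavior depends on the platform, e.g. Linux overcommit settings, but an absurd request will normally throw):

    #include <new>
    #include <cstdio>

    int main() {
        try {
            // request ~8 TB - far beyond what most machines can provide
            double *p = new double[1000000000000ULL];
            p[0] = 1.0; // touch it so the allocation is really used
            delete[] p;
        } catch (const std::bad_alloc &e) {
            std::printf("caught: %s\n", e.what());
        }
        return 0;
    }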

5 * 100 * 100 * 8 bytes ≈ 391 KiB … not that much … but, if that’s really the only problem, on Linux … “ulimit -s unlimited” … should do: http://cs.nyu.edu/exact/core/doc/stackOverflow.txt
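
To check the current stack limit from inside a program, POSIX offers getrlimit (a small sketch, Linux/POSIX only):

    #include <sys/resource.h>
    #include <cstdio>

    int main() {
        struct rlimit rl;
        if (getrlimit(RLIMIT_STACK, &rl) == 0) {
            if (rl.rlim_cur == RLIM_INFINITY)
                std::printf("stack soft limit: unlimited\n");
            else
                std::printf("stack soft limit: %lu bytes\n",
                            (unsigned long)rl.rlim_cur);
        }
        return 0;
    }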

I cannot reproduce your problem; even if I call a procedure with such an array recursively, I get *** Break *** segmentation violation, not a message about an exception and a header. So either you have a memory leak in some other place in your program, or something else is wrong.

Oh yeah, sure, and if you have a memory leak in your program … just buy a new computer with more memory!

No, not really. No need to buy a new computer. :unamused:
Just install a 64-bit system and create a big-enough swap file. :mrgreen:

The above advice is based on a real-life case -> due to memory leaks in a particular ROOT version (which I could not upgrade), I was unable to do some work on my 32-bit system, while people using 64-bit systems had no problems. While running, ROOT was allocating more and more virtual RAM, and as soon as it tried to allocate more than the maximum allowed 3 GB (on a 32-bit system), it was killed by the system. On a 64-bit system there is no 3 GB limit (I estimate they were using up to 30 GB of virtual RAM to complete the job).