Core dump in binary

I get a core dump and I’m not sure if it’s my fault or if I should post a bug report. Run as a script in ROOT, these lines are no problem, but when I compile and execute them, I get a segmentation fault:

#include <cstdlib>
#include <cmath>
#include "TROOT.h"

TROOT hej("hej", "hej");

int main(int argc, char** argv)
{
  Int_t maximal_event_number = 10000000;
  Double_t slow_spike[maximal_event_number];  // ~80 MB placed on the stack
  for (Int_t i = 0; i < maximal_event_number; i++)
    slow_spike[i] = 0;
  return 0;
}

I know that it makes more sense to use scripts or CINT, but nevertheless this behaviour troubles me.
Can someone confirm this, or is this possibly a problem on my machine? Or am I blind, and don’t see that something in these lines is wrong?


You allocate 80 MB on the stack; that won’t work. If you want to compile your program, allocate the array on the heap with “new”. But needing an 80 MB array is usually an indication of bad design…

You are right, it is bad design. I will try to avoid these big stack allocations. Thanks for the answer!