Memory leak of Garfield++?

Dear experts,

I simulated ~10k tracks using Heed and drifted the electrons using AvalancheMC, but the job was killed on the server because of insufficient memory.

Then I ran the DriftTube example on my local PC to monitor the memory usage, and I found what may be a memory leak:
with 3 tracks (the number originally written in the code) it used ~180 MB of memory;
with 1k tracks it used ~480 MB, and the usage was increasing linearly while it was running.
So with something like 100k tracks, my PC will not be able to meet the memory requirement…
(I disabled plotting when simulating a large number of tracks.)
Maybe NewTrack causes a memory leak, or have I misunderstood something?

Thanks for reporting this! Unfortunately I’m not able to reproduce it. Are you 100% sure you disabled plotting? If you don’t (and if you don’t clear the plot once in a while), ViewDrift will keep accumulating drift lines and the memory consumption will indeed grow with every track that you simulate.

Here is exactly what I ran:

  // I didn't show this code before since I didn't modify it.
  const double rTrack = 0.3;
  const double x0 = rTrack;
  const double y0 = -sqrt(rTube * rTube - rTrack * rTrack);
  const unsigned int nTracks = 10000;
  for (unsigned int j = 0; j < nTracks; ++j) {
    sensor.ClearSignal();
    track.NewTrack(x0, y0, 0, 0, 0, 1, 0);
    std::cout << "test" << std::endl;
    double xc = 0., yc = 0., zc = 0., tc = 0., ec = 0., extra = 0.;
    int nc = 0;
    while (track.GetCluster(xc, yc, zc, tc, nc, ec, extra)) {
      for (int k = 0; k < nc; ++k) {
        double xe = 0., ye = 0., ze = 0., te = 0., ee = 0.;
        double dx = 0., dy = 0., dz = 0.;
        track.GetElectron(k, xe, ye, ze, te, ee, dx, dy, dz);
        drift.DriftElectron(xe, ye, ze, te);
      }
    }
    std::cout << j << " finished" << std::endl;
/*    
    if (plotDrift) {
      cD->Clear();
      cellView.Plot2d();
      constexpr bool twod = true;
      constexpr bool drawaxis = false;
      driftView.Plot(twod, drawaxis);
    }
    sensor.ConvoluteSignals();
    int nt = 0;
    if (!sensor.ComputeThresholdCrossings(-2., "s", nt)) continue;
    if (plotSignal) signalView.PlotSignal("s");
*/   
  }

I tried 10k tracks this time to check whether the memory usage increases linearly, and it does.

If you didn’t change this part of the code:

constexpr bool plotDrift = true;
if (plotDrift) {
  cD = new TCanvas("cD", "", 600, 600);
  cellView.SetCanvas(cD);
  cellView.SetComponent(&cmp);
  driftView.SetCanvas(cD);
  drift.EnablePlotting(&driftView);
  track.EnablePlotting(&driftView);
}

then driftView will accumulate all the drift lines you simulate and so your memory usage will increase.

You are right, I only disabled the plotting action but didn't disable the storing of drift lines; now I understand. Thank you!

Just to be precise: you only need to set the flag plotDrift to false. You still need to simulate the drift lines, of course, if you want to calculate the induced signal.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.