Drift Chamber Simulation keeps getting killed

Hi,
I’m simulating a drift chamber, using Comsol as the field solver, and have adapted the Comsol example. 100 muons pass through the chamber at different distances from the anode wire, and I want to collect all the data from drift.GetEndPoint(xf, yf, zf, tf, status) and save it to a file. I ran it before with a spacing of 1 cm between starting positions and it worked fine. Now, when I rerun the code with a spacing of 1 mm, the program takes too long and keeps getting killed. I use a gas file that covers electric fields from 100 V/cm to 500000 V/cm in 20 logarithmically spaced steps, because my electric potentials range from -10000 V to 2000 V. I have tried different gas, mesh, and potential files to check whether it is simply too much to load into Garfield, but nothing changed. This is my code:

#include <cstdlib>
#include <iostream>
#include <fstream>

#include <sys/stat.h>
#include <sys/types.h> 

#include <TApplication.h>
#include <TCanvas.h>
#include <TH1F.h>

#include "Garfield/ComponentComsol.hh"
#include "Garfield/ViewField.hh"
#include "Garfield/ViewFEMesh.hh"
#include "Garfield/MediumMagboltz.hh"
#include "Garfield/Sensor.hh"
#include "Garfield/AvalancheMicroscopic.hh"
#include "Garfield/AvalancheMC.hh"
#include "Garfield/Random.hh"
#include "Garfield/DriftLineRKF.hh"
#include "Garfield/TrackHeed.hh"

using namespace Garfield;

int main(int argc, char * argv[]) {

  TApplication app("app", &argc, argv);

  // Load the field map.
  ComponentComsol fm;
  fm.Initialise("mesh.mphtxt", "dielectrics.dat", "field.txt", "mm");
  fm.PrintRange();


  // Setup the gas.
  MediumMagboltz gas;
  gas.LoadGasFile("ar_82_co2_18_293K_1bar_500000.gas");
  // Load the ion mobilities.
  const char* garfieldInstall = std::getenv("GARFIELD_INSTALL");
  if (!garfieldInstall) {
    std::cerr << "GARFIELD_INSTALL is not set.\n";
    return 1;
  }
  const std::string path = garfieldInstall;
  gas.LoadIonMobility(path + "/share/Garfield/Data/IonMobility_Ar+_Ar.txt");
  // Associate the gas with the corresponding field map material.
  fm.SetGas(&gas); 
  fm.PrintMaterials();
 
  // Create the sensor.
  Sensor sensor;
  sensor.AddComponent(&fm);
  //sensor.SetArea(-25.2, 0, -0.5, 0, 0.1, 0.5);


  // Set up Heed.
  TrackHeed track;
  track.SetParticle("muon");
  track.SetEnergy(3.e9);
  track.SetSensor(&sensor);

  // RKF integration.
  DriftLineRKF drift(&sensor);
  //drift.SetGainFluctuationsPolya(0., 20000.);
  
  
  // Create a new directory
  const std::string outputDir = "electron_positions_mu_3GeV_293K_1mm_test/";
  mkdir(outputDir.c_str(), 0777);
  
  // Loop over muon starting positions from -0.1 cm to -25.2 cm in 1 mm steps.
  // An integer counter avoids floating-point accumulation errors that could
  // skip the final position.
  for (int ix = 1; ix <= 252; ++ix) {
        const double x0 = -0.1 * ix;
        std::ofstream outFile(outputDir + "electron_positions_" + std::to_string(x0) + ".txt");
        if (!outFile) {
            std::cerr << "Error creating the output file!" << std::endl;
            return 1;
        }

        // Write the header
        outFile << "x y z E t x_e y_e z_e t_e\n";

        constexpr unsigned int nEvents = 100;
        for (unsigned int i = 0; i < nEvents; ++i) {
            std::cout << i << "/" << nEvents << "\n";
            sensor.ClearSignal();
            track.NewTrack(x0, 0.05, 0.496, 0, 0, 0, -1);
            for (const auto& cluster : track.GetClusters()) {
                for (const auto& electron : cluster.electrons) {
                    double xe = electron.x;
                    double ye = electron.y;
                    double ze = electron.z;
                    double ee = electron.e;
                    double te = electron.t;

                    drift.DriftElectron(xe, ye, ze, te);
                    double xf = 0., yf = 0., zf = 0., tf = 0.;
                    int status = 0;
                    drift.GetEndPoint(xf, yf, zf, tf, status);

                    // Write the data to the file
                    outFile << xe << " " << ye << " " << ze << " " << ee << " " << te << " " << xf << " " << yf << " " << zf << " " << tf << "\n";
                }
            }
        }

        outFile.close();
    }

  app.Run(kTRUE);
}

And this is the terminal output:

Do you have any idea why the program gets killed after a certain time? I would appreciate your help, thanks!

Dear @JFBW

How exactly did you scale your geometry from wires spaced 1 cm apart to wires spaced 1 mm apart? Did you build a new geometry in Comsol and export new fields, or did you just change “cm” to “mm” in the line below?

I would suggest keeping an eye on the CPU and memory use of your PC to check whether anything anomalous is going on, and adding more debug output: how many clusters are there for each track, and what is the gain for each drifted electron? These could help you (and us) understand better what is going (wr)on(g).
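Something like this minimal, untested helper could collect those numbers (plain C++, so it runs standalone; in your loop you would feed it `track.GetClusters().size()` per track and, if your Garfield++ version provides it, `drift.GetGain()` after each `DriftElectron` call):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

// Collects per-track cluster counts and per-electron gains and prints a
// summary. Hook AddTrack/AddGain into the event loop of the simulation.
struct DriftDebug {
  std::vector<std::size_t> nClusters;
  std::vector<double> gains;
  void AddTrack(const std::size_t n) { nClusters.push_back(n); }
  void AddGain(const double g) { gains.push_back(g); }
  void PrintSummary() const {
    if (!nClusters.empty()) {
      const auto mm = std::minmax_element(nClusters.begin(), nClusters.end());
      std::cout << "Clusters per track: min " << *mm.first
                << ", max " << *mm.second << "\n";
    }
    if (!gains.empty()) {
      const auto mm = std::minmax_element(gains.begin(), gains.end());
      std::cout << "Gain per electron: min " << *mm.first
                << ", max " << *mm.second << "\n";
    }
  }
};
```

Printing the summary per starting position x0 would quickly show whether the slowdown correlates with cluster count or gain.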
greets
Piet

Dear @Piet,

Thanks for your quick reply! Sorry, I forgot to mention that the chamber has only one anode wire. The Comsol files are just the mesh and the field map, built in Comsol in mm. By trying other meshes I meant building a new mesh in Comsol and exporting it along with the updated field map; the “mm” was kept, as it was not changed in Comsol. By spacing I just mean the initial positions of the muons passing through the chamber: the anode wire is at x = 0, and the muons start at anode distances from 0.1 cm to 25.2 cm, in 1 cm steps for the first simulation, where everything went well, and now in 0.1 cm steps. This is represented by this line

The gain varies a lot, from about 75 to 10⁹; I think this is because the secondary electrons lost by e.g. attachment never reach the gain region at x = 0. The number of clusters varies from 25 to 35, which seems fine since the detector is only 1 cm high. I have tried reducing the number of events, and with 1 event instead of 100 it works fine.
I have also checked the RAM and the swap file, and they do not seem to be at their limits. However, the calculation uses only one of the 8 CPUs, and that one runs at 100%…
Is there a way to use more CPUs in parallel?

Dear @JFBW

(1) Thanks for your information … so your geometry is more a drift tube with a single wire than a (multi-wire) drift chamber. Do you have a circular geometry? What are the radii of your (outer) tube and of your anode wire? In case it is more a parallel-plate geometry, what exactly are the cathodes and how far are they from the wire? Can you make a small drawing of the elementary cell you are simulating? Understanding your geometry is fundamental to getting an idea of what you are doing.
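For a coaxial single-wire tube the field is analytic, E(r) = ΔV / (r ln(b/a)), which gives a quick cross-check of the field map near the wire. A small sketch (the radii in the comments are hypothetical placeholders, not your geometry):

```cpp
#include <cmath>

// Analytic radial field of a coaxial tube with wire radius a, tube radius b,
// and potential difference v between wire and tube: E(r) = v / (r * ln(b/a)).
// Lengths in cm, v in volts, result in V/cm.
double WireTubeField(const double v, const double a, const double b,
                     const double r) {
  return v / (r * std::log(b / a));
}

// Example (hypothetical numbers): 12 kV across a 25 um wire in a 1 cm tube
// gives WireTubeField(12000., 25.e-4, 1., 25.e-4), roughly 8e5 V/cm at the
// wire surface.
```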

(2) A wire-based detector typically operates at a gain of 1 to a few 10^5; the exception is drift chambers designed for cluster counting, where gains can reach a few 10^6. A gain of 10^9 is definitely too high and not a realistic operating condition. Could you try simulating at a lower voltage?

(3) To run with more CPUs in parallel, you should look into running Garfield++ in multithreaded mode. I know some groups are working on such code (using MPI), but I am not sure it has been integrated or whether there are examples available. Maybe @hschindl can tell?
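For illustration only, the generic event-partitioning pattern looks like the sketch below. A dummy workload stands in for the per-event Heed track and RKF drift; in Garfield++ each thread would need its own `TrackHeed` and `DriftLineRKF` instance, as sharing them across threads is not guaranteed to be safe (please verify for your version):

```cpp
#include <numeric>
#include <thread>
#include <vector>

// Splits nEvents over nThreads workers, each handling an interleaved subset
// of event indices. Each worker writes only to its own slots of `results`,
// so no locking is needed. Returns the number of processed events.
int RunParallel(const unsigned int nEvents, const unsigned int nThreads) {
  std::vector<int> results(nEvents, 0);
  std::vector<std::thread> workers;
  for (unsigned int t = 0; t < nThreads; ++t) {
    workers.emplace_back([&results, nEvents, nThreads, t]() {
      // Worker t handles events t, t + nThreads, t + 2 * nThreads, ...
      for (unsigned int i = t; i < nEvents; i += nThreads) {
        results[i] = 1;  // placeholder for simulating one event
      }
    });
  }
  for (auto& w : workers) w.join();
  return std::accumulate(results.begin(), results.end(), 0);
}
```

With per-thread drift objects and per-thread output buffers merged at the end, this pattern would let the 100 events per starting position run concurrently.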