Garfield gas generator using multiple threads

Hello everyone,

I am using Garfield++ to compute gas properties, using this example as a baseline.

I have tried to parallelize it using threads, but this does not seem to work, possibly because of some global variables, as stated in this thread.

This is the code I am using:

// main.cpp
#include <iostream>
#include <string>
#include <thread>
#include <vector>

#include "Garfield/FundamentalConstants.hh"
#include "Garfield/MediumMagboltz.hh"

using namespace std;
using namespace Garfield;

int main(int argc, char** argv) {
    const int nThreads = 2;
    std::vector<thread> threads;
    for (int i = 0; i < nThreads; i++) {
        threads.emplace_back([](unsigned int threadId) {
            cout << "Thread " << threadId << endl;

            MediumMagboltz gas("Ar", 93., "CO2", 7.);
            const double pressure = 3 * AtmosphericPressure;
            const double temperature = 293.15;
            gas.SetPressure(pressure);
            gas.SetTemperature(temperature);

            // One would use different e-field values for each thread
            // and then merge the results.
            gas.SetFieldGrid({1000.}, {0.0}, {HalfPi});

            gas.GenerateGasTable(10);
            gas.WriteGasFile("demo" + to_string(threadId) + ".gas");
        }, i);
    }
    for (auto& t : threads) {
        t.join();
    }
    return 0;
}

which can be built with the following CMakeLists.txt:

# CMakeLists.txt
find_package(Garfield REQUIRED)
add_executable(demo main.cpp)
target_link_libraries(demo PUBLIC Garfield::Garfield)

My question is: can this parallelization be achieved some other way? I have also had problems when running multiple single-threaded jobs on a cluster to achieve the same effect: some jobs were getting stuck, as if there was some global state shared across the different processes. This seemed very weird to me, so perhaps I was doing something wrong.

Anyway, I could only get the parallelization to work by running the jobs as individual Docker containers. I would appreciate any advice on this.


Hi @lobis, I am sure @hschindl can help you with that.


The function GenerateGasTable (which is essentially an interface to the Fortran Magboltz program) is indeed not thread-safe.

Running multiple single-threaded jobs should work though. Maybe check if the amount of memory you request in the job submission file is sufficient.


Thanks! I reviewed my code and job submissions and found the problem: a small number of jobs were failing randomly, some of them due to this issue. Perhaps this is already fixed in the latest version. I can work around the problem, so it's not blocking, but if I find a consistent way to reproduce it I will share it. Thanks again!