In the code, before starting the loop over the data, I create roughly 100 histograms, many of which are TH2D with 200x200 binning.
Every 50000 events I dump the contents of gDirectory into the file with
gDirectory->Write("hist_Dir", TObject::kOverwrite);
which allows me to observe the progress during the run.
I noticed that when I created one more histogram, it now crashes at
gDirectory->Write("hist_Dir", TObject::kOverwrite);
I wonder whether there is a maximum size restriction on a TDirectory that can be written,
or whether something else is going on.
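For reference, here is a minimal sketch of the periodic-write pattern described above. It is an assumption of the setup, not the actual code: the file name `out.root`, the macro name `periodic_write`, and the single example histogram are placeholders.

```cpp
// Sketch (requires ROOT): book histograms in a subdirectory, fill them
// in an event loop, and periodically overwrite the snapshot in the file.
#include "TFile.h"
#include "TDirectory.h"
#include "TH2D.h"
#include "TObject.h"
#include "TRandom3.h"

void periodic_write()
{
   TFile f("out.root", "RECREATE");        // hypothetical output file
   TDirectory *dir = f.mkdir("hist_Dir");
   dir->cd();                              // histograms attach to this directory

   // one example TH2D; the real code books ~100 such histograms
   TH2D *h = new TH2D("h", "example;x;y", 200, -5., 5., 200, -5., 5.);

   TRandom3 rng(0);
   const Long64_t nEvents = 200000;
   for (Long64_t i = 0; i < nEvents; ++i) {
      h->Fill(rng.Gaus(), rng.Gaus());
      if (i > 0 && i % 50000 == 0)
         dir->Write("hist_Dir", TObject::kOverwrite);  // snapshot progress
   }
   f.Write();
   f.Close();
}
```

Because kOverwrite is passed, each dump replaces the previous cycle of the objects in the file instead of appending a new one, so the file does not grow with one copy per dump.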
100 histograms is a very small amount of data: the data quality monitoring of LHC experiments deals with tens of thousands of them.
Could you provide a minimal example that shows your issue?
Thanks for the reply.
I am not able to reproduce it now; it works again. I didn't find out what the exact reason was, but
when I created a small script with thousands of large
histograms, it worked fine.
So sorry for that.
Rafo