Hi everyone…
I am relatively new to ROOT and currently using it for a project involving the analysis of large datasets (~200 GB). I’ve been encountering memory issues, and I suspect it has something to do with my lack of understanding regarding ROOT’s memory management, particularly when handling TTrees and histograms.
Here’s my workflow:
- I load a large TTree from a ROOT file and apply some selection criteria.
- I loop over the selected entries to fill histograms for further analysis.
- As I process more data files in the same session, the memory usage keeps increasing until my system becomes unresponsive.
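For context, here is a minimal sketch of what I do per file. The file name, tree name (`events`), and branch (`pt`) are placeholders I've simplified for this post, not my real analysis code:

```cpp
// Sketch of my per-file processing (names are placeholders).
#include "TFile.h"
#include "TTree.h"
#include "TH1F.h"

void analyze_file(const char* fname, TH1F* h_pt) {
    TFile* f = TFile::Open(fname);
    if (!f || f->IsZombie()) return;

    TTree* tree = nullptr;
    f->GetObject("events", tree);   // "events" is a placeholder tree name
    if (!tree) { f->Close(); delete f; return; }

    float pt = 0.f;
    tree->SetBranchAddress("pt", &pt);  // placeholder branch

    const Long64_t n = tree->GetEntries();
    for (Long64_t i = 0; i < n; ++i) {
        tree->GetEntry(i);
        if (pt > 20.f)              // example selection cut
            h_pt->Fill(pt);
    }

    f->Close();
    delete f;
}
```

I call something like this once per input file in the same session, with the histograms created up front and kept for the whole run.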
I’ve tried using `TTree::SetCacheSize()` and splitting the analysis into smaller chunks, but the problem persists. I’ve also experimented with deleting objects via `delete` and calling `gDirectory->Clear()`, but I might not be using them correctly.
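Concretely, my cleanup between files looks roughly like the sketch below (again with placeholder names); the question-mark comments mark the parts I'm unsure about:

```cpp
// What I currently try between files -- possibly misusing these calls.
#include "TFile.h"
#include "TTree.h"
#include "TROOT.h"

void process_one_file(const char* fname) {
    TFile* f = TFile::Open(fname);
    if (!f || f->IsZombie()) return;

    TTree* tree = nullptr;
    f->GetObject("events", tree);          // placeholder tree name
    if (!tree) { f->Close(); delete f; return; }

    tree->SetCacheSize(50 * 1024 * 1024);  // 50 MB read cache

    // ... entry loop and histogram filling as above ...

    delete tree;         // ? unsure whether I should delete the tree myself
    f->Close();          //   or whether Close() already takes care of it
    delete f;
    gDirectory->Clear(); // ? hoping this frees any leftover objects
}
```

Memory still grows across files with this in place, so I suspect I'm cleaning up the wrong things (or the wrong way).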
Could someone clarify the best practices for managing memory in ROOT? Specifically:
- What’s the proper way to clean up memory when working with large datasets in a single ROOT session?
- Are there common pitfalls when dealing with TTrees or histograms that might cause memory leaks?
- Is there a more efficient way to structure the analysis to avoid these issues altogether?
I haven’t found a working solution so far. Any advice, examples, or documentation references would be greatly appreciated. Thanks in advance for your help!