To get some estimates, I am trying to compute the memory footprint of different THnSparse histograms as a function of the number of dimensions, bins, and entries. I read in the documentation that:
Bin data (content and coordinates) are allocated in chunks of size fChunkSize;
So I try doing the following:
sparse->GetNChunks() * sparse->GetChunkSize()
This, however, does not give plausible results. For example, I get 2 chunks of size 16k for a sparse integer histogram with 8 dimensions, 32k bins per dimension, and 32k randomly distributed entries (I have verified they are indeed random). I would instead expect it to use at least 32k (entries) * (4 B (integer) + 16 * 8 B (space for coordinates)).
Would you have an idea what I am assuming wrongly? Is there a correct way to compute the total memory size of a sparse histogram?
This is how I create the histogram; maybe I am doing it incorrectly:
const size_t bins = 32768;
const size_t dim = 8;
const size_t entries = 32768;
const Double_t min = 0.0;
const Double_t max = 1000000.0;
const std::vector<Int_t> binsDims(dim, bins);
const std::vector<Double_t> mins(dim, min);
const std::vector<Double_t> maxs(dim, max);
auto* h = new THnSparseI("test", "test", dim, binsDims.data(), mins.data(), maxs.data());
I will be grateful if you could give me some hints!
ROOT Version: v6.20.02
Compiler: GCC v7.3.0