Histograms changing at random


I’ve recently started toying around with ROOT to look at the output of CORSIKA, a cosmic-ray event simulator. CORSIKA generated some showers and put them in a TFile containing a TTree ‘particle’ with branches for each measurement. I’d like to compare how the muon momenta generated by CORSIKA differ depending on the generator used (DPMJET, QGSJET, EPOS, QGSJET2, SIBYLL). When I run the attached macro to draw the histogram of one generator four times (i.e. the same histogram) to make sure they are drawn right, I get four different histograms! Not only are the histograms different, but they seem to vary randomly, with the mean of he1 ranging between 2 and 5. So I get output that is somewhat correct but varies for unknown reasons. Is there anything in the code that could be causing these random fluctuations?

I’m running ROOT v5.34/36 on OSX 10.11.3. My input is (using ACLiC):
.x /Users/Damien/Documents/analysiscode/muonreader.cpp+
DAT002206.root (378 KB)
muonreader.cpp (6.57 KB)

Seems to me there is a problem with the e…[] vectors. Some values seem smaller than DBL_MIN,
like 2.122e-314, while DBL_MIN is 2.22507e-308. Looks like there is some precision issue.
To see that I did:

    printf("iterator = %d %g %g\n", iterator, DBL_MIN, DBL_MAX);
    for (long k=0; k<iterator; k++) {
        if (e[k]>DBL_MAX)  printf("e[%ld]  = %g is greater than DBL_MAX\n",k,e[k]);
        if (e2[k]>DBL_MAX) printf("e2[%ld] = %g is greater than DBL_MAX\n",k,e2[k]);
        if (e3[k]>DBL_MAX) printf("e3[%ld] = %g is greater than DBL_MAX\n",k,e3[k]);
        if (e[k]<DBL_MIN)  printf("e[%ld]  = %g is smaller than DBL_MIN\n",k,e[k]);
        if (e2[k]<DBL_MIN) printf("e2[%ld] = %g is smaller than DBL_MIN\n",k,e2[k]);
        if (e3[k]<DBL_MIN) printf("e3[%ld] = %g is smaller than DBL_MIN\n",k,e3[k]);
    }

In this case I would check the type of the stored variable versus the type of your vector (it looks to me like a “double” versus “float” or “int” issue).

Thank you very much for looking into it, Pepe Le Pew and couet.

My apologies if I’m using incorrect terminology in the following; it should be clear that I’m quite new to ROOT by now :slight_smile: . The TFile is made by a package called COAST that reads the CORSIKA output and returns a .root file containing 5 trees.

When filling the branches, it uses the following (see attached TC2R.cc):

fShower->Branch("EventID", &fCurrentShower->EventID);

In the documentation I found that TTree::Branch takes const char* name, void* address, const char* leaflist, Int_t bufsize = 32000 (found at https://root.cern.ch/doc/master/classTTree.html#af5d6a0d390030b3e0e5c8aa8d99a0088; it’s a couple of entries down).

The leaflist argument specifies the type (e.g. Double_t, Float_t) and defaults to Float_t if no type is given. Hence the branches are filled with floats, whereas they are read as doubles in muonreader.cpp. Could this be what causes the problem? Or should I be looking at mismatched pointer types elsewhere in muonreader.cpp?
TC2R.cc (11.6 KB)

Hard to test your code: it uses 3 (different) files,
but you provide only one.
Anyway, using this one file, all 4 histos should be identical.

The buggy place is here:

    int size1 = obslvl.size();
    int size2 = obslvl2.size();
    printf("size1 %d size2 %d", size1, size2);
    int iterator;
    if (size1>size2) {
        iterator = size1;

size1 = size2 = 11978,
but you have only 2241 muons in your sample,

so iterator should be e.size() = 2241;
as it is, you fill the histograms with random memory contents.

btw: “iterator” isn’t a very mnemonic name for this variable

That did the job! Thank you Otto Schaile