I’m totally new to ROOT and, I must say, a bit overwhelmed by its features; I finally got totally lost.
I have to analyse several thousand events, each consisting of an array of times at which an ion hit a detector.
Something like:
e1: [ 1, 3, 6, 8, 9, 10, 13, 17, 20]
e2: [ 2, 4, 6, 8, 9, 10, 12, 15, 17]
My question is: how do I save the events in a tree and get a histogram, fit, … per event?
Any hint is welcome.
Welcome to ROOT!
A good way to start is to have a look at the ROOT primer: https://root.cern.ch/getting-started
For some “learning by doing”, you can have a look at the rich set of examples we provide: https://root.cern/doc/master/group__Tutorials.html
For your particular problem, you’ll need to parse your input file: what format are you dealing with? Can you already read the data into memory?
We can have a look at the rest of the steps once this is accomplished; it does not look like a hard task at all.
We don’t have the DAQ at the moment, so I don’t have any “real” data yet. I’d like to think about how to handle the data first and design the DAQ accordingly. It has turned out that this is easier than adapting the post-processing to a weird DAQ.
What I propose is a raw data file with a flat structure: a short header with the trigger/event count and UTC time, followed by the relative event times, ending with some “DAQ stop” line. For each sample run I’ll have one file, consisting of thousands of such blocks.
Does that sound realistic?
It’s very nice to hear that you don’t consider this a big problem … to me it is, at least in ROOT.
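To make the proposed block layout concrete, here is a minimal sketch in plain C++ (no ROOT needed yet). All names, field widths, and the stop-marker value are assumptions for illustration, not a fixed spec:

```cpp
#include <cassert>
#include <cstdint>
#include <istream>
#include <ostream>
#include <sstream>
#include <vector>

// Hypothetical layout of one raw-data block as described above:
// header (trigger/event count + UTC time), relative hit times, stop marker.
struct EventBlock {
    std::uint32_t eventCount;           // trigger/event counter
    std::uint64_t utcTime;              // UTC timestamp of the block
    std::vector<double> relativeTimes;  // ion hit times relative to the trigger
};

// Assumed "DAQ stop" sentinel terminating the time list of each block.
const double kStopMarker = -1.0;

void writeBlock(std::ostream& out, const EventBlock& b) {
    out.write(reinterpret_cast<const char*>(&b.eventCount), sizeof b.eventCount);
    out.write(reinterpret_cast<const char*>(&b.utcTime), sizeof b.utcTime);
    for (double t : b.relativeTimes)
        out.write(reinterpret_cast<const char*>(&t), sizeof t);
    out.write(reinterpret_cast<const char*>(&kStopMarker), sizeof kStopMarker);
}

// Returns false on a clean end of file, true if a block was read.
bool readBlock(std::istream& in, EventBlock& b) {
    if (!in.read(reinterpret_cast<char*>(&b.eventCount), sizeof b.eventCount))
        return false;
    in.read(reinterpret_cast<char*>(&b.utcTime), sizeof b.utcTime);
    b.relativeTimes.clear();
    double t;
    while (in.read(reinterpret_cast<char*>(&t), sizeof t) && t != kStopMarker)
        b.relativeTimes.push_back(t);
    return true;
}
```

Reading one sample run is then just calling `readBlock` in a loop until it returns false; each block becomes one event (one `std::vector`) for the later ROOT step.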
I guess it is OK to write out a binary format if you have discussed this within the DAQ community of your experiment and validated the procedure. I think the key here is to read these files into memory and store their content in some kind of C++ data structures (e.g. std::vector). This is the first step. Once this is accomplished, the TTree writing will be trivial.
Is the first step accomplished?
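Once the hit times of each event sit in a `std::vector<double>`, the TTree step (and the per-event histogram and fit from the original question) can be sketched like this. This is an untested outline, assuming a working ROOT installation; file, tree, and branch names are arbitrary choices:

```cpp
#include <vector>
#include "TFile.h"
#include "TTree.h"
#include "TH1D.h"

void writeAndAnalyse() {
    // --- writing: one tree entry per event ---
    TFile fout("events.root", "RECREATE");
    TTree tree("events", "ion hit times per event");
    std::vector<double> times;        // hit times of the current event
    tree.Branch("times", &times);     // ROOT stores std::vector branches directly

    times = {1, 3, 6, 8, 9, 10, 13, 17, 20};  // e1 from the example above
    tree.Fill();
    times = {2, 4, 6, 8, 9, 10, 12, 15, 17};  // e2
    tree.Fill();
    tree.Write();

    // --- reading back: histogram and fit per event ---
    std::vector<double>* rtimes = nullptr;
    tree.SetBranchAddress("times", &rtimes);
    for (Long64_t i = 0; i < tree.GetEntries(); ++i) {
        tree.GetEntry(i);
        TH1D h("h", "hit times;t;counts", 25, 0., 25.);
        for (double t : *rtimes) h.Fill(t);
        h.Fit("gaus", "Q");  // quiet Gaussian fit, one per event
    }
    fout.Close();
}
```

In practice you would of course fill the vector from your parsed raw-data blocks instead of hard-coding it, and pick a fit function that matches your time distribution; `"gaus"` is just a placeholder.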
This is certainly manageable; we’ll design the DAQ so that vectors are available for post-processing.
I guess I’ll come back to this topic once we have some dummy data to tinker with.