Dear ROOT experts,
I wanted to display a subset of an RDataFrame while at the same time getting the event count of the whole RDataFrame. However, if I call Count() on the RDataFrame, the Display() call ignores the nRows argument and prints the whole RDataFrame instead. Here's a quick reproducer:
import ROOT
df = ROOT.ROOT.RDataFrame(10).Define("x", "gRandom->Rndm()")
count = df.Count()
display = df.Display("x", 5)
display.Print()
It outputs:
+-----+------------+
| Row | x |
+-----+------------+
| 0 | 0.99974175 |
+-----+------------+
| 1 | 0.16290988 |
+-----+------------+
| 2 | 0.28261781 |
+-----+------------+
| 3 | 0.94720108 |
+-----+------------+
| 4 | 0.23165654 |
+-----+------------+
| 5 | 0.48497361 |
+-----+------------+
| 6 | 0.95747696 |
+-----+------------+
| 7 | 0.74430534 |
+-----+------------+
| 8 | 0.54004366 |
+-----+------------+
| 9 | 0.73995298 |
+-----+------------+
I get the expected result from Display() if I don't use Count():
import ROOT
df = ROOT.ROOT.RDataFrame(10).Define("x", "gRandom->Rndm()")
display = df.Display("x", 5)
display.Print()
+-----+------------+
| Row | x |
+-----+------------+
| 0 | 0.99974175 |
+-----+------------+
| 1 | 0.16290988 |
+-----+------------+
| 2 | 0.28261781 |
+-----+------------+
| 3 | 0.94720108 |
+-----+------------+
| 4 | 0.23165654 |
+-----+------------+
Why does Count() affect the result of Display()? Is there a way around it?
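For reference, one workaround I'm considering is to restrict Display() to a Range() sub-dataframe while keeping Count() on the full dataframe, so the Display node never receives more than 5 entries regardless of what else is booked. This is only a sketch and I haven't verified it on 6.26/06; note also that Range() only works in single-threaded runs:

import ROOT

df = ROOT.ROOT.RDataFrame(10).Define("x", "gRandom->Rndm()")
count = df.Count()                   # count over the full dataframe
# Assumption: Range(5) caps the entries reaching Display(), even
# with Count() booked on the parent node (untested here)
display = df.Range(5).Display("x")
display.Print()                      # hopefully only 5 rows
print("entries:", count.GetValue())  # 10

Alternatively, I could call display.Print() before booking Count(), at the cost of running two separate event loops.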
Best regards,
Aleksandr
ROOT Version: 6.26/06
Platform: Ubuntu 20.04
Compiler: Precompiled