Plot of nested beans

I am trying to plot what we call VMEC cuts with the ROOT::RDF::RCsvDS data source, but unfortunately I get a lot of errors (including segmentation faults) and I cannot figure out why. Are the graphs not allowed to close on themselves?

Attached you’ll find the corresponding data: cross_vmec_beta485_ref_bean.txt (16.5 KB) VMEC_CUT.c (2.4 KB)

ROOT Version: 6.14
Platform: Ubuntu
Compiler: Clang

Usually this error arises when the first line of the file does not consist of floating-point numbers separated by single spaces. Unfortunately that is not the case here, so I sadly have no clue what the problem could be.

Sorry, I started looking at this yesterday but forgot to reply.
It looks like a problem with the parsing in RCsvDS, but it will require a bit more time to figure out.

We will report back as soon as possible.


Hi again,
here’s the problem:

~/Downloads $ root -l
root [0] a = ROOT::RDF::MakeCsvDataFrame("cross_vmec_beta485_ref_bean.txt", false, ' ')
(ROOT::RDataFrame &) A data frame associated to the data source "CSV data source"
root [1] a.GetColumnType("Col0")
(std::string) "double"
root [2] a.GetColumnType("Col1")
(std::string) "std::string"

RCsvDS thinks your second column is a std::string – I don’t know why this happens (yet), but this is a bug, so I opened a report to track it.

Please ping us if you don’t hear back from us in a few days.


Thank you very much Enrico for your effort. It seems like nobody answered yet on the bug. Is there a way around it?


Hi Jim, this should get fixed “soon-ish” - @etejedor might be able to share his estimate?

Dear Jim,
I will try to find some time next week to have a look. Apologies for the wait.

Hi Everybody,

sadly I have a time constraint attached to the plotting problem. The referees want the corrections to my paper no later than the 25th of September. Is it possible to fix the bug by the day before, or could you supply me with an alternative solution?

Kind Regards

Just to be sure: this problem is the same bug, isn’t it?
POINCARE.c (1.4 KB) Points_Stoch.txt (877.9 KB)

Hi Jim,

I had a quick look; my guess is that your second column is not being parsed as a double because it has values with this exponential syntax: -7.6076468e-16. That is perfectly fine, it is just not supported by the current implementation. I will add support for it when I have some time.

In the meantime, as a workaround you can add an extra Define to your RDataFrame chain that transforms your string column into a double column. That Define should receive a lambda function that accepts an std::string parameter, converts it to a double, and returns that double.

Hi etejedor,

sadly this is not the whole story. I deleted the lines that included numbers of the form e-16, but the segmentation fault is still present. Check for yourself:
cross_vmec_beta485_ref_bean.txt (14.8 KB)

As an attempt to unblock you while the bug is there, here’s an implementation of Enric’s suggestion:

auto df = ROOT::RDF::MakeCsvDataFrame("cross_vmec_beta485_ref_bean.txt", false, ' ');
auto df2 = df.Define("Col1Double", "stof(Col1)");

where stof converts Col1 (wrongly read in as a std::string) into a float column called Col1Double that you can use to fill histograms.

Hope this helps!


The issue is that the .txt file was generated on Windows. When reading the file on Linux, std::getline returns a string ending in \r. That \r is parsed as part of the second column, and therefore the inferred type is string. You can convert your Windows file to unix format with e.g. dos2unix.

Also, I noticed that the first row contains a 0 in the second column. Provided that you convert the file to unix format, and since it is the first row that is used to infer the types, the second column will be inferred as an integer. You can fix that just by changing that value to 0. (with a trailing decimal point) to force the inferred type to be double.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.