Writing a large RooDataSet to a workspace

I have a problem writing a workspace containing only a RooDataSet and its variables.

I have 2 RooRealVars and many RooCategories (30+). To investigate, I reduced the number of categories and scanned the time needed to write the workspace as a function of the number of variables, going from 5 variables (RooRealVars + RooCategories) up to 19.
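For reference, here is a minimal sketch of how I build and write the workspace. Variable names, category labels, and the number of entries are placeholders; the real code has 30+ categories and a much larger dataset:

```cpp
#include "RooRealVar.h"
#include "RooCategory.h"
#include "RooDataSet.h"
#include "RooWorkspace.h"
#include "RooArgSet.h"
#include "TFile.h"

void makeWorkspace()
{
   // Two continuous observables (placeholder ranges)
   RooRealVar mass("mass", "mass", 5000, 6000);
   RooRealVar decayTime("decayTime", "decay time", 0, 10);

   // One of the many categories (the real code has 30+ like this)
   RooCategory tagCat("tagCat", "tagging category");
   tagCat.defineType("untagged", 0);
   tagCat.defineType("tagged", 1);

   RooArgSet obs(mass, decayTime, tagCat);
   RooDataSet data("data", "data", obs);

   // Fill with dummy values; the real dataset has many more entries
   for (int i = 0; i < 1000000; ++i) {
      mass.setVal(5000 + (i % 1000));
      decayTime.setVal((i % 100) * 0.1);
      tagCat.setIndex(i % 2);
      data.add(obs);
   }

   // Import the dataset into a workspace and write it to a ROOT file
   RooWorkspace w("w");
   w.import(data);

   TFile f("workspace.root", "RECREATE");
   w.Write();
   f.Close();
}
```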

A straight-line fit to these timings gives about 0.6 s/variable to write the workspace to a ROOT file, but only about 0.06 s/variable to write the same dataset to a text file. So writing the ROOT file takes roughly 10x longer than writing the text file. How can this be explained?
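This is roughly how I measure the two writes (a TStopwatch-based sketch, just to show what is being compared; it assumes the `data` and `w` objects from the sketch above, and the per-variable numbers come from repeating this for each point of the scan):

```cpp
#include "RooDataSet.h"
#include "RooWorkspace.h"
#include "TStopwatch.h"
#include "TFile.h"
#include <iostream>

void timeWrites(RooDataSet& data, RooWorkspace& w)
{
   TStopwatch sw;

   // Plain-text dump of the dataset
   sw.Start();
   data.write("data.txt");
   sw.Stop();
   std::cout << "txt  write: " << sw.RealTime() << " s\n";   // ~0.06 s/variable in my scan

   // Workspace (with the dataset) into a ROOT file
   sw.Start(kTRUE);   // reset and restart the stopwatch
   TFile f("workspace.root", "RECREATE");
   w.Write();
   f.Close();
   sw.Stop();
   std::cout << "ROOT write: " << sw.RealTime() << " s\n";   // ~0.6 s/variable in my scan
}
```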

More importantly, when I reach 19 variables I get:

Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)
Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)
Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)
Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)
Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)
Error in <TBufferFile::WriteByteCount>: bytecount too large (more than 1073741822)

If I increase the number of categories beyond 19 (I eventually want to store ~50 variables), the program never finishes. Writing the text file still works, though.