LoadMacro remote file (http)

Hi! I would like to load a remote macro, but I would like to know if there is a "ROOT way" to do it ("ROOT way" meaning not gSystem->Exec with wget or curl and then LoadMacro).
Thank you!

Hi Adrian,

I think we do not support that feature…

Cheers,
D

Would it be possible to add? Something like TWebFile, but not only for ROOT files and without requiring some (inaccessible) Apache extension to access the file, and then making LoadMacro and related functions able to access remote files?

And maybe, in the future, a ROOT implementation of git that would make possible a kind of module mechanism à la Go?
Thank you!

Just use Go, then :slight_smile:

You can already read a fair amount of ROOT files (natively) and you have access to gonum.org, a fair approximation of numpy/scipy for Go. And, there’s Go-HEP.org for the rest :slight_smile:

I’d be interested to know what’s missing for your use case (besides writing ROOT files from Go)

I cannot (and have no incentive to) rebase my analysis on Go; at best I would have Go load a lot of ROOT-based libraries and call ROOT functions, and there is no gain in that…
I was/am trying to load a macro (or more) directly from GitHub. This way I can have some kind of modules for my analysis and decrease the number of files packed for a job.

at best I would have Go load a lot of ROOT-based libraries and call ROOT functions, and there is no gain in that…

I wasn’t implying that. C++ is (barely!) the only language that can load/link to C++. Using C++ from any other language usually implies writing C shim libraries (because there’s no standard C++ ABI), and it’s a sub-par experience (for a number of reasons) in many languages, though not all of them. That’s definitely the case for Go.
Besides: once you’ve tasted the quick development cycle of a pure-Go toolchain, the great deployment story (static binaries FTW!), builtin concurrency, builtin reflection, etc., it’s hard to go back to something like C++ or Python :slight_smile:

this way I can have some kind of modules for my analysis and decrease the number of files packed for a job

Retrofitting modules into a 40+-year-old language that is still based on #include is, IMO, asking for a lot of pain down the line.
Also: basing an analysis pipeline on reliable access to many remote source files is probably asking for trouble. GitHub is relatively reliable (and when it’s down, it’s quickly back up), but it doesn’t strike me as a robust solution. A convenient one, for sure, but I wouldn’t call it robust. It’s also a bit questionable, reproducibility-wise: making sure you get a consistent set of foobar.C macros across runs could be quite a hassle (doable if you record somewhere the SHA1 of the repo commit for the file(s) you HTTP-GET, but still quite painful to reconstruct and track after the fact).

anyways, apologies for hijacking the thread.

PS: just poking some fun at ROOT (all in good spirit): it’s a bit ironic that the library/toolkit/framework born at CERN still has no support, in 2018, for a good HTTP library (something “vague, but exciting”… :stuck_out_tongue:)

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.