I wasn’t sure where to post for help with Magboltz, so I’m hoping the ROOT newbie forum will be gentle!
I am a PhD student working on the NEWS-G dark matter experiment at Queen’s University, Canada, studying the effect of electronegative contaminants (primarily oxygen) on our signals. NEWS-G has used Magboltz extensively in simulation work for our gaseous detector design, and it has given us reliable predictions; we find it very valuable.
However, when oxygen is included we find a discrepancy: the attachment coefficient predicted by Magboltz is significantly higher than the one we seem to observe. Experiments last week confirmed this: under conditions where we see attachment affecting about 20% of our ionization electrons, Magboltz gives an attachment coefficient so high that almost all of them should be attached.
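To make "almost all" concrete, here is the simple uniform-field estimate I am using to compare the two numbers (our real geometry is spherical, so this is only an order-of-magnitude check, and the 20x factor below is illustrative, not an actual Magboltz output):

```python
import math

def attached_fraction(eta_per_cm, drift_cm):
    """Fraction of drifting electrons captured over drift_cm,
    assuming a constant attachment coefficient eta_per_cm:
    f_attached = 1 - exp(-eta * d)."""
    return 1.0 - math.exp(-eta_per_cm * drift_cm)

# Effective eta*d implied by our observed ~20% attachment:
eta_d_observed = -math.log(1.0 - 0.20)   # ~0.22

# A coefficient ~20x larger (illustrative only) already attaches
# essentially every electron over the same drift distance:
print(attached_fraction(20 * eta_d_observed, 1.0))  # ~0.99
```

So even a modest multiple of our observed effective coefficient would wipe out nearly the whole signal, which is why the size of the discrepancy worries us.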
Our detector design means that we operate at unusually low electric field strengths, with a drift field of only ~1 V/cm (~0.005 Td), and we wonder whether this might be the central issue, since very few measurements of the oxygen attachment coefficient appear to have been published in this regime. For instance, looking at the left panel of Figure 5 in the 2020 paper co-authored by Steve Biagi, "Electron transport in gaseous detectors with a Python-based Monte Carlo simulation code" (DOI: 10.1016/j.cpc.2020.107357), I notice that the lowest field strength plotted (Jeon and Nakamura, 1998) is about 3000 times higher than ours, although Jeon and Nakamura themselves report measurements down to 0.03 Td.
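For reference, this is the unit conversion behind the ~0.005 Td figure, assuming a roughly atmospheric gas number density (our actual operating pressure differs somewhat, so treat the exact value as illustrative):

```python
TOWNSEND = 1e-17   # 1 Td = 1e-17 V*cm^2 by definition
N = 2.45e19        # gas number density in cm^-3 (approx. 1 atm, 20 C)
E = 1.0            # drift field in V/cm

# Reduced field E/N expressed in Townsend units:
e_over_n_td = E / N / TOWNSEND
print(f"{e_over_n_td:.4f} Td")  # ~0.004 Td, consistent with the ~0.005 Td quoted
```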
I wondered whether anyone has any insights or suggestions that might throw some light on this apparent discrepancy. Thank you! (I tried emailing Prof. Biagi, but the CERN email server rejected my message.)