Uncertainties in Enrichment Measurements: A Comparative Study of High-Purity Germanium Versus LaBr3(Ce) Gamma-Spectrometers

Year
2024
Author(s)
A. Berlizov - IAEA
K. Krzysztoszek - IAEA
R. Binner - IAEA
A. Lebrun - IAEA
Abstract

The effectiveness of the safeguards system strongly depends on the measurement uncertainties of the non-destructive assay (NDA) methods and instrumentation used by safeguards inspectors and nuclear fuel cycle facility operators. As a rule, the smaller the measurement uncertainties, the higher the sensitivity of statistical tests for detecting a potential diversion from the stock of declared nuclear material. Uranium enrichment is one of the key parameters (along with the total material mass or volume and the elemental uranium content) needed to determine the 235U mass in an item, e.g. a 30B or 48Y cylinder containing uranium hexafluoride (UF6). Measurements applied to UF6 cylinders often utilize high-resolution gamma-ray spectrometers. In the last decade, the three instruments most commonly applied by Agency inspectors were an electrically cooled high-purity germanium (HPGe) system, a liquid-nitrogen-cooled HPGe system and a LaBr3(Ce) scintillation detector. In the course of the International Target Values (ITV) 2020 Project, the Agency's uncertainty quantification methodology was applied to derive the measurement uncertainties for these systems. The analysis of the calculated uncertainties revealed substantial differences between the three instruments, which were attributed to differences in their design and performance characteristics, such as collimation, energy resolution, efficiency and throughput. The results of the uncertainty evaluations and their interpretation are presented in the paper.
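
To make the role of the enrichment uncertainty concrete, the minimal sketch below shows how the three quantities named in the abstract (total material mass, elemental uranium content and enrichment) combine into a 235U mass, and how their relative uncertainties propagate into the result under the assumption of uncorrelated inputs. The function name and the numerical values are illustrative only and are not taken from the paper.

```python
import math

def u235_mass_and_uncertainty(net_mass_kg, rel_u_mass,
                              uranium_fraction, rel_u_fraction,
                              enrichment, rel_u_enrichment):
    """Combine net UF6 mass, elemental uranium fraction and 235U enrichment
    (fractions, each with a relative standard uncertainty) into a 235U mass.

    Assumes the three inputs are uncorrelated, so their relative
    uncertainties add in quadrature (first-order propagation for a product).
    """
    m235 = net_mass_kg * uranium_fraction * enrichment
    rel_m235 = math.sqrt(rel_u_mass**2 + rel_u_fraction**2 + rel_u_enrichment**2)
    return m235, m235 * rel_m235

# Hypothetical example: a UF6 cylinder with 1000 kg net weight (0.1 % rel. unc.),
# uranium mass fraction 0.676 (0.05 % rel. unc.) and 4.0 wt% 235U enrichment
# measured by gamma spectrometry with 0.5 % relative uncertainty.
m235, u_m235 = u235_mass_and_uncertainty(1000.0, 0.001, 0.676, 0.0005, 0.040, 0.005)
print(f"235U mass: {m235:.2f} kg +/- {u_m235:.2f} kg")
```

In a sketch of this kind the enrichment term typically dominates the combined uncertainty, which illustrates the abstract's point: lowering the enrichment measurement uncertainty of the gamma-spectrometric instrument directly tightens the 235U mass estimate and thereby the statistical tests comparing declared and measured material.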