TESTING AND EVALUATION OF DATA ANALYTIC APPROACHES FOR NONPROLIFERATION

Year
2023
Author(s)
Dylan Anderson - Sandia National Laboratories
Scott L. Stewart - Oak Ridge National Laboratory
Alexei N. Skurikhin - Los Alamos National Laboratory
Karl Pazdernik - Pacific Northwest National Laboratory
Joel Brogan - Oak Ridge National Laboratory
Nathan Martindale - Oak Ridge National Laboratory
Abstract

Advanced data analytics continue to promise faster machine-assisted analyst workflows, integration of vast troves of open-source and multimodal information, and discovery of new, previously undetected events of interest in international safeguards, arms control, and nonproliferation applications. Realizing these promises requires robust analytic validation and performance measurement, which classically relies on a validation dataset drawn randomly from the available data and withheld during development for a final assessment before deployment. However, these applications are characterized by high consequences, sparse data and rare events, and significant, often nonrandom, uncertainty. Under these conditions, a classical validation often fails to capture real-world data complexity or relies on metrics that do not reflect how the analytics will be used to support monitoring or decision-making in practice. Furthermore, the inadvertent leakage of validation information into analytic development can inflate assessed performance, yielding overly optimistic expectations of success. Finally, analytics must be continually monitored during use to ensure they still meet expected performance requirements, as the data being analyzed may no longer be well represented by the data used for model training. This session highlighted several case studies detailing validation failures, lessons learned, and best practices for the development of advanced data analytics for safeguards, arms control, and nonproliferation applications, as well as approaches to continually monitoring analytics in production.
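
To make the holdout discipline described above concrete, the following is a minimal sketch, not drawn from the session itself: it assumes scikit-learn and synthetic data, and shows the validation split being withheld from every stage of development, including preprocessing, so that no validation information leaks into the fitted model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Synthetic stand-in data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)

# Withhold the validation set before any fitting or feature scaling.
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Bundling the scaler with the model ensures its statistics are learned
# from development data only; fitting the scaler on the full dataset first
# is a common source of the leakage and inflated metrics noted above.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_dev, y_dev)

# Final, one-time assessment on the withheld split.
print(classification_report(y_val, model.predict(X_val)))
```

The abstract also calls for continual monitoring of analytics in production against drift away from the training data. One simple, illustrative approach, again an assumption for demonstration rather than a method from the session, is a per-feature two-sample Kolmogorov-Smirnov test comparing incoming data to a training reference; the function name `drifted_features`, the placeholder `X_incoming`, and the 0.01 threshold are all hypothetical choices.

```python
from scipy.stats import ks_2samp

def drifted_features(reference, live, alpha=0.01):
    """Return indices of feature columns whose live distribution
    differs significantly from the training reference."""
    flagged = []
    for j in range(reference.shape[1]):
        result = ks_2samp(reference[:, j], live[:, j])
        if result.pvalue < alpha:
            flagged.append(j)
    return flagged

# Usage sketch: alert when incoming data is no longer well represented
# by the training data, e.g.
#   if drifted_features(X_dev, X_incoming):
#       trigger_review_or_retraining()
```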