A multitude of ML tasks in particle physics, from unfolding detector effects to refining simulations and extrapolating background estimates, require mapping one arbitrary distribution onto another. Several indirect methods have been developed for this purpose, such as classifier-based reweighting at the distribution level or conditional generative models. Training an ML model to perform a direct, deterministic mapping, however, has long been a challenging prospect.
In this talk, I introduce the concept of Schrödinger Bridges, an ML architecture closely related to Diffusion Models that enables a direct mapping from one arbitrary distribution to another. I demonstrate two implementation approaches with different strengths and present state-of-the-art results from applying Schrödinger Bridges to unfolding and refinement tasks.
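For reference, the Schrödinger Bridge problem is commonly stated as an entropy-regularised transport problem between two prescribed marginals; this is the standard textbook formulation, not a detail specific to this talk:

$$
\mathbb{P}^{\mathrm{SB}} \;=\; \arg\min_{\mathbb{P}} \; \mathrm{KL}\!\left(\mathbb{P} \,\|\, \mathbb{Q}\right)
\quad \text{subject to} \quad \mathbb{P}_{0} = p_{\mathrm{source}}, \;\; \mathbb{P}_{1} = p_{\mathrm{target}},
$$

where $\mathbb{Q}$ is a reference diffusion process (e.g. Brownian motion) and $p_{\mathrm{source}}$, $p_{\mathrm{target}}$ denote the two distributions to be connected. The optimal process can be simulated with forward and backward stochastic differential equations, which is what makes the method a natural relative of Diffusion Models.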