IceCube's low-energy extension, DeepCore, observes GeV-scale atmospheric neutrino events that illuminate only a small fraction of its photosensors. This sparsity makes event reconstruction challenging, a challenge magnified by complex neutrino event topologies and the difficulty of modeling Cherenkov photon propagation in an inhomogeneous medium. These effects inhibit analytic modeling, requiring instead detailed simulations to establish the expectation values for each event hypothesis.
Current likelihood-based reconstruction methods approximate light propagation in the detector with simulations that rely on simplifying symmetry assumptions. Neural networks could in principle improve this approximation, but many network architectures cannot handle the inhomogeneous detector geometry.
In the near future, the IceCube Upgrade will add several more densely instrumented strings to the existing array. The denser instrumentation will collect more light from GeV-scale events, but it also complicates the methods currently used to approximate the expected light yield at each sensor.
Here we introduce a hybrid machine-learning likelihood approach to reconstruct DeepCore events. Its high flexibility makes it easily transferable to the Upgrade, or indeed to any detector geometry. It is also roughly 100 times faster than current reconstructions without sacrificing the benefits of a likelihood, such as per-event parameter uncertainty estimates.
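As a toy illustration only (not the paper's actual implementation), the hybrid idea can be sketched as follows: a learned surrogate supplies per-sensor light expectations for a given event hypothesis, and a standard Poisson likelihood over those expectations drives the fit and yields per-event uncertainties. The one-dimensional "detector", the surrogate function, and all numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy detector: 20 sensors along a line (positions in meters).
# In reality this would be the full 3-D sensor geometry.
sensor_pos = np.linspace(-100.0, 100.0, 20)

# Hypothetical stand-in for a trained network: expected photon
# counts at each sensor as a function of event energy (GeV).
def expected_counts(energy):
    return energy * np.exp(-np.abs(sensor_pos) / 60.0) * 0.1

# Simulate one event at a "true" energy, then reconstruct it by
# scanning the Poisson likelihood over energy hypotheses.
true_energy = 25.0
observed = rng.poisson(expected_counts(true_energy))

def neg_log_likelihood(energy):
    lam = expected_counts(energy)
    # Poisson -log L per sensor, constant log(k!) terms dropped.
    return np.sum(lam - observed * np.log(lam))

energies = np.linspace(5.0, 60.0, 551)
nll = np.array([neg_log_likelihood(e) for e in energies])
best = energies[np.argmin(nll)]

# 1-sigma interval from Delta(-log L) = 0.5: the likelihood-based
# per-event uncertainty estimate referred to in the text.
within = energies[nll - nll.min() < 0.5]
lo, hi = within.min(), within.max()
print(f"best-fit energy: {best:.1f} GeV, interval: [{lo:.1f}, {hi:.1f}]")
```

In the real method the surrogate would be evaluated over the full event hypothesis (vertex, direction, energy, topology) and the likelihood minimized numerically rather than by a grid scan, but the division of labor is the same: the network replaces the expensive light-expectation lookup, while the likelihood machinery is retained.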