A novel algorithm to reconstruct events in a water Cherenkov detector

Mo Jia, Karan Kumar, Liam S. Mackey, Alexander Putra, Cristovao Vilela, Michael J. Wilking, Junjie Xia, Chiaki Yanagisawa, Karan Yang

Research output: Contribution to journal › Conference article › peer-review

Abstract

We have developed a novel approach to reconstruct events detected by a water-based Cherenkov detector, such as Super- and Hyper-Kamiokande, using an innovative deep learning algorithm. The algorithm is based on a generative neural network whose parameters are obtained by minimizing a loss function. In the training process with simulated single-particle events, the generative neural network is given the particle identification (ID) or type, the 3D momentum (p), and the 3D vertex position (V) as inputs for each training event. The network then generates a Cherenkov event that is compared with the corresponding true simulated event. Once training is complete, for a given Cherenkov event the algorithm provides the best estimates of ID, p, and V by minimizing the loss function between the given event and the generated event over the ranges of input values of ID, p, and V. The algorithm also serves as a type of fast simulation for a water Cherenkov detector, with fewer assumptions than traditional reconstruction methods. We present the architecture and operating principle of the network, together with examples of the algorithm's performance.
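The following is a minimal sketch, not the authors' implementation, of the reconstruction idea described above: a generative network maps (ID, p, V) to a predicted detector response, and reconstruction fits (ID, p, V) to an observed event by minimizing the loss between the generated and observed hit patterns. The PMT count, layer sizes, flat charge-vector representation, and use of an MSE loss are all illustrative assumptions.

```python
# Illustrative sketch (assumed architecture, not from the paper) of
# reconstruction by minimizing a generative network's loss over its inputs.
import torch
import torch.nn as nn

N_PMT = 1000   # assumed number of PMT channels (illustrative)
N_ID = 3       # assumed number of particle-type hypotheses (illustrative)

class Generator(nn.Module):
    """Maps (one-hot ID, 3D momentum, 3D vertex) to predicted PMT charges."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_ID + 3 + 3, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, N_PMT),
        )

    def forward(self, pid_onehot, p, v):
        return self.net(torch.cat([pid_onehot, p, v], dim=-1))

def reconstruct(gen, observed, n_steps=500, lr=1e-2):
    """Fit (ID, p, V) to one observed event (shape [1, N_PMT]).

    Continuous parameters (p, V) are optimized by gradient descent for each
    discrete ID hypothesis; the hypothesis with the smallest loss wins.
    """
    best = None
    for pid in range(N_ID):
        onehot = torch.zeros(1, N_ID)
        onehot[0, pid] = 1.0
        p = torch.zeros(1, 3, requires_grad=True)  # initial momentum guess
        v = torch.zeros(1, 3, requires_grad=True)  # initial vertex guess
        opt = torch.optim.Adam([p, v], lr=lr)
        for _ in range(n_steps):
            opt.zero_grad()
            loss = nn.functional.mse_loss(gen(onehot, p, v), observed)
            loss.backward()
            opt.step()
        if best is None or loss.item() < best[0]:
            best = (loss.item(), pid, p.detach(), v.detach())
    return best  # (loss, ID hypothesis, fitted momentum, fitted vertex)
```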

Original language: English (US)
Article number: 976
Journal: Proceedings of Science
Volume: 414
State: Published - 2022
Externally published: Yes
Event: 41st International Conference on High Energy Physics, ICHEP 2022 - Bologna, Italy
Duration: Jul 6, 2022 - Jul 13, 2022

Bibliographical note

Publisher Copyright:
© Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0)
