
Transformers for Generalized Fast Shower Simulation.

Authors:
Raikwar, Piyush
Cardoso, Renato
Chernyavskaya, Nadezda
Jaruskova, Kristina
Pokorski, Witold
Salamani, Dalila
Srivatsa, Mudhakar
Tsolaki, Kalliopi
Vallecorsa, Sofia
Zaborowska, Anna
Source:
EPJ Web of Conferences, Vol. 295 (2024), pp. 1–8.
Publication Year:
2024

Abstract

Recently, transformer-based foundation models have proven to be a generalized architecture applicable to various data modalities, ranging from text to audio and even combinations of multiple modalities. By design, transformers should accurately model the non-trivial structure of particle showers, thanks to the absence of strong inductive bias, better modeling of long-range dependencies, and their interpolation and extrapolation capabilities. In this paper, we explore a transformer-based generative model for detector-agnostic fast shower simulation, where the goal is to generate synthetic particle showers, i.e., the energy depositions in the calorimeter. When trained on an adequate amount and variety of showers, such models should learn better representations than other deep learning models and hence adapt quickly to new detectors. In this work, we present a prototype of a transformer-based generative model for fast shower simulation and explore certain aspects of the transformer architecture, such as input data representation, sequence formation, and the learning mechanism for our unconventional shower data.
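
As a concrete illustration of the "sequence formation" idea mentioned in the abstract, the sketch below shows one plausible way to flatten a voxelized calorimeter shower into a sequence of energy depositions and model it with a small autoregressive transformer in PyTorch. This is a minimal sketch under assumed conventions, not the authors' implementation: the voxel count, readout order, model sizes, and the MSE training objective are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation): a voxelized
# shower is flattened into a fixed readout order, each voxel's energy is
# embedded, and a causal transformer predicts each voxel's energy from the
# preceding ones.
import torch
import torch.nn as nn

class ShowerTransformer(nn.Module):
    def __init__(self, n_voxels=648, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Project each scalar energy deposition to an embedding; a learned
        # positional embedding encodes the voxel's index in the readout order.
        self.energy_proj = nn.Linear(1, d_model)
        self.pos_emb = nn.Embedding(n_voxels, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)  # predict the next voxel's energy

    def forward(self, energies):
        # energies: (batch, seq_len, 1) -- voxel energies flattened into a
        # fixed order (the "sequence formation" step).
        seq_len = energies.size(1)
        pos = torch.arange(seq_len, device=energies.device)
        x = self.energy_proj(energies) + self.pos_emb(pos)
        # Causal mask: each position attends only to earlier voxels.
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(
            energies.device
        )
        return self.head(self.encoder(x, mask=mask))

# Toy training step with teacher forcing: predict voxel i from voxels < i.
model = ShowerTransformer()
shower = torch.rand(8, 648, 1)                 # batch of 8 synthetic showers
pred = model(shower[:, :-1])
loss = nn.functional.mse_loss(pred, shower[:, 1:])
loss.backward()
```

At generation time such a model could be sampled voxel by voxel, feeding each predicted energy back in as input; the paper's actual data representation, sequence order, and learning mechanism may differ from this sketch.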

Details

Language:
English
ISSN:
2101-6275
Volume:
295
Database:
Academic Search Index
Journal:
EPJ Web of Conferences
Publication Type:
Conference
Accession Number:
177902537
Full Text:
https://doi.org/10.1051/epjconf/202429509039