The surface detector of the Telescope Array (TA) experiment is the largest one in the northern hemisphere. We review the machine-learning-based event reconstruction methods being developed by the TA collaboration. The key idea is to use a full detector Monte Carlo simulation to obtain the raw detector signal as a function of the primary particle properties and to train deep convolutional...
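A minimal sketch of the kind of approach described above, not the collaboration's actual network: a small convolutional model that maps a 2D grid of per-station observables to an estimate of a primary-particle property, trained on pairs produced by the full detector Monte Carlo. The input shape (two channels, pulse area and arrival time, on a 16x16 station grid) and the regression target are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SDConvNet(nn.Module):
    """Toy CNN over a grid of surface-detector station signals."""
    def __init__(self, n_channels: int = 2, n_outputs: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_outputs)  # e.g. log10(E) of the primary

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = SDConvNet()
fake_batch = torch.randn(8, 2, 16, 16)  # stand-in for simulated signal maps
print(model(fake_batch).shape)          # torch.Size([8, 1])
```

In practice the training set would be the simulated (signal map, primary property) pairs; the sketch only shows the input/output plumbing.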
The last several years have once again reshaped the state of the art in Computer Vision (CV). Non-convolutional approaches, such as Vision Transformers (ViT) and self-attention multi-layer perceptrons (SA-MLP), are quickly emerging, combined with novel optimization techniques and pre-training methods. Notably, ViTs and SA-MLPs are better suited to incorporating global information about the...
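To make the "global information" point concrete, here is a toy example (dimensions and patch count are arbitrary assumptions): in a single self-attention layer every patch token attends to every other token, whereas a convolution only mixes within its local receptive field.

```python
import torch
import torch.nn as nn

# 64 patch tokens (an 8x8 patch grid) with a 128-dimensional embedding
patches = torch.randn(1, 64, 128)
attn = nn.MultiheadAttention(embed_dim=128, num_heads=4, batch_first=True)

out, weights = attn(patches, patches, patches, need_weights=True)
print(out.shape)      # torch.Size([1, 64, 128])
print(weights.shape)  # torch.Size([1, 64, 64]): each token sees all 64 tokens
```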
The measurement of the mass composition of ultra-high-energy cosmic rays constitutes one of the biggest challenges in astroparticle physics. Detailed information on the composition can be obtained from measurements of the depth of maximum of air showers, Xmax, using fluorescence telescopes, which can be operated only during clear, moonless nights.
Using deep neural networks, it...
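Sketched below, under loudly stated assumptions, is the kind of deep-network regression such an analysis might use: a shared 1D-convolutional encoder over per-station time traces from the surface detector, pooled into a single Xmax prediction. The trace length (120 bins) and station count (9) are illustrative, not the experiment's actual configuration.

```python
import torch
import torch.nn as nn

class XmaxRegressor(nn.Module):
    """Toy regressor: surface-detector time traces -> Xmax (g/cm^2)."""
    def __init__(self, trace_len: int = 120, n_stations: int = 9):
        super().__init__()
        self.trace_net = nn.Sequential(   # encoder shared across stations
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Sequential(
            nn.Linear(16 * n_stations, 64), nn.ReLU(),
            nn.Linear(64, 1),             # predicted Xmax
        )

    def forward(self, traces: torch.Tensor) -> torch.Tensor:
        b, s, t = traces.shape            # (batch, stations, time bins)
        enc = self.trace_net(traces.reshape(b * s, 1, t))
        return self.head(enc.reshape(b, -1))

model = XmaxRegressor()
print(model(torch.randn(4, 9, 120)).shape)  # torch.Size([4, 1])
```

Trained on simulated showers with known Xmax, a model of this type would provide composition-sensitive estimates with the surface detector's near-100% duty cycle.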
We present a method based on Recurrent Neural Networks to extract the muon component from the time traces registered with the water-Cherenkov detector (WCD) stations of the Surface Detector of the Pierre Auger Observatory. With the current design of the WCDs it is not straightforward to separate the contributions of muons to the time traces from those of photons, electrons and positrons...
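A minimal sketch of the idea, not the Auger network itself: a recurrent model reads the total WCD time trace bin by bin and predicts the muonic signal in each bin. The trace length, hidden size, and signal units are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MuonTraceRNN(nn.Module):
    """Toy RNN: total WCD trace -> per-bin muonic component."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden,
                           num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, trace: torch.Tensor) -> torch.Tensor:
        # trace: (batch, time bins, 1), total signal per bin
        h, _ = self.rnn(trace)
        return self.out(h).relu()  # muon signal is non-negative

model = MuonTraceRNN()
total = torch.randn(8, 200, 1).abs()  # stand-in for simulated total traces
muonic = model(total)                 # (8, 200, 1) per-bin muon estimate
```

Training targets would come from shower simulations, where the muonic and electromagnetic contributions to each trace are known separately.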
The Extreme Universe Space Observatory Super Pressure Balloon 2 (EUSO-SPB2) is under development and will prototype instrumentation for future satellite-based missions, including the Probe of Extreme Multi-Messenger Astrophysics (POEMMA). EUSO-SPB2 will consist of two telescopes. The first is a Cherenkov telescope (CT) being developed to identify and estimate the background sources for future...
To properly train neural networks to analyze air shower data, accurate simulations are needed that provide the level of detail required to extract the relevant information. The most popular tool is the current version of CORSIKA, together with its fast option for 1D simulation, CONEX. We will present the basic principles of these tools and how to use them properly....
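As a hedged illustration of the workflow (not taken from the presentation): CORSIKA 7 is typically driven by a steering card piped to the executable on standard input. The keywords below (RUNNR, PRMPAR, ERANGE, ...) are standard CORSIKA steering options; the executable name, paths, and parameter values are assumptions for the sketch.

```python
import subprocess

# Minimal steering card: 10 proton showers at a fixed energy of 10^17 eV.
STEERING = """\
RUNNR   1                      run number
NSHOW   10                     number of showers
PRMPAR  14                     primary particle: proton
ERANGE  1.E8 1.E8              fixed energy in GeV (10^17 eV)
THETAP  0. 60.                 zenith-angle range, deg
PHIP    -180. 180.             azimuth range, deg
SEED    1 0 0                  random-seed sequence 1
SEED    2 0 0                  random-seed sequence 2
OBSLEV  1400.E2                observation level, cm
ECUTS   0.3 0.3 0.003 0.003    energy cuts: hadr, muon, elec, phot (GeV)
EXIT
"""

with open("shower.inp", "w") as f:
    f.write(STEERING)

# Hypothetical executable name; the actual binary depends on the hadronic
# interaction models selected when CORSIKA was compiled.
with open("shower.inp") as inp, open("shower.log", "w") as log:
    subprocess.run(["./corsika77410Linux_QGSII_urqmd"],
                   stdin=inp, stdout=log, check=True)
```

CONEX is enabled as a compile-time option and follows the same steering-card pattern for fast 1D longitudinal-profile simulations.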
The proliferation of innovative next-generation cosmic-ray and neutrino observatories, with unique geometries (Earth-skimming, orbital, in-ice, etc.) and detection techniques (Cherenkov, radio, radar, etc.), requires simulations of ultrahigh-energy particle cascades that are challenging, if not impossible, to perform with current simulation tools like CORSIKA 7 and AIRES. These...
The use of computational algorithms, implemented on a computer, to extract information from data has a history that dates back to at least the middle of the 20th century. However, the confluence of three recent developments has led to rapid advancements in this methodology over the past 15-20 years: the advent of the era of large datasets, in which massive amounts of data can be collected,...