Experimentelle Physik 5 Astroteilchenphysik

Recent Submissions

  • Item
    One for all
    (2024) Geyer, Felix; Rhode, Wolfgang; Franckowiak, Anna
    With the help of radio interferometry, humanity can probe the universe at the highest resolutions possible, enabling in-depth studies of the physical processes driving the known cosmos. In recent years, the need for increased automation of the analysis pipelines in radio interferometry has arisen, mainly due to the increased data rates of the current and planned generations of radio interferometers. In several areas of science, this automation is handled by deep learning techniques, namely neural networks. In this thesis, I introduce a comprehensive analysis pipeline comprising physics-based radio galaxy simulations and the analytical simulation of the measurement process of radio interferometers. Furthermore, several deep learning models are trained to reconstruct the simulated incomplete observations and obtain cleaned versions of radio galaxies. Utilizing several evaluation metrics, the training process is adapted and improved with regard to the goal of reconstructing a protoplanetary disk from the DSHARP data set. Additionally, the neural network approach is extended to include the estimated uncertainty of the prediction. Finally, I present a neural network-based reconstruction of the protoplanetary disk “Elias 24”.
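The core of the simulated measurement process can be sketched in a few lines: an interferometer samples the Fourier transform of the sky only where its baselines provide coverage, and transforming the sparsely sampled visibilities back yields the artifact-laden "dirty" image that such networks learn to clean. A minimal illustration (the grid size and the 30% random coverage are arbitrary choices for this sketch, not values from the thesis):

```python
import numpy as np

def dirty_image(sky, uv_mask):
    """Idealized interferometric measurement: sample the sky's Fourier
    transform only where uv_mask is True, then transform back to obtain
    the artifact-laden 'dirty' image."""
    vis = np.fft.fftshift(np.fft.fft2(sky))      # full visibility plane
    vis_sampled = np.where(uv_mask, vis, 0)      # sparse uv-coverage
    return np.real(np.fft.ifft2(np.fft.ifftshift(vis_sampled)))

# toy example: a single point source observed with ~30% uv-coverage
rng = np.random.default_rng(42)
sky = np.zeros((64, 64))
sky[32, 32] = 1.0
mask = rng.random((64, 64)) < 0.3
dirty = dirty_image(sky, mask)
```

The dirty image still peaks at the true source position, but the incomplete sampling spreads flux into noise-like artifacts across the map, which is exactly the structure a cleaning algorithm has to remove.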
  • Item
    Pulsar analyses with the MAGIC experiment
    (2024) Schubert, Jan Lukas; Elsässer, Dominik; Delitzsch, Chris
    Pulsars are fast-rotating neutron stars that emit pulsed electromagnetic radiation. In particular, the discovery of pulses with energies above 100 GeV raises new questions about the underlying emission mechanisms. So far, only three pulsars in this energy regime are known. The MAGIC telescopes are sensitive from approximately 50 GeV, but can lower their energy threshold down to 20 GeV with optimized trigger criteria (the Sum-Trigger-II) and thus become suitable instruments for the observation of high-energy pulsars. In this work, the Crab and Dragonfly Pulsars are studied. Due to the large amount of data recorded over more than ten years, the analysis is automated. The basis for this is autoMAGIC, a database-supported tool that enables reproducible analyses with minimal human interaction. The special requirements of pulsar analyses are implemented in autoMAGIC, and a standardized dataset for the subsequent production of physical results is generated. In recent years, great efforts have been made in gamma-ray astronomy to standardize the data across different telescopes and to analyze them using unified software. In this context, the open-source project Gammapy was developed. As part of this work, the Python package magicpulsar was developed, which computes the pulsar timing and large parts of the Gammapy analysis based on the standardized dataset from autoMAGIC. The Crab Pulsar is clearly detected based on ∼480 h of observation time, with 15.47 𝜎 for peak P1, 24.30 𝜎 for peak P2, and 8.85 𝜎 for the bridge. The energy spectra for these peaks and other physically interesting phase ranges are generated and agree with the Fermi-LAT results. A decreasing P1/P2 ratio between ∼20 GeV and ∼800 GeV is observed. The analysis of 27 h of Dragonfly data delivers no pulsed signal.
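The phase-folding step that underlies any pulsar analysis can be sketched as follows: each event time is mapped to a rotational phase via the pulsar's timing model, and a pulsed signal shows up as a peak in the phase histogram. This is a deliberately simplified sketch (real analyses first barycenter the arrival times and use measured ephemerides; the frequency and event counts below are illustrative only):

```python
import numpy as np

def fold_phases(times, f0, f1=0.0, t_ref=0.0):
    """Rotational phase of each event for a pulsar with frequency f0 and
    frequency derivative f1 (a simplified timing model; real analyses
    also barycenter the arrival times)."""
    dt = np.asarray(times) - t_ref
    return (f0 * dt + 0.5 * f1 * dt**2) % 1.0

# toy data: a pulsed signal at phase 0.4 on top of a flat background
rng = np.random.default_rng(0)
f0 = 29.6                                       # Hz, roughly the Crab Pulsar
bkg_times = rng.uniform(0.0, 100.0, 5000)       # uniform background events
cycles = rng.integers(0, 2960, 500)             # random rotation numbers
sig_times = (cycles + rng.normal(0.4, 0.02, 500)) / f0
phases = fold_phases(np.concatenate([bkg_times, sig_times]), f0)
hist, edges = np.histogram(phases, bins=50, range=(0.0, 1.0))
```

The background folds to a flat phase distribution, while the pulsed events pile up near phase 0.4; the on-phase and off-phase regions of such a histogram are what the quoted peak significances are computed from.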
  • Item
    Saving time and money for Monte Carlo
    (2024) Dominik, Rune Michael; Rhode, Wolfgang; Tjus, Julia
    The Cherenkov Telescope Array Observatory (CTAO) will be the next-generation ground-based very-high-energy (VHE) gamma-ray observatory once its construction and commissioning are finished. Like its predecessors, CTAO relies on Instrument Response Functions (IRFs) to relate the observed and reconstructed properties to the true ones of the primary gamma-ray photons and thus reconstruct spectral and spatial information of the observed sources. As IRFs are derived from Monte Carlo simulations and depend on observation conditions like telescope pointing and atmospheric transparency, producing a complete set of IRFs is a time-consuming task and not feasible when analyzing data on short timescales. To facilitate the production of optimized IRFs in such scenarios, this work studies the use of inter- and extrapolation algorithms to quickly compute IRFs from a pre-computed grid for the Large-Sized Telescope prototype (LST-1) using the pyirf Python software package. As some constituents of an IRF are given as probability distributions, specialized methods are needed. Using 35.9 hours of LST-1 Crab Nebula observations taken at zenith angles up to 35 degrees, this thesis shows the compatibility of estimated IRFs and a nearest neighbor approach on the provided LST-1 simulation grid. When using sparser grids, estimated IRFs maintain a stable performance well beyond the point where the nearest neighbor approach can no longer yield reasonable results. Applying estimated IRFs to observations of NGC 1275 from December 2022 and January 2023 in the same zenith range shows clear signs of two flares in this period, matching the signature obtained from past events. Estimated IRFs thus prove fully capable of supporting LST-1 analyses in a zenith range of up to 35 degrees.
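The basic idea of estimating IRFs from a pre-computed grid can be illustrated with a linear interpolation of the effective area between zenith-distance nodes, done independently per energy bin. This is a minimal stand-in for the far more careful machinery in pyirf (which, as noted above, needs specialized methods for the distribution-valued IRF components); the grid values below are invented for illustration:

```python
import numpy as np

def interpolate_aeff(zd_grid, aeff_grid, zd_target):
    """Estimate the effective area at an arbitrary zenith distance by
    linear interpolation between pre-computed grid nodes, per energy bin
    (a simplification of grid-based IRF estimation)."""
    aeff_grid = np.asarray(aeff_grid, dtype=float)
    return np.array([
        np.interp(zd_target, zd_grid, aeff_grid[:, i])
        for i in range(aeff_grid.shape[1])
    ])

# hypothetical grid: effective area (m^2) in 3 energy bins at 3 zenith nodes
zd_grid = [10.0, 25.0, 40.0]          # zenith distance in degrees
aeff_grid = [[1e4, 5e4, 9e4],
             [8e3, 4e4, 8e4],
             [5e3, 3e4, 6e4]]
aeff_20 = interpolate_aeff(zd_grid, aeff_grid, 20.0)
```

A nearest-neighbor approach would instead snap the 20-degree observation to the 25-degree node; on a dense grid the two agree well, but on sparse grids the interpolated estimate keeps tracking the smooth zenith dependence where nearest-neighbor selection breaks down.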
  • Item
    We are number one
    (2024) Nickel, Lukas; Rhode, Wolfgang; Albrecht, Johannes
    A multitude of gamma-ray sources has been discovered with the current generation of Imaging Atmospheric Cherenkov Telescope (IACT) experiments. Nevertheless, fundamental questions remain open: How are the particles of the cosmic rays accelerated? What can we learn about the extreme environments near black holes and supernovae? Will dark matter ever be found? Answering these questions requires both experiments that are many times more sensitive than those currently available and new methods of data analysis. To this end, a new observatory is currently under construction, the Cherenkov Telescope Array Observatory (CTAO). The first telescope prototype, the Large-Sized Telescope prototype (LST-1) on La Palma, was inaugurated in 2018. Since then, it has been taking data as part of the commissioning process and producing first scientific results while the remaining telescopes are being built. In this work, data from observations of the radio source M87, recorded by the LST-1 between April 2021 and January 2024, are analyzed. For this purpose, the low-level LST-1 software lstchain and the CTAO science tool gammapy are used. A three-dimensional analysis is developed and models for the background estimation are created. These show that the assumption of radial symmetry is generally not fulfilled and that the detector response should be described in a more general form. The measured excess from the direction of M87 is not incompatible with the background expectation, so a detection of the source is not possible. However, modeling the source excess yields results that are compatible with earlier measurements of M87 during phases of low activity.
  • Item
    Enabling next-generation particle cascade simulations
    (2024) Alameddine, Jean-Marco; Rhode, Wolfgang; Kröninger, Kevin
    The observation of cosmic messenger particles provides unique insights into astrophysical processes. As high-energy nuclei or photons reach the Earth's atmosphere, a particle cascade called extensive air shower is initiated. By detecting secondary shower particles, the energy, direction, and identity of the initial messenger particle can be reconstructed. This task heavily relies on an accurate simulation of particle cascades. A major challenge in this context is the muon puzzle – a significant, as yet unresolved muon deficit in air shower simulations compared to experimental observations. To overcome limitations of existing shower simulation codes, the next-generation simulation framework CORSIKA 8 is currently under development. This work describes the implementation of the Monte Carlo simulation code PROPOSAL as the first electromagnetic and muonic interaction model in CORSIKA 8. PROPOSAL describes muon interactions with maximal precision, minimizing possible systematic uncertainties in the description of the muonic shower component in CORSIKA 8. The code structure of PROPOSAL is modularized, and a description of electromagnetic processes is implemented. An interface between CORSIKA 8 and PROPOSAL is written, which is validated by comparisons with previous CORSIKA versions. Notably, these validations reveal that CORSIKA 8 shows a 5% increase of muons in hadronic showers. As this number is insufficient to solve the muon puzzle, inaccuracies in muon propagation are ruled out as a possible cause. Furthermore, this work allows for the first physics-complete shower simulations with CORSIKA 8, which is a crucial step toward its first release. For the scientific community, CORSIKA 8 is going to be a powerful tool for the simulation and investigation of particle cascades, especially to further understand and solve the muon puzzle.
  • Item
    Optical photon emission in extended air showers
    (2023) Baack, Dominik; Rhode, Wolfgang; Kröninger, Kevin
    With the motivation to improve experimental gains and precision, established astroparticle experiments are currently undergoing massive upgrades. In addition, several new experiments are being built or planned. With the resulting gain in observational quality, the amount and accuracy of simulated data required for the analysis is also rising. In order to meet the increasing requirements and complexity due to the experiments’ growth and to provide a unified software ecosystem, it was decided to re-develop the de facto standard extensive air shower simulation CORSIKA completely in C++ based on the original Fortran code. Since one of the largest runtime consumers is the propagation of millions of optical Cherenkov and fluorescence photons, and many experiments are starting to use them for measurements, it was decided to develop hardware-accelerated code to speed up the simulation. Specific methods have been developed to propagate photons on deep learning acceleration hardware similar to classical GPUs to take additional advantage of the current and future growth of the deep learning sector. In particular, Nvidia accelerators were tested.
  • Item
    Atmospheric seasoning
    (2024) Hymon, Karolin; Rhode, Wolfgang; Delitzsch, Chris Malena
    Besides the detection of astrophysical neutrinos, atmospheric neutrinos from cosmic-ray-induced air showers are detected at unprecedented statistics with the IceCube Neutrino Observatory. The conventional component of the atmospheric neutrino flux is produced in decays of kaons and pions. Due to seasonal changes in the atmospheric temperature, the neutrino flux undergoes a seasonal variation: when the temperature increases, the atmosphere expands, and more neutrinos are expected to be produced. Additionally, the seasonal variation increases with energy, as the parent particles interact at higher altitudes in the atmosphere, where seasonal temperature variations are larger; with increasing energy, the interaction cross section grows and the probability for the parent meson to decay increases. The investigation of seasonal variations serves as an accurate background determination in the search for astrophysical neutrinos and as a probe of hadronic interactions in atmospheric particle cascades. In this thesis, seasonal variations in the atmospheric neutrino flux are measured energy-dependently for the first time, based on 11.5 years of IceCube data. The determination of the neutrino energy presents an ill-conditioned inverse problem, requiring the energy to be inferred from measured detector quantities. This challenge is addressed by the Dortmund Spectrum Estimation Algorithm (DSEA+), which utilizes machine learning methods to unfold the neutrino energy. The determined variation strength is compared to theoretical predictions from MCEq, and in particular to the calculation with the atmospheric model NRLMSISE-00.
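The first-order relation behind such seasonal studies is commonly written as ΔR/⟨R⟩ = α · ΔT/⟨T⟩, with R the event rate, T the (effective) atmospheric temperature, and α the correlation coefficient quantifying the variation strength. A least-squares fit of α can be sketched in a few lines; the toy monthly data below, with α = 0.8 built in, is purely illustrative and not from the thesis:

```python
import numpy as np

def seasonal_coefficient(rates, temps):
    """Fit alpha in dR/<R> = alpha * dT/<T>, the standard first-order
    relation between atmospheric lepton rates and temperature, as a
    least-squares slope through the origin."""
    dr = (rates - rates.mean()) / rates.mean()
    dt = (temps - temps.mean()) / temps.mean()
    return np.sum(dr * dt) / np.sum(dt**2)

# toy monthly data with alpha = 0.8 built in
months = np.arange(12)
temps = 220.0 + 5.0 * np.sin(2 * np.pi * months / 12)   # effective T in K
rates = 100.0 * (1 + 0.8 * (temps - temps.mean()) / temps.mean())
alpha = seasonal_coefficient(rates, temps)
```

Measuring α in bins of unfolded neutrino energy, as the thesis does, then turns this single number into an energy-dependent variation strength that can be compared to MCEq predictions.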
  • Item
    Unfolding the muon neutrino flux
    (2023) Kardum, Leonora; Rhode, Wolfgang; Westphal, Carsten
    The IceCube Neutrino Observatory, situated at the South Pole and instrumenting a cubic kilometer of ice deep below the surface, is a state-of-the-art experiment for detecting particles of high energies, with a special focus on investigating neutrino physics. The neutrino flux can be divided into three distinct components: astrophysical, originating from extraterrestrial sources; conventional, arising from the decay of pions and kaons in atmospheric cosmic-ray cascades; and the prompt component, which has yet to be detected and stems from the decay of charmed hadrons. This study aims to reconstruct the total flux of neutrinos at Earth and places a particular emphasis on examining the predicted angular dependence. Unfolding encompasses a collection of techniques that aim to determine a quantity in a manner largely independent of the assumptions made during measurement and reconstruction. In this analysis, the energy spectrum of muon neutrinos is unfolded with the help of an innovative technique for reshaping the observable space, ensuring an adequate number of events in the low-statistics region at the highest energies. This work presents the unfolded energy and zenith-angle spectrum reconstructed from eleven years of IceCube data over energies from 500 GeV to 4 PeV, and compares the findings with both model predictions and previous measurements.
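The essence of unfolding can be sketched with a toy response matrix: measured counts g relate to the true spectrum f via g = A f, and because A is ill-conditioned, a regularized inversion is needed to keep noise amplification under control. The Tikhonov-regularized least-squares solution below is a minimal stand-in for the more refined methods used in practice (the bin migrations and counts are invented for illustration):

```python
import numpy as np

def unfold_tikhonov(A, g, tau):
    """Regularized unfolding of measured counts g with response matrix A:
    minimize ||A f - g||^2 + tau * ||f||^2 (Tikhonov regularization)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + tau * np.eye(n), A.T @ g)

# toy example: smearing between three neighboring energy bins
A = np.array([[0.8, 0.2, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.2, 0.8]])
f_true = np.array([100.0, 50.0, 10.0])
g = A @ f_true                       # noiseless 'measured' counts
f_hat = unfold_tikhonov(A, g, tau=1e-3)
```

The regularization strength tau trades a small bias for stability; choosing it, together with reshaping the observable space so that the highest-energy bins remain populated, is where the real analysis effort lies.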
  • Item
    Spectral and spatial analysis of MAGIC telescope data in a standardized format
    (2023) Mender, Simone; Rhode, Wolfgang; Delitzsch, Chris Malena
    The precise understanding of the emission and acceleration processes of very-high-energy radiation in the Universe is still an unsolved mystery today. To study the nature of very-high-energy gamma rays, Imaging Air Cherenkov Telescopes such as the MAGIC telescopes detect Cherenkov light produced by particle showers in the atmosphere. State-of-the-art spectral and spatial analyses of gamma-ray data rely on the open-source Python package Gammapy. With this approach from the gamma-ray community, the input data needs to be provided in a standardized format, which requires combining event lists with instrument response functions. In this thesis, a spectral analysis of the two TeV radio galaxy candidates TXS 0149+710 and 4C +39.12 observed by MAGIC is conducted. For this, standardized data is produced in an automated and reproducible way using the new database-driven tool AutoMAGIC, partly developed in the course of this thesis. Li&Ma significances of 0.32 𝜎 and 0.98 𝜎 are calculated for TXS 0149+710 and 4C +39.12, respectively. Therefore, only upper limits on the differential flux are given. For spatial analyses, background models have to be included in the standardized data, which is not covered by AutoMAGIC yet. To address this challenge, 1441 observations of off data are processed with AutoMAGIC, and the background shape is characterized depending on the azimuth, zenith distance, and reconstructed energy. Also, dependencies of the background rate on the zenith distance, the transmission of the atmosphere, the night-sky background (NSB), and the galactic latitude are investigated. A new method is developed that creates background models according to the newfound relations with the azimuth and zenith distance. These background models are compared with background models created from non-simultaneous off data with more conventional methods. Spectral and spatial analyses of Crab Nebula data are performed to validate the background methods.
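The quoted significances follow the standard prescription of Li & Ma (1983), Eq. 17, which compares on-source and off-source counts given the exposure ratio α. A direct transcription of the formula (the example counts below are illustrative, not values from the thesis):

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Detection significance following Li & Ma (1983), Eq. 17.
    alpha is the ratio of on-source to off-source exposure."""
    term_on = n_on * np.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1 + alpha) * n_off / (n_on + n_off))
    sign = np.sign(n_on - alpha * n_off)        # negative for a deficit
    return sign * np.sqrt(2.0) * np.sqrt(term_on + term_off)

# example: 130 on-source events over an expected background of 0.2 * 300 = 60
s = li_ma_significance(n_on=130, n_off=300, alpha=0.2)
```

For the two radio galaxy candidates above, the on-source counts barely exceed the background expectation, which is why the formula yields sub-sigma values and only flux upper limits can be reported.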
  • Item
    Observation of high-energy neutrinos from the Milky Way
    (2023) Hünnefeld, Mirco; Rhode, Wolfgang; Tjus, Julia
    With the discovery of the astrophysical neutrino flux by IceCube in 2013, the foundation for neutrino astronomy was established. In subsequent years, the first point-like neutrino source candidates, a flaring blazar known as TXS 0506+056 and the active galaxy NGC 1068, emerged. Now, in this work, a new milestone in the rising field of neutrino astronomy is presented: the first observation of high-energy neutrinos from our own Galaxy, the Milky Way. A search for Galactic neutrino emission is performed on 10 years of IceCube data, rejecting the background-only hypothesis at the 4.5σ significance level. The observed Galactic neutrino flux, believed to originate from diffuse interactions of cosmic rays, possibly in addition to contributions from unresolved point-like sources, may explain up to 10% of the astrophysical neutrino flux previously measured by IceCube. This observation is enabled by novel tools based on deep learning, developed in this dissertation. In comparison to prior IceCube analyses, the sensitivity is improved by a factor of up to four, due to improved event reconstructions and an effective area increased by over an order of magnitude. These tools not only lead to the most sensitive neutrino dataset to date in the Southern Sky, but also enable a wide variety of future applications and analyses that were previously unattainable.
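For reference, a quoted significance can be translated into a one-sided p-value via the Gaussian tail probability; a 4.5 σ rejection of the background-only hypothesis corresponds to a chance probability of roughly 3.4 × 10⁻⁶:

```python
from math import erfc, sqrt

def sigma_to_pvalue(z):
    """One-sided p-value corresponding to a z-sigma significance,
    using the Gaussian tail probability 1 - Phi(z)."""
    return 0.5 * erfc(z / sqrt(2.0))

p = sigma_to_pvalue(4.5)   # roughly 3.4e-6
```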
  • Item
    Unfolding the spectrum of the blazar Markarian 421 using data from the first Large-Sized Telescope of the Cherenkov Telescope Array
    (2023) Biederbeck, Noah; Rhode, Wolfgang; Albrecht, Johannes
    The nature of the acceleration processes of very-high-energy radiation in the universe is largely unsolved. To study these, Imaging Atmospheric Cherenkov Telescopes measure Cherenkov radiation produced by secondary particles in Extensive Air Showers. The Cherenkov Telescope Array (CTA) is the next generation of ground-based Cherenkov astronomy, to be built in the coming years partly on La Palma, Spain, and partly at the Paranal Observatory, Chile. The first prototype telescope of CTA, the LST-1, was inaugurated in 2018 on La Palma and has been in the commissioning phase since. In this thesis, I develop a new automated analysis pipeline for analyses of point-like gamma-ray sources observed with the LST-1, using the workflow manager snakemake. I implement regularized unfolding in the open-source high-level gamma-ray analysis software Gammapy. I use the analysis pipeline and the unfolding implementation to analyze data of the blazar Markarian 421, creating an unfolded energy spectrum. The unfolded energy spectrum calculated in this thesis is compared to earlier analyses. It extends the measured range of the multi-wavelength energy spectrum of Markarian 421 to the highest energies.
  • Item
    Hadronic accelerators in the universe
    (2023) Fattorini, Alicia; Rhode, Wolfgang; Westphal, Carsten
    The search for the origin of charged cosmic rays remains one of the greatest challenges in astrophysics. Extremely accelerated particles propagate through the universe carrying the secrets of the most energetic cosmic phenomena. While neutral particles are not deflected by magnetic fields and point back to their sources, charged cosmic rays arrive on Earth as a diffuse flux, making it nearly impossible to identify their origin. The MAGIC telescopes, primarily designed to detect high-energy gamma rays, also have the potential to study charged cosmic rays. This work presents the analysis chain to produce a proton spectrum from data measured with the MAGIC telescopes. The analysis chain includes data preparation, machine learning algorithms for particle reconstruction, and unfolding techniques that account for remaining background contributions. New simulations of air showers induced by charged cosmic rays are used in this analysis and tested accordingly. This work illustrates the potential of IACTs for research on charged cosmic rays and provides the first proton spectrum of MAGIC, which constitutes a valuable addition to previous measurements by other cosmic-ray experiments.
  • Item
    The Muon Puzzle in cosmic-ray induced air showers and its connection to the Large Hadron Collider
    (2022-03-09) Albrecht, Johannes; Cazon, Lorenzo; Dembinski, Hans; Fedynitch, Anatoli; Kampert, Karl-Heinz; Pierog, Tanguy; Rhode, Wolfgang; Soldin, Dennis; Spaan, Bernhard; Ulrich, Ralf; Unger, Michael
    High-energy cosmic rays are observed indirectly by detecting the extensive air showers initiated in Earth’s atmosphere. The interpretation of these observations relies on accurate models of air shower physics, which is a challenge and an opportunity to test QCD under extreme conditions. Air showers are hadronic cascades, which give rise to a muon component through hadron decays. The muon number is a key observable to infer the mass composition of cosmic rays. Air shower simulations with state-of-the-art QCD models show a significant muon deficit with respect to measurements; this is called the Muon Puzzle. By eliminating other possibilities, we conclude that the most plausible cause for the muon discrepancy is a deviation in the composition of secondary particles produced in high-energy hadronic interactions from current model predictions. The muon discrepancy starts at the TeV scale, which suggests that this deviation is observable at the Large Hadron Collider. An enhancement of strangeness production has been observed at the LHC in high-density events, which can potentially explain the puzzle, but the impact of the effect on forward produced hadrons needs further study, in particular with future data from oxygen beam collisions.
  • Item
    Another one cleans the dust
    (2022) Schmidt, Kevin; Rhode, Wolfgang; Bomans, Dominik J.
    Radio interferometers achieve the highest resolutions at the cost of sparse data coverage. Incompletely sampled sky distributions in Fourier space result in noise artifacts in the source reconstructions. Established cleaning software is often time-consuming and lacks reproducibility. In this work, I propose a novel cleaning strategy for radio interferometer data based on convolutional neural networks to adjust current analysis strategies to the new telescope standards. This deep learning-based approach will allow a straightforward application that generates reproducible results with short reconstruction times. The newly developed simulation chain enables the simulation of Gaussian radio galaxies and mimics observations by radio interferometers. By iterative adjustments, complexity is increased, ending up with a simulated data set comparable to MOJAVE archive data. In parallel, the deep learning framework radionets, capable of uncertainty estimates, is built to analyze large data samples with comparable characteristics. The improved reconstruction technique will allow scientists to focus more on their scientific analysis and omit a vast workload on data cleaning tasks. Various evaluation techniques are created to quantify the trained deep learning models' reconstruction quality. Furthermore, the reconstruction performance is assessed on input data with different noise levels by comparing the resulting predictions with the simulated source distributions. Source orientations and sizes are well reproduced, while the recovered intensities show substantial scatter, albeit not worse than existing methods without fine-tuning. Finally, all improvements are combined to train a deep learning model suitable to evaluate MOJAVE observations.
  • Item
    Multiwavelength analysis of the TeV-radio galaxy 3C 84/NGC 1275
    (2021) Linhoff, Lena Marie; Rhode, Wolfgang; Kröninger, Kevin
    The radio galaxy 3C 84 is a well-studied source of radio emission and was detected as NGC 1275 also in the MeV/TeV regime by gamma-ray detectors like MAGIC and Fermi-LAT. It is still unclear where and how the gamma-ray emission is produced. In this thesis, I will confine possible emission sites and exclude the region near the black hole as the origin of the gamma-ray production. For this aim, I investigate the optical depth of the broad-line region using data published by MAGIC and Fermi-LAT. Furthermore, a cross-correlation study is performed to find a possible correlation between the light curves of the two radio components in 3C 84 detected by the VLBA and the gamma-ray light curve measured by Fermi-LAT. A significant correlation between the core component and the gamma-ray emission is found, which is in line with the results I derive from analyzing the optical depth of the broad-line region. For the first time, I perform a long-term analysis of NGC 1275 for four years of MAGIC data, which reveals a short flare at the beginning of 2017 and a very low state of activity since then. To perform this long-term analysis, the software framework autoMAGIC was developed in the course of this thesis. autoMAGIC enables fully automatic and reproducible analyses of long-term data and can be used for the automatic processing of MAGIC data in the future.
  • Item
    Systematic uncertainties of high-energy muon propagation using the lepton propagator PROPOSAL
    (2021) Soedingrekso, Jan Benjamin; Rhode, Wolfgang; Spaan, Bernhard
    Muons are the dominant particle type measured in almost every underground experiment, mainly due to the high production rate of muons in cosmic-ray-induced air showers as well as the long muon range. Due to their stochastic propagation behavior, they can remain undetected with minimal energy losses in veto regions while producing a signal-like signature with a large stochastic energy loss inside a detector. Therefore, an accurate description by theoretical models, a precise treatment in simulations, and a validation of the cross sections with measurements are required. In this thesis, systematic uncertainties in simulations of high-energy muons were analyzed and improved in three parts. First, the theoretical models of the cross sections were revised and radiative corrections for the pair production interaction were calculated. In the next step, the Monte Carlo simulation library PROPOSAL was completely restructured in a modular design to include more accurate models and corrections. Due to its improved usability through the modular design and its accessibility as free open-source software, PROPOSAL is now used in many applications, from large simulation frameworks, such as the CORSIKA air shower simulation, to small simulation studies. The third part consisted of a feasibility study using PROPOSAL to measure the bremsstrahlung cross section from the energy loss distribution, which can be measured in cubic-kilometer-sized detectors. For a detector resolution similar to that of the IceCube neutrino telescope, the bremsstrahlung normalization was estimated with an uncertainty of 4%.
  • Item
    Bad moon rising?
    (2020) Buß, Jens Björn; Rhode, Wolfgang; Spaan, Bernhard
    This dissertation presents a study on the influence of the night sky background (NSB) on the performance of the First G-APD Cherenkov Telescope (FACT), the first imaging atmospheric Cherenkov telescope (IACT) with a silicon photomultiplier (SiPM) camera. Photomultiplier tubes (PMTs), the state of the art until now, can be easily and severely damaged by bright moonlight. SiPMs are a robust alternative photon detector for IACTs, allowing observation time to be maximized by extending observations to extreme NSB conditions, e.g., direct full moonlight. The performance has been determined on observations of the Crab Nebula from winter 2015/16 for all observed NSB levels. Dedicated Monte Carlo simulations have been tailored to the light conditions of the data sets by a new approach superimposing NSB measurements and simulated extensive air showers. The analysis chain features machine-learning and unfolding techniques to reconstruct the energy spectrum and has been optimized in the course of this study for various NSB levels. The dedicated Monte Carlo simulations are used to train machine-learning models and allow for an evaluation of their performance and their dependency on the NSB. In preparation for this analysis, the procedure to select optimum cleaning levels for the observed light conditions was improved, as introduced in this thesis. With these enhancements, the typical performance metrics for an IACT are evaluated. The Crab Nebula has been detected with significances of ≈5 σ/√h up to an NSB level twelve times brighter than the darkest nights. At even higher NSB levels, corresponding to direct moonlight at a 60% lunar phase, the source could still be detected with a significance of ≈3.4 σ/√h. Furthermore, it has been shown that the integral sensitivity of FACT degrades with the NSB level: the flux necessary for a 5 𝜎 detection in 50 h of effective observation time rises from 10% to 20% of the Crab flux.
The main effect of rising NSB has been identified as an increase of the energy threshold, which is also evident in a shifted low-energy edge of the effective collection area as well as in its general decline with the NSB. The Crab Nebula energy spectrum has been successfully reconstructed in an energy range of 450 GeV to 30 TeV for NSB levels up to these light conditions. In line with the rising energy threshold, the lower edge of the unfolded energy range had to be raised to 600 GeV in order to unfold the spectrum at the higher NSB levels. Beyond the above-mentioned findings, no significant indication of systematic effects on the unfolded spectra has been found. This work furthermore shows that, with these adjustments, the spectra are in good agreement with each other and with reference spectra from MAGIC and FACT. In summary, the performance values show promising results for observations with FACT at increased NSB levels. They underline the potential of SiPMs as an alternative for extending IACT observation times to bright light conditions.
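Quoting significances per square-root hour implies the usual S(t) = s · √t scaling, from which the observation time needed for a target significance follows directly. A back-of-the-envelope sketch using the per-√hour values quoted above (the derived times are an illustration of the scaling, not results from the thesis):

```python
def hours_for_detection(sig_per_sqrt_hour, target_sigma=5.0):
    """Observation time needed to reach a target significance,
    assuming the S(t) = s * sqrt(t) scaling implicit in quoting
    significances per square-root hour."""
    return (target_sigma / sig_per_sqrt_hour) ** 2

t_dark = hours_for_detection(5.0)    # 1.0 h at ~5 sigma/sqrt(h)
t_moon = hours_for_detection(3.4)    # ~2.16 h at ~3.4 sigma/sqrt(h)
```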
  • Item
    Monitoring the high energy universe
    (2020) Nöthe, Maximilian; Rhode, Wolfgang; Westphal, Carsten
    High-energy gamma-ray astronomy probes the most extreme phenomena in our universe: supernovae and their remnants as well as supermassive black holes at the centers of faraway galaxies. The First G-APD Cherenkov Telescope (FACT) is a small prototype Imaging Air Cherenkov Telescope (IACT) operating since 2011 at the Roque de los Muchachos, La Palma, Spain. It specializes in continuously monitoring the brightest known sources of gamma rays. In this thesis, I present a new, open analysis chain for the data recorded by FACT, with a major focus on ensuring reproducibility and on relying on modern, well-tested tools with widespread adoption. The integral sensitivity of FACT was improved by 45% compared to previous analyses through the introduction of an improved algorithm for the reconstruction of the origin of the gamma rays and many smaller improvements in the preprocessing. Sensitivity is evaluated both on simulated datasets and on observations of the Crab Nebula, the “standard candle” of gamma-ray astronomy. Another major advantage of this new analysis chain is the elimination of the dependence on a known point-source position in the event reconstruction, thus enabling the creation of skymaps, the analysis of observations where the source position is not exactly known, and the sharing of reconstructed events in the now standardized format for open gamma-ray astronomy. This has led to the first publication of a joint, multi-instrument analysis on open data of four currently operating Cherenkov telescopes. A smaller second part of this thesis is concerned with enabling robotic operation of FACT, which is now the first Cherenkov telescope that requires no operators during regular observations.
  • Item
    Unmasking the gamma-ray sky: comprehensive and reproducible analysis for Cherenkov telescopes
    (2019) Brügge, Kai; Rhode, Wolfgang; Kröninger, Kevin
    Imaging atmospheric Cherenkov telescopes (IACTs) observe the sky in the highest energy ranges. From the remnants of cataclysmic supernovae to jets powered by supermassive black holes in the centers of distant galaxies, IACTs can capture the light emerging from the most extreme sources in the universe. With the recent advent of multi-messenger astronomy, it has become critical for IACTs to publicly share their data and software. For the first time since the inception of IACT technology, in a combined effort of the H.E.S.S., MAGIC, VERITAS, and FACT collaborations, observations of the Crab Nebula were made available to the general public in a common data format. The first part of my thesis demonstrates the viability of the common data format by performing a spectral analysis of the Crab Nebula on the published datasets. The text gives detailed descriptions and mathematical formalizations of instrument response functions (IRFs) and of the statistical modeling used for typical spectral analyses, which is essential for understanding the measurement process of IACTs. The ultimate goal of this part of the thesis is to use Hamiltonian Markov chain Monte Carlo methods to test spectral models and unfold flux-point estimates for the Crab Nebula. The common data format paves the road for the operation of the upcoming Cherenkov Telescope Array (CTA). Once CTA has been constructed, it will be the largest and most sophisticated experiment in the field of ground-based gamma-ray astronomy. It will be operated as an open observatory, allowing anyone to access the recorded data. The second part of my thesis concentrates on reproducible analysis for CTA. Once operational, CTA will produce a substantial amount of data, creating new challenges for data storage and analysis technologies. In this part of the thesis, I use simulated CTA data to build a comprehensive analysis chain based on fully open-source methods.
The goal is to create a pipeline that rivals the physics performance of CTA’s closed-source reference implementation. Every step of the analysis, from raw-data processing to the calculation of sensitivity curves, is optimized with respect to complexity, reproducibility, and run-time.
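One of the IRFs central to such an analysis chain is the effective area, which is estimated from simulations as the selection efficiency times the area over which events were thrown. A schematic sketch with invented numbers, not the thesis implementation:

```python
import numpy as np

# Invented example: gamma rays simulated in three energy bins, thrown
# uniformly over a circular scatter region of radius 500 m.
a_thrown = np.pi * 500.0**2  # thrown (scatter) area in m^2

n_simulated = np.array([100_000, 100_000, 100_000])  # events thrown per bin
n_selected = np.array([1_200, 8_500, 21_000])        # events surviving all cuts

# Effective area per energy bin: selection efficiency times thrown area.
a_eff = n_selected / n_simulated * a_thrown
print(a_eff.round(1))  # in m^2, rising with energy as typical for IACTs
```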
  • Item
    Search for astrophysical tau neutrinos using 7.5 years of IceCube data
    (2019) Meier, Maximilian; Rhode, Wolfgang; Spaan, Bernhard
    Astrophysical tau neutrinos are the last unidentified standard model messenger in astroparticle physics. Their identification can open new windows on neutrino physics, improve knowledge about cosmic neutrino sources, and even test physics beyond the standard model. This work aims to constrain the tau neutrino component of the astrophysical neutrino flux observed by the IceCube Neutrino Observatory. Due to neutrino oscillations over cosmic baselines, a significant fraction of tau neutrinos is expected regardless of the exact neutrino production scenario at cosmic sources. The IceCube detector instruments a volume of 1 km³ to detect neutrinos interacting with the glacial ice at the South Pole at depths between 1450 m and 2450 m. This is achieved with 5160 digital optical modules (DOMs), each equipped with a photomultiplier tube detecting Cherenkov light produced by secondary particles from neutrino interactions. In this dissertation, a new tau neutrino identification method is developed using state-of-the-art machine learning techniques, increasing the expected tau neutrino event rate by a factor of 2.5 over previous work. Tau neutrinos are identified by the so-called double pulse signature, where two charge depositions can be observed in the waveform recorded by a single IceCube DOM: the first from the hadronic cascade induced by the neutrino interaction, the second from a non-muonic decay of the produced tau lepton. This signature can be resolved by IceCube at energies above roughly 100 TeV. IceCube data recorded from 2011 to 2018 are analyzed and two tau neutrino candidates are observed. The astrophysical tau neutrino flux normalization is measured with a binned Poisson likelihood fit, yielding a flux of 0.44 (+0.78 / −0.31) × 10⁻¹⁸ GeV⁻¹ cm⁻² s⁻¹ sr⁻¹ at 100 TeV for an astrophysical spectral index of 𝛾 = 2.19. The measured flux is incompatible with the absence of an astrophysical tau neutrino flux at a significance of 1.9 𝜎.
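The binned Poisson likelihood fit of a power-law flux normalization described above can be sketched schematically. The spectral index of 2.19 and the 100 TeV pivot energy are taken from the abstract; the energy bins, exposures, and toy data are invented for illustration and do not reproduce the IceCube analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

e = np.array([100.0, 200.0, 400.0])   # bin-center energies in TeV
exposure = np.array([5e3, 3e3, 1e3])  # effective exposure per bin (arbitrary units)
gamma = 2.19                          # spectral index, fixed as in the text

def expected_counts(phi0):
    # Power law normalized at 100 TeV: phi(E) = phi0 * (E / 100 TeV)^-gamma,
    # converted to expected counts per bin by the exposure.
    return phi0 * (e / 100.0) ** -gamma * exposure

# Toy observation drawn from a true normalization of 0.44.
observed = rng.poisson(expected_counts(0.44))

# Scan the normalization and maximize the binned Poisson log-likelihood
# (constant log(n!) terms omitted, as they do not depend on phi0).
phi_grid = np.linspace(0.01, 2.0, 2000)
log_like = np.array([
    np.sum(observed * np.log(expected_counts(p)) - expected_counts(p))
    for p in phi_grid
])
phi_best = phi_grid[np.argmax(log_like)]
print(round(phi_best, 2))
```

The real measurement additionally profiles over nuisance parameters such as atmospheric backgrounds; this sketch only shows the core likelihood scan.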