Cosmic Expansion History from SNe Ia Data via Information Field Theory: the charm Code


A&A 599, A92 (2017)
DOI: 10.1051/0004-6361/201629527
© ESO 2017
Astronomy & Astrophysics

Cosmic expansion history from SNe Ia data via information field theory: the charm code

Natàlia Porqueres¹, Torsten A. Enßlin², Maksim Greiner², Vanessa Böhm², Sebastian Dorn², Pilar Ruiz-Lapuente⁴,³, and Alberto Manrique¹,³

¹ University of Barcelona, Departament de Física Quàntica i Astrofísica, Martí i Franquès 1, 08028 Barcelona, Spain. e-mail: [email protected]
² Max-Planck-Institut für Astrophysik (MPA), Karl-Schwarzschild-Strasse 1, 85741 Garching, Germany
³ Institut de Ciències del Cosmos, Martí i Franquès 1, 08028 Barcelona, Spain
⁴ Instituto de Física Fundamental, CSIC, Serrano 121, 28006 Madrid, Spain

Received 13 August 2016 / Accepted 3 December 2016

ABSTRACT

We present charm (cosmic history agnostic reconstruction method), a novel inference algorithm that reconstructs the cosmic expansion history, as encoded in the Hubble parameter H(z), from SNe Ia data. The novelty of the approach lies in the usage of information field theory, a statistical field theory that is very well suited for the construction of optimal signal recovery algorithms. The charm algorithm infers non-parametrically s(a) = ln(ρ(a)/ρ_crit0), the density evolution that determines H(z), without assuming an analytical form of ρ(a), only its smoothness with the scale factor a = (1 + z)⁻¹. The inference problem of recovering the signal s(a) from the data is formulated in a fully Bayesian way. In detail, we rewrite the signal as the sum of a background cosmology and a perturbation, which allows us to determine the maximum a posteriori estimate of the signal by an iterative Wiener filter method.
Applying charm to the Union2.1 supernova compilation, we have recovered a cosmic expansion history that is fully compatible with the standard ΛCDM cosmological expansion history, with parameter values consistent with the results of the Planck mission.

Key words. methods: statistical – methods: data analysis – supernovae: general – distance scale

1. Introduction

Combined observations of nearby and distant type Ia Supernovae (SNe Ia) have demonstrated that the expansion of the Universe is accelerating in the current epoch (Perlmutter et al. 1999; Riess et al. 1998). Such a Universe can be described by the cold dark matter (ΛCDM) model, in which the cosmic acceleration is determined by Einstein's cosmological constant with a time-independent equation of state, w ≡ p/ρ = −1. However, this is just one of the possible explanations of the expansion that is consistent with the SNe Ia measurements. Others include a new field component filling the Universe, such as a quintessence, or modified gravity (Koyama 2016).

Constraining the cosmic expansion as a function of redshift is a task of major interest, since the evolution of the scale factor allows us to probe properties of the fundamental components of the Universe. This may lead to a better understanding of their nature, as well as possibly providing evidence for new fundamental physics.

Recent studies of the baryonic acoustic oscillations (BAO) have suggested different constraints on the density of dark energy at high redshifts (Delubac et al. 2015; Hee et al. 2016). Such a change in the evolution of the dark energy density in the early epoch could be determined from SNe Ia data at high redshifts (z > 1), which will be available shortly (Rubin et al. 2016). In addition, some years from now a sample of 10⁵ SNe Ia is expected to be available from the LSST (LSST Science Collaboration et al. 2009). This upcoming data will open an entirely new chapter in the study of dark energy.

The aim of this work is to reconstruct the cosmic expansion history, encoded in the Hubble parameter H(z), from supernova data in the framework of information field theory (IFT; Enßlin et al. 2009). Conceptually, IFT is a statistical field theory that permits the construction of optimal signal recovery algorithms. To this end, we developed the charm code (charm stands for cosmic history agnostic reconstruction method), which is freely available at https://gitlab.mpcdf.mpg.de/natalia/charm. We use the Union2.1 supernova compilation, a database that contains 580 SNe Ia in the redshift range 0.015 < z < 1.414.

Deriving the cosmic expansion history is a major goal of modern cosmology. To date, the low-redshift evolution of the Hubble parameter H(z) has been studied with different methods. Some recent studies present analyses of the cosmic expansion by χ² minimization (Bernal et al. 2017; Melia & McClintock 2015), while others develop non-parametric methods to solve the inverse problem of reconstructing the Hubble parameter H(z) (Li et al. 2016; Montiel et al. 2014; Ishida & de Souza 2011; Shafieloo et al. 2006) or the equation of state of dark energy (España-Bonet & Ruiz-Lapuente 2005, 2008; Simon et al. 2005; Genovese et al. 2009). Common to all non-parametric reconstructions, the ones cited above and the one we develop here, is that a quantity to be reconstructed (Hubble parameter, cosmic density, equation of state, etc.) as well as a regularization for the otherwise ill-posed inference problem must be chosen. The discussed methods differ in which regularization is chosen.

Here, we develop a non-parametric reconstruction in natural coordinates for the logarithm of the cosmic density, s = ln(ρ/ρ_crit0), as a function of the logarithm of the cosmic scale factor, x = −ln a, and thereby the Hubble parameter H(z). The regularization arises from a Bayesian prior on potential solutions s(x). We construct this prior from the insight that constituents of the cosmic density are likely to scale with the inverse scale factor to some power, typically (but not exclusively) between zero (cosmological constant) and four (radiation). Translated to the log-log coordinates we advocate to be natural, this means that straight lines in s(x) are preferred over curved ones. We can also motivate the level of expected curvature: a transition from radiation to dark energy domination within a few e-folds of expansion has to be possible if our standard cosmological expansion history is to be embraced by the prior.

An advantage of the adopted Bayesian methodology lies in the fact that it provides a flexible framework to question data: it can reconstruct the cosmic expansion history using different priors. For example, it can be asked how much the data requests a modification of a given cosmology, or what the preferred expansion history is from a cosmological-composition-agnostic point of view. The main assumption is a smooth behavior of the logarithm of the density, ln ρ, with the logarithmic scale factor, ln a, whereas the strength of this assumption can also be varied.

We show that charm is sensitive to features in the expansion history at any low redshift, z < 1.5. In addition, the algorithm is easily extendible to include other datasets, such as BAO or Cepheids (Riess et al. 2016), which provide information on the transition epoch between deceleration and acceleration of the cosmic expansion (Moresco et al. 2016; Hee et al. 2016). We develop and test charm so that it is ready for application to the new Union3 catalog compilation, which is expected to provide information about the dark energy density at high redshifts.

After this introduction, we establish our notation and present our assumptions and the inference problem in Sect. 2. In Sect. 3, the SN Ia catalog is described, and we derive our reconstruction method in Sect. 4. In Sect. 5, we specify our prior knowledge and the cosmological expansion histories that we use to test charm. We present a comparison of charm with the previous literature in Sect. 6. Finally, we present the results of the reconstruction in Sect. …

… background cosmology, denoted as t_bg(x):

  P(s) = G(s − t_bg, S) = |2πS|^(−1/2) exp( −(1/2) (s − t_bg)† S⁻¹ (s − t_bg) ),   (1)

where S is the prior covariance matrix, S = ⟨φφ†⟩_(s|S), with φ = s − t_bg. Scalar products of continuous quantities are defined as a†b ≡ ∫₀^∞ dx a(x) b(x).

The diagonal of the prior covariance, S_xx = ⟨φ_x²⟩_(s|S), encodes how much variation of the signal around the a priori background cosmology t_bg is expected a priori at every location x. The off-diagonal terms of the covariance, S_xy = ⟨φ_x φ_y⟩_(s|S), specify how correlated such deviations from the background cosmology are expected to be between the points x and y. A larger correlation corresponds to smoother structures of the deviations. In Sect. 5, we will use simple and intuitive arguments about the expected roughness of s to specify S, as well as different choices of the background cosmology t_bg. In particular, as no location of cosmic history is singled out a priori on a logarithmic scale, the prior covariance structure should be homogeneous, S_xy = C_s(|x − y|), with C_s(r) a correlation function that only depends on the distance r = |x − y|.

2.2. Signal inference

In the inference problem, we are interested in the probability of the signal given the data. This is described by the posterior P(s|d), given by Bayes' theorem,

  P(s|d) = P(d|s) P(s) / P(d),   (2)

which is the product of the likelihood P(d|s) and the signal prior P(s), normalized by the evidence P(d).

In the framework of IFT, inference problems are formulated in the language of statistical field theory. To that end, we rewrite the posterior P(s|d) as

  P(s|d) = P(d|s) P(s) / P(d) = (1/Z) e^(−H(d,s)),   (3)
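The iterative Wiener filter the abstract refers to reduces, for a linearized data model, to the textbook Gaussian MAP estimate m = t_bg + (S⁻¹ + Rᵀ N⁻¹ R)⁻¹ Rᵀ N⁻¹ (d − R t_bg). A minimal numpy sketch of one such filter step (grid, kernel, response and noise level are all invented for illustration; this is not the authors' charm code):

```python
import numpy as np

# Illustrative sizes and kernels only: a sketch of a single Wiener filter
# step, not the authors' charm implementation.
rng = np.random.default_rng(42)
npix, ndata = 50, 20
x = np.linspace(0.0, 1.0, npix)           # x = -ln(a), the signal coordinate

# Homogeneous prior covariance S_xy = C_s(|x - y|) (assumed exponential kernel)
S = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)

t_bg = 1.0 + 2.0 * x                       # stand-in background cosmology t_bg(x)
R = rng.normal(size=(ndata, npix)) / npix  # stand-in linearized response
N = 0.05**2 * np.eye(ndata)                # Gaussian noise covariance

# Synthetic data d = R s + n, with s drawn from the prior G(s - t_bg, S)
s_true = t_bg + np.linalg.cholesky(S) @ rng.normal(size=npix)
d = R @ s_true + 0.05 * rng.normal(size=ndata)

# Wiener filter / MAP estimate: m = t_bg + D j,
# with D = (S^-1 + R^T N^-1 R)^-1 and j = R^T N^-1 (d - R t_bg)
Sinv = np.linalg.inv(S)
Ninv = np.linalg.inv(N)
j = R.T @ Ninv @ (d - R @ t_bg)
m = t_bg + np.linalg.solve(Sinv + R.T @ Ninv @ R, j)

# Algebraically equivalent data-space form (Woodbury identity), as a self-check
m2 = t_bg + S @ R.T @ np.linalg.solve(R @ S @ R.T + N, d - R @ t_bg)
```

Because the distance-modulus response of the actual SN Ia problem is non-linear in s, the paper iterates such a step, re-linearizing around the current estimate; comparing the two algebraically equivalent forms above is a useful numerical sanity check.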
Recommended publications
  • Turbulence & Information Field Dynamics
    Turbulence & Information field dynamics. Torsten Enßlin, MPI für Astrophysik. Computer simulations of fields are essential in astrophysics and elsewhere: hydrodynamics, MHD, cosmic structure formation, plasma processes, cosmic ray transport, … But computer simulations need to discretize space, and this erases information on small-scale structures and processes. How can knowledge of sub-grid processes be used to improve simulation accuracy? The idea: use Information Field Theory (IFT) to bridge between the discretized field (data in computer memory) and continuous field configurations (possible physical field realizations). Recipe:
    1. Field dynamics: specify the field dynamics equations.
    2. Prior knowledge: specify the prior (ignorance) knowledge for absent data.
    3. Data constraints: establish the relation between the data and the ensemble of field configurations consistent with the data and background knowledge. Assimilation of external measurement data into the simulation scheme is naturally done during this step.
    4. Field evolution: describe the evolution of the field ensemble over a short time interval.
    5. Prior update: specify the background knowledge for the later time.
    6. Data update: invoke again the relation of data and field ensemble to construct the data of the later time, using entropic matching based on the Maximum Entropy Principle (MEP).
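The six-step recipe above can be caricatured for a 1D periodic field. In this sketch the prior kernel, block-average response, and advection dynamics are all invented; for a Gaussian, noise-free setting the entropic-matching step reduces to projecting the reconstructed mean field back onto the data space:

```python
import numpy as np

# Toy version of the IFD recipe for a periodic 1D field; all sizes, kernels
# and dynamics are invented for illustration.
nf, nc = 64, 8                      # fine "sub-grid" resolution, coarse data size
idx = np.arange(nf)

# Step 2 - prior: stationary Gaussian with exponential correlations on the circle
adiff = np.abs(idx[:, None] - idx[None, :])
S = np.exp(-np.minimum(adiff, nf - adiff) / 8.0)

# Step 3 - data constraint: the data are block averages of the continuous field
R = np.zeros((nc, nf))
for i in range(nc):
    R[i, i * (nf // nc):(i + 1) * (nf // nc)] = nc / nf

# Step 1 - field dynamics: periodic advection by one fine cell per time step
def evolve(u):
    return np.roll(u, 1)

d = R @ np.sin(2 * np.pi * idx / nf)    # initial data "in computer memory"

for _ in range(4):                      # simulation loop (steps 4-6)
    # ensemble mean consistent with the data (noise-free Wiener reconstruction)
    m = S @ R.T @ np.linalg.solve(R @ S @ R.T, d)
    # evolve the reconstructed field, then match it back onto the data space
    d = R @ evolve(m)
```

The reconstruction interpolates the data exactly (R m = d), so sub-grid structure supplied by the prior survives each evolve-and-match cycle instead of being averaged away.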
  • Radio Interferometry with Information Field Theory
    Radio Interferometry with Information Field Theory. Philipp Adam Arras. Dissertation at the Faculty of Physics of the Ludwig-Maximilians-Universität München, submitted by Philipp Adam Arras from Darmstadt. Munich, 14 January 2021. First referee: PD Torsten A. Enßlin. Second referee: Prof. Jochen Weller. Date of the oral examination: 18 March 2021. Abstract: The observational study of the universe and its galaxy clusters, galaxies, stars, and planets relies on multiple pillars. Modern astronomy observes electromagnetic signals and, just recently, also gravitational waves and neutrinos. With the help of radio astronomy, i.e. the study of a specific fraction of the electromagnetic spectrum, the cosmic microwave background, atomic and molecular emission lines, synchrotron radiation in hot plasmas, and much more can be measured. From these observations a variety of scientific conclusions can be drawn, ranging from cosmological insights to the dynamics within galaxies or properties of exoplanets. The data reduction task is the step from the raw data to a science-ready data product, and it is particularly challenging in astronomy. Because of the impossibility of independent measurements or repeated lab experiments, the ground truth, which is essential for machine learning and many other statistical approaches, is never known in astronomy. Therefore, the validity of the statistical treatment is of utmost importance. In radio interferometry, the traditionally employed data reduction algorithm CLEAN is especially problematic. Its weaknesses include that the resulting images are not guaranteed to be positive (a crucial physical condition for fluxes and brightnesses), that it is not able to quantify uncertainties, and that it does not ensure consistency with the measured data.
  • A Bayesian Model for Bivariate Causal Inference
    entropy Article: A Bayesian Model for Bivariate Causal Inference. Maximilian Kurthen and Torsten Enßlin, Max-Planck-Institut für Astrophysik, Karl-Schwarzschildstr. 1, 85748 Garching, Germany. Correspondence: [email protected]. Received: 30 October 2019; Accepted: 24 December 2019; Published: 29 December 2019. Abstract: We address the problem of two-variable causal inference without intervention. This task is to infer an existing causal relation between two random variables, i.e., X → Y or Y → X, from purely observational data. As the option to modify a potential cause is not given in many situations, only structural properties of the data can be used to solve this ill-posed problem. We briefly review a number of state-of-the-art methods for this, including very recent ones. A novel inference method is introduced, Bayesian Causal Inference (BCI), which assumes a generative Bayesian hierarchical model to pursue the strategy of Bayesian model selection. In the adopted model, the distribution of the cause variable is given by a Poisson lognormal distribution, which allows us to explicitly account for the discrete nature of datasets, correlations in the parameter spaces, and the variance of probability densities on logarithmic scales. We assume Fourier-diagonal field covariance operators. The model itself is restricted to use cases where a direct causal relation X → Y has to be decided against a relation Y → X; therefore, we compare it to other methods for this exact problem setting. The assumed generative model provides synthetic causal data for benchmarking our model in comparison to existing state-of-the-art models, namely LiNGAM, ANM-HSIC, ANM-MML, IGCI, and CGNN.
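The additive-noise-model family the abstract benchmarks against (e.g. ANM-HSIC) can be illustrated with a deliberately crude stand-in: polynomial regression in both directions, with a plain correlation of squared residuals replacing a proper HSIC independence test. The data-generating model and all parameters here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.uniform(-1.0, 1.0, n)
y = x**2 + 0.05 * rng.normal(size=n)   # true direction: X -> Y, additive noise

def dependence_score(cause, effect, deg=5):
    """Regress effect on cause; return |corr| of squared residuals with the
    squared cause. For the true causal direction the residuals are (nearly)
    independent noise, so the score is small; for the wrong direction the
    residuals retain structure and the score stays large."""
    coef = np.polyfit(cause, effect, deg)
    resid = effect - np.polyval(coef, cause)
    return abs(np.corrcoef(resid**2, cause**2)[0, 1])

forward = dependence_score(x, y)   # score for hypothesis X -> Y
reverse = dependence_score(y, x)   # score for hypothesis Y -> X
```

Picking the direction with the smaller score recovers X → Y here; real methods replace the crude correlation with a kernel independence measure (HSIC) or, as in the paper above, with a Bayesian model-selection criterion.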
  • Cosmic Magnetism with Information Field Theory
    Cosmic Magnetism with Information Field Theory. Torsten Enßlin, Mona Frommert, Henrik Junklewitz, Niels Oppermann, Georg Robbers, Petr Kuchar, Max-Planck-Institut für Astrophysik. Magnetic field observables (data: NVSS, Taylor et al. 2009; WMAP, WMAP team; maps: Oppermann et al., arXiv:1008.1243; Page et al.). Magnetic helicity: probing helicity by LITMUS. The Faraday rotation sky (Taylor, Stil & Sunstrum 2009): 37,534 extragalactic RM sources in the northern sky. Why information field theory? An inverse problem calls for information theory; a spatially distributed quantity calls for field theory. Information fields of interest: the magnetic field in 3D, the Faraday rotation map, the polarized emissivity per Faraday depth, the cosmic matter distribution, magnetic power spectra, … Information sources: radio polarimetry (RM, total I, PI), galaxy surveys, X-rays, … What is information theory? What is information? Knowledge about the set of possibilities Ω and their individual probabilities
  • Book of Abstracts, 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering
    Book of Abstracts, 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. Wolfgang von der Linden, Sascha Ranftl and MaxEnt Contributors, May 31, 2021. Welcome Address: Dear long-time companions and dear newcomers to the realm of Bayesian inference, "the logic of science" as E.T. Jaynes put it. Welcome to the 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering! It seems like yesterday that the series was initiated by Myron Tribus and Edwin T. Jaynes and organised for the first time by C. Ray Smith and Walter T. Grandy in 1981 at the University of Wyoming. Since then, tremendous progress has been made, and the number of publications based on Bayesian methods is impressive. Despite the Corona pandemic, the 40th anniversary can finally happen, albeit a year late and only digitally. Nevertheless, we will have ample opportunity for reviews of the successes of the last four decades and for extensive discussions of the latest developments in Bayesian inference. In agreement with the general framework of the annual workshop, and due to the broad applicability of Bayesian inference, the presentations will cover many research areas, such as physics (plasma physics, astrophysics, statistical mechanics, foundations of quantum mechanics), geodesy, biology, medicine, econometrics, hydrology, measure theory, image reconstruction, communication theory, computational engineering, machine learning, and, quite timely, epidemiology. We hope you will enjoy the show. Wolfgang von der Linden and Sascha Ranftl (Graz University of Technology, 2021). Contents: Welcome Address; Special guests: 40 Years of MaxEnt: Reminiscences (John Skilling) …
  • Information Field Theory
    Information field theory: turning data into images. Torsten Enßlin, Max-Planck-Institut für Astrophysik, with Michael Bell, Mona Frommert, Maksim Greiner, Jens Jasche, Henrik Junklewitz, Francisco Kitaura, Niels Oppermann, Georg Robbers, Marco Selig, Cornelius Weig. Signal fields have infinitely many degrees of freedom, while data sets are finite, so additional information is needed: physical laws, symmetries, continuity, statistical homogeneity/isotropy, … Since space is continuous, this leads to information field theory. Information Hamiltonian: with s = signal and d = data, posterior = likelihood × prior / evidence, and the inference problem becomes a statistical field theory. An information field theory is defined over continuous space, so that the pixelization is irrelevant if sufficiently fine. Free theory (Gaussian signal and noise, linear response): the Wiener filter, known for 60 years; the reconstruction applies an information propagator to an information source (demonstration: mock signal from a known power spectrum, data = integrated signal, reconstruction). Interacting theory: non-Gaussian signal, noise, or non-linear response. IFT dictionary: translation of the inference problem
  • Information Field Theory for Cosmological Perturbation Reconstruction and Non-Linear Signal Analysis (arXiv:0806.3474v1 [astro-ph], 20 Jun 2008)
    Information field theory for cosmological perturbation reconstruction and non-linear signal analysis Torsten A. Enßlin, Mona Frommert, and Francisco S. Kitaura Max-Planck-Institut f¨ur Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching, Germany (Dated: June 20, 2008) We develop information field theory (IFT) as a means of Bayesian, data based inference on spatially distributed signals, the information fields. A didactic approach is attempted in order to enable scientists not familiar with field theories to implement and apply inference algorithms derived within the IFT framework. Starting from general considerations on the nature of measurements, signals, noise, and their relation to a physical reality, we derive the information Hamiltonian, the source field, propagator, and interaction terms. Free IFT reproduces the well known Wiener-filter theory. Interacting IFT can be diagrammatically expanded, for which we provide the Feynman rules in position-, Fourier-, and spherical harmonics space. The theory should be applicable in many fields. However, here, two cosmological signal recovery problems are discussed in detail in their IFT-formulation. 1) Reconstruction of the cosmic large-scale structure matter distribution from discrete galaxy counts in incomplete galaxy surveys. It is demonstrated analytically and numerically that a Gaussian signal, which should resemble the initial density perturbations of the Universe, observed with a strongly non-linear, incomplete and Poissonian-noise affected response, as the processes of structure and galaxy formation and observations provide, can be reconstructed thanks to the virtue of a response-renormalisation flow equation. Surprisingly, solving this equation numerically is much less expensive than solving the corresponding classical field equation, which means to calculate the maximum a posteriori estimator, despite the former’s higher fidelity.
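The Poissonian galaxy-count setting of problem 1) can be caricatured in a few lines: a Gaussian log-density with an assumed Fourier-diagonal power spectrum, Poisson counts as data, and plain gradient descent on the information Hamiltonian as a much simpler stand-in for the paper's response-renormalisation flow equation. The power spectrum, count rate, and step size are all invented:

```python
import numpy as np

# Toy MAP reconstruction of a log-density field from Poisson counts; this is
# an illustrative stand-in, not the paper's renormalisation flow.
rng = np.random.default_rng(0)
n = 64
k = np.fft.fftfreq(n) * n
P = 10.0 / (1.0 + k**2)                  # assumed signal power spectrum

# Draw a Gaussian signal with this spectrum, and Poissonian "galaxy counts"
s_true = np.fft.ifft(np.fft.fft(rng.normal(size=n)) * np.sqrt(P)).real
kappa = 5.0                              # mean counts per cell at s = 0
d = rng.poisson(kappa * np.exp(s_true))

def Sinv(s):
    # apply the inverse prior covariance (diagonal in Fourier space)
    return np.fft.ifft(np.fft.fft(s) / P).real

def H(s):
    # information Hamiltonian: Gaussian prior + Poisson likelihood (up to const.)
    return 0.5 * s @ Sinv(s) + np.sum(kappa * np.exp(s) - d * s)

s = np.zeros(n)                          # gradient descent on H
for _ in range(2000):
    s -= 1e-3 * (Sinv(s) + kappa * np.exp(s) - d)
```

Since H is convex here (Gaussian prior plus Poisson log-likelihood in s), the descent converges towards the maximum a posteriori estimate; the paper's point is that its renormalisation flow reaches a comparable estimate far more cheaply.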
  • Information Field Theory
    Information field theory. Torsten Enßlin, Max Planck Institute for Astrophysics, Karl-Schwarzschildstr. 1, 85741 Garching bei München, Germany. http://www.mpa-garching.mpg.de/ift Abstract. Non-linear image reconstruction and signal analysis deal with complex inverse problems. To tackle such problems in a systematic way, I present information field theory (IFT) as a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms even for non-linear and non-Gaussian signal inference problems. IFT algorithms exploit spatial correlations of the signal fields and benefit from techniques developed to investigate quantum and statistical field theories, such as Feynman diagrams, re-normalisation calculations, and thermodynamic potentials. The theory can be used in many areas, and applications in cosmology and numerics are presented. Keywords: INFORMATION THEORY, FIELD THEORY, IMAGE RECONSTRUCTION. PACS: 87.19.lo, 11.10.-z, 42.30.Wb. INFORMATION FIELD THEORY. Field inference: A physical field is a function over some continuous space. The air temperature over Europe, the magnetic field within the Milky Way, or the dark matter density in the Universe are all fields we might want to know as accurately as possible. Fortunately, we have measurement devices delivering us data on these fields. But the data is always finite in size, whereas any field has an infinite number of degrees of freedom: the field values at all locations of the continuous space the field is living in. Since it is impossible to determine an infinite number of unknowns from a finite number of constraints, an exact field reconstruction from the data alone is impossible.
  • Information Field Dynamics for Simulation Scheme Construction
    Information field dynamics for simulation scheme construction. Torsten A. Enßlin, Max Planck Institute for Astrophysics, Karl-Schwarzschildstr. 1, 85741 Garching, Germany. Information field dynamics (IFD) is introduced here as a framework to derive numerical schemes for the simulation of physical and other fields without assuming a particular sub-grid structure as many schemes do. IFD constructs an ensemble of non-parametric sub-grid field configurations from the combination of the data in computer memory, representing constraints on possible field configurations, and prior assumptions on the sub-grid field statistics. Each of these field configurations can formally be evolved to a later moment, since any differential operator of the dynamics can act on fields living in continuous space. However, these virtually evolved fields need again a representation by data in computer memory. The maximum entropy principle of information theory guides the construction of updated datasets via entropic matching, optimally representing these field configurations at the later time. The field dynamics thereby become represented by a finite set of evolution equations for the data that can be solved numerically. The sub-grid dynamics is thereby treated within auxiliary analytic considerations. The resulting scheme acts solely on the data space. It should provide a more accurate description of the physical field dynamics than simulation schemes constructed ad hoc, due to the more rigorous accounting of sub-grid physics and the space discretization process. Assimilation of measurement data into an IFD simulation is conceptually straightforward, since measurement and simulation data can just be merged. The IFD approach is illustrated using the example of a coarsely discretized representation of a thermally excited classical Klein-Gordon field.
  • Radio Imaging with Information Field Theory
    Radio Imaging with Information Field Theory. Philipp Arras*†, Jakob Knollmüller*, Henrik Junklewitz and Torsten A. Enßlin*. *Max-Planck Institute for Astrophysics, Garching, Germany; †Technical University of Munich, Munich, Germany. Abstract: Data from radio interferometers provide a substantial challenge for statisticians. It is incomplete, noise-dominated, and originates from a non-trivial measurement process. The signal is not only corrupted by imperfect measurement devices but also by effects like fluctuations in the ionosphere that act as a distortion screen. In this paper we focus on the imaging part of data reduction in radio astronomy and present RESOLVE, a Bayesian imaging algorithm for radio interferometry, in its new incarnation. It is formulated in the language of information field theory. Solely by algorithmic advances the inference could be sped up significantly and now behaves noticeably more stably. This is one more step towards a fully user-friendly version of RESOLVE which can be applied routinely by astronomers. I. Introduction: To explore the origins of our universe and to learn about physical laws on both small and large scales, telescopes of various kinds provide information. … gives a quick introduction to information field theory, followed by section IV, in which the Bayesian hierarchical model used by RESOLVE is explained. We conclude with an application to real data in section V. II. Measurement process and data in radio astronomy: Radio telescopes measure the electromagnetic sky at wavelengths from λ = 0.3 mm (lower limit of ALMA) to 30 m (upper limit of LOFAR). This poses a serious problem: the angular resolution δθ of a single-dish telescope scales with the wavelength λ divided by the instrument aperture D, δθ = 1.22 λ/D. As an example, consider λ = 0.6 cm and δθ = 0.1 arcsec
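Plugging the quoted example numbers into δθ = 1.22 λ/D makes the point concrete: reaching 0.1 arcsec at λ = 0.6 cm would require a single dish roughly 15 km across, which is why interferometry is needed. A quick check:

```python
import numpy as np

# Example numbers quoted in the text: lambda = 0.6 cm, delta_theta = 0.1 arcsec
lam = 0.006                              # wavelength in metres
dtheta = 0.1 * np.pi / (180 * 3600)      # 0.1 arcsec converted to radians
D = 1.22 * lam / dtheta                  # required single-dish aperture, metres
print(f"required aperture: {D/1000:.1f} km")   # about 15 km
```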
  • NIFTY – Numerical Information Field Theory⋆
    A&A 554, A26 (2013), DOI: 10.1051/0004-6361/201321236, © ESO 2013. NIFTy – Numerical Information Field Theory: a versatile Python library for signal inference. M. Selig, M. R. Bell, H. Junklewitz, N. Oppermann, M. Reinecke, M. Greiner, C. Pachajoa, and T. A. Enßlin (Max Planck Institute for Astrophysics, Karl-Schwarzschild-Straße 1, 85748 Garching, Germany; Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, 80539 München, Germany; Technische Universität München, Arcisstraße 21, 80333 München, Germany). e-mail: [email protected]. Received 5 February 2013 / Accepted 12 April 2013. Abstract: NIFTy (Numerical Information Field Theory) is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically, without concerning the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTy permits its user to rapidly prototype algorithms in 1D and then apply the developed code in higher-dimensional settings of real-world problems. The set of spaces on which NIFTy operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined.
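The "correct normalization of operations on fields" that NIFTy is said to automate amounts to carrying a volume weight with every discretized operation, so results converge to continuum values independently of resolution. A hand-rolled sketch of the idea (not NIFTy's actual API):

```python
import numpy as np

def field_dot(a, b, total_volume):
    """Discretized scalar product a . b = integral dx a(x) b(x):
    weight the pixel sum by the cell volume dx."""
    dx = total_volume / len(a)
    return np.sum(a * b) * dx

L = 2 * np.pi
results = []
for n in (64, 256, 4096):                # three resolutions of the same interval
    x = np.arange(n) * L / n
    f = np.sin(x)
    results.append(field_dot(f, f, L))   # integral of sin^2 over [0, 2*pi] = pi
```

Because the weight dx travels with the operation, refining the grid changes the answer only at the discretization-error level; a bare pixel sum, by contrast, would grow with n.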
  • Reconstructing Non-Repeating Radio Pulses with Information Field Theory
    Prepared for submission to JCAP. Reconstructing non-repeating radio pulses with Information Field Theory. C. Welling (a,b), P. Frank (c,d), T. Enßlin (c,d), A. Nelles (a,b). (a) DESY, Platanenallee 6, Zeuthen, Germany; (b) Erlangen Center for Astroparticle Physics, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erwin-Rommel-Str. 1, 91058 Erlangen, Germany; (c) Max Planck Institute for Astrophysics, Karl-Schwarzschildstraße 1, 85748 Garching, Germany; (d) Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, 80539 München, Germany. E-mail: [email protected]. Abstract: Particle showers in dielectric media produce radio signals which are used for the detection of both ultra-high-energy cosmic rays and neutrinos with energies above a few PeV. The amplitude, polarization, and spectrum of these short, broadband radio pulses allow us to draw conclusions about the primary particles that caused them, as well as the mechanics of shower development and radio emission. However, confidently reconstructing the radio signals can pose a challenge, as they are often obscured by background noise. Information Field Theory offers a robust approach to this challenge by using Bayesian inference to calculate the most likely radio signal, given the recorded data. In this paper, we describe the application of Information Field Theory to radio signals from particle showers in both air and ice and demonstrate how accurately pulse parameters can be obtained from noisy data. (arXiv:2102.00258v2 [astro-ph.HE], 17 Mar 2021.) Contents: 1 Introduction; 1.1 Detecting radio signals