Unlike many other branches of science, cosmology does not have the luxury of controlled experiments. We have one Universe, which we can observe at different wavelengths, with different resolutions, and from different angles. Our present understanding of the origin and evolution of structures in the Universe has been obtained by comparing theories against these observational data, spanning small to the largest scales. Constraints on theoretical models help shape this understanding, while model-independent reconstructions directly from the data provide hints for model building.
In my opinion, model comparison and reconstruction are both indispensable approaches in modern cosmology. Data analysis, hints on models, model building, and comparison of the models with the data complete the circle of a cosmological analysis. I therefore advocate developing algorithms for data analysis, building models using hints from reconstruction, and developing numerical codes to solve theoretical models and compare them with the data.
Before starting with these semi-technical details, below is a brief timeline of the Universe (image credit, satellite: Planck; copyright: ESA - C. Carreau) and the cosmological processes that have followed since its origin. Some of these details are needed to connect my work with these processes.
In cosmology we deal with large distances. When we observe an object at a large distance, we see its image at an earlier time, since light has taken that long to reach us. In common practice, we use a quantity named 'redshift' (z) as a measure of distance; at this moment of time, here at this place, redshift is defined as 0. If a luminous object is situated at redshift z, we observe the wavelength of its light elongated by a factor (1+z) compared to its original wavelength. This elongation is a signature of the expansion of the Universe, which is quantified by the Hubble constant H0: at present, any two points in space separated by 1 megaparsec (about 3 × 10^19 km) are moving away from each other at H0 km per second. Given the densities of the different components of the Universe, such as the matter and dark energy densities, and H0, we can calculate the distance to an object at redshift z. Since the Universe is expanding, back-tracing the light cone lets us estimate the age of the Universe today or at any z. One can use the Cosmology Calculator for a quick estimation. The timeline shown above represents the possible origin and evolution of the Universe as estimated by the European Space Agency's Planck satellite. We can divide the total timeline according to the dominant component of the Universe: initially an exponential expansion dominated by a scalar field, termed inflation; then the radiation-dominated and matter-dominated epochs with decelerated expansion; and presently a dark-energy-dominated epoch with accelerated expansion.
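The quick estimates mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not any of the codes described below; the parameter values are assumed, chosen close to the Planck best fit for a flat ΛCDM model.

```python
import numpy as np
from scipy.integrate import quad

# Assumed flat-LCDM parameters, close to the Planck best fit (illustrative only)
H0 = 67.4          # Hubble constant, km/s/Mpc
Om = 0.315         # matter density parameter
OL = 1.0 - Om      # dark energy density parameter (flatness assumed)

c = 299792.458     # speed of light, km/s
Mpc_km = 3.0857e19 # kilometres per megaparsec

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat LCDM."""
    return np.sqrt(Om * (1 + z)**3 + OL)

def comoving_distance(z):
    """Comoving radial distance to redshift z, in Mpc."""
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return c / H0 * integral

def age_of_universe(z=0.0):
    """Age of the Universe at redshift z, in Gyr, from dt = dz / [(1+z) H(z)]."""
    integral, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), z, np.inf)
    seconds = integral * Mpc_km / H0
    return seconds / 3.156e16   # seconds per Gyr

print(age_of_universe())        # close to 13.8 Gyr
print(comoving_distance(1.0))   # a few thousand Mpc
```

With these inputs the age today comes out near the familiar 13.8 billion years, illustrating how a single parameter set fixes all the distances and ages used throughout this page.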
Below, I provide a list of a few codes, models and algorithms/methods that I have developed or modified from existing versions in different publications with my collaborators. These tools are extremely useful in understanding cosmological processes at different epochs.
In this GitLab repository, codes (in Jupyter notebook format) will be uploaded that are useful for studying cosmology (aimed at MSc and PhD students). They are useful for teaching as well. Click on the GitLab thumbnail on the right.
We start with a simple code that calculates the age of the Universe based on the best fit parameters obtained using the data from the Planck mission (https://www.cosmos.esa.int/web/planck).
Here are the contents of each notebook:
Age-of-the-Universe.ipynb
Friedmann equations
Compute the age of the Universe
Ages-Distances-Angles.ipynb
Solutions to Friedmann equations
Rate of expansion of the Universe
Age of the Universe today
Age at a particular redshift
Comoving radial distance
Angular diameter distance
Luminosity Distance
Acceleration of the Universe
Dark-Energy-Density-LCDM-Constraints.ipynb
Luminosity distance
Supernovae
Distance modulus
$\chi^2$
Best fit
Markov Chain Monte Carlo
Constraints
Dark-Energy-Models-Parametrizations.ipynb
Cosmological distances with a variable equation of state for dark energy
Scalar field model of dark energy
Dark energy parametrization
CPL parametrization
Phenomenological models of decaying dark energy
BAO
CMB summary statistics
Parameter estimation
Reionization.ipynb
Tanh Model of Reionization
Optical depth
Neutral Hydrogen
CMB
Constraints
Inflation-Primordial-Perturbation.ipynb
Horizon problem
Solution to horizon problem -- Inflation
Solving Klein-Gordon equation
Solving Mukhanov-Sasaki/Curvature perturbation equations
Solving the tensor perturbation equation
Primordial scalar and tensor power spectra
Comparison with slow roll approximation
Added mini-codes directory
Contents
Fitting-FIRAS-data.ipynb
Fitting FIRAS CMB spectrum (monopole)
Constraining spectral distortions from monopole
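As a flavour of what the Fitting-FIRAS-data notebook covers, here is a minimal sketch of fitting a blackbody temperature to a CMB monopole spectrum. The data here are synthetic (a Planck spectrum with small added noise standing in for the FIRAS measurements), and the frequency range and noise level are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Physical constants (SI)
h  = 6.626e-34   # Planck constant, J s
c  = 2.998e8     # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K

def planck_nu(nu, T):
    """Blackbody specific intensity B_nu(T), W m^-2 Hz^-1 sr^-1."""
    return 2 * h * nu**3 / c**2 / np.expm1(h * nu / (kB * T))

# Synthetic 'FIRAS-like' monopole: 60-600 GHz with small fractional noise
nu = np.linspace(6e10, 6e11, 40)
T_true = 2.725
rng = np.random.default_rng(0)
data = planck_nu(nu, T_true) * (1 + 1e-4 * rng.standard_normal(nu.size))

# One-parameter least-squares fit for the blackbody temperature
T_fit, cov = curve_fit(planck_nu, nu, data, p0=[2.7])
print(T_fit[0])  # recovers a temperature close to 2.725 K
```

Constraining spectral distortions then amounts to adding distortion templates (e.g. mu- or y-type) to `planck_nu` and fitting their amplitudes alongside T.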
Here I mention the codes that I developed for research purposes. Some of these codes are publicly available.
This publicly available, open-source code evaluates the bispectrum parameter fNL in generic single-field inflationary models. The results presented in JCAP 1305, 026 (2013) were obtained using this code. Initially, the code was written to evaluate fNL in the equilateral limit. The second release of BINGO (BINGO-2.0) computes fNL in an arbitrary triangular configuration of the wavenumbers (the results and analyses of BINGO-2.0 are provided in JCAP 02, 029 (2015)). The current version allows the user to compute fNL on multiple processors. Detailed discussions on BINGO can be found on the BINGO homepage, from where the code can be downloaded. This is the first public code to calculate the inflationary bispectrum. I am also working towards integrating the code with CAMB, so that it can be used in the future to compare inflationary models with observations including non-Gaussianities.
Since BINGO calculates the bispectrum, which requires computing the primordial power spectrum (PPS), BINGO can certainly compute the power spectrum for inflationary models. To my knowledge, it is one of the fastest codes that solves models of inflation without analytical approximations. I have extended the scope of the PPS codes beyond single canonical scalar field models. The present Code-Suite (unreleased) can solve the following models of the early Universe:
Non-canonical models of single field inflation: this version of the code was used in JCAP 1010, 008 (2010)
Two-field models of inflation involving canonical and non-canonical Lagrangians: this version has been used in JCAP 2008, 025 (2020). The code takes the isocurvature perturbations into account exactly, and it evaluates the curvature and isocurvature perturbation spectra as well as the correlations between them.
Tensor power spectrum: as a simple extension, the PPS code also calculates the primordial tensor spectrum.
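The core task such PPS codes perform, evolving each Fourier mode from Bunch-Davies initial conditions through horizon crossing, can be illustrated in a toy setting. The sketch below (not part of the Code-Suite) solves the tensor-type mode equation in exact de Sitter space, where the analytic answer is known, so the numerical spectrum can be checked against H^2/(4 pi^2); the values of H and k are arbitrary illustrative choices.

```python
import numpy as np
from scipy.integrate import solve_ivp

H = 1e-5   # Hubble scale during inflation (illustrative, Planck units)
k = 0.05   # comoving wavenumber (arbitrary units, with a = -1/(H eta))

def ms_rhs(eta, y):
    """Mode equation u'' + (k^2 - 2/eta^2) u = 0 in de Sitter
    (a''/a = 2/eta^2), split into real and imaginary parts."""
    u_r, du_r, u_i, du_i = y
    eff = k**2 - 2.0 / eta**2
    return [du_r, -eff * u_r, du_i, -eff * u_i]

# Bunch-Davies initial condition deep inside the horizon: u = e^{-ik eta}/sqrt(2k)
eta_i, eta_e = -100.0 / k, -1e-3 / k
y0 = [np.cos(k * eta_i) / np.sqrt(2 * k),
      -k * np.sin(k * eta_i) / np.sqrt(2 * k),
      -np.sin(k * eta_i) / np.sqrt(2 * k),
      -k * np.cos(k * eta_i) / np.sqrt(2 * k)]

sol = solve_ivp(ms_rhs, (eta_i, eta_e), y0, rtol=1e-8, atol=1e-12)
u_r, _, u_i, _ = sol.y[:, -1]
a_e = -1.0 / (H * eta_e)    # de Sitter scale factor at the end of the run

# Dimensionless spectrum of u/a, evaluated well outside the horizon
Delta2 = k**3 / (2 * np.pi**2) * (u_r**2 + u_i**2) / a_e**2
print(Delta2, H**2 / (4 * np.pi**2))   # numerical vs analytic de Sitter result
```

The full Code-Suite replaces the fixed de Sitter background with background equations solved from the inflaton Lagrangian, and repeats this mode integration over a grid of wavenumbers.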
I have used the PPS Code-Suite as an add-on within publicly available Markov Chain Monte Carlo codes such as CosmoMC. This allows us to obtain the angular power spectra of temperature and polarization anisotropies, as well as the correlation function of galaxy number counts, based on the power spectrum originating from the primordial perturbations. Utilizing this add-on, in several publications with my collaborators I have obtained constraints on inflation using data from the Cosmic Microwave Background (Wilkinson Microwave Anisotropy Probe, Atacama Cosmology Telescope, Planck, BICEP-Keck, etc.) and Large Scale Structure surveys (Sloan Digital Sky Survey), and also projected constraints for the proposed and approved CMB (CMB-BHARAT, CORE, LiteBIRD) and LSS (Euclid) missions.
I developed a code to compute the number density of halos and their formation rates for inflationary models with arbitrary potentials. The code can calculate the number density of dark matter halos in both the Press-Schechter and Sheth-Tormen formalisms. The work JCAP 1303, 003 (2013) is based on this code.
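The Press-Schechter part of such a calculation is compact enough to sketch. In the sketch below the rms mass fluctuation sigma(M) is a toy power law (an assumption standing in for the integral of the inflationary power spectrum over a window function), so the numbers are illustrative only.

```python
import numpy as np

rho_m = 8.5e10      # mean comoving matter density, M_sun / Mpc^3 (illustrative)
delta_c = 1.686     # spherical-collapse density threshold

def sigma(M):
    """Toy rms mass fluctuation: a power law normalized to ~0.9 at 1e13 M_sun.
    A real code computes this by integrating the matter power spectrum."""
    return 0.9 * (M / 1e13) ** (-0.25)

def press_schechter(M):
    """Press-Schechter comoving mass function dn/dlnM, in Mpc^-3."""
    s = sigma(M)
    dlns_dlnM = -0.25           # logarithmic slope of the toy sigma(M)
    nu = delta_c / s            # peak height
    return (np.sqrt(2.0 / np.pi) * rho_m / M * nu * abs(dlns_dlnM)
            * np.exp(-nu**2 / 2.0))

for M in [1e12, 1e13, 1e14, 1e15]:
    print(f"{M:.0e} M_sun : dn/dlnM = {press_schechter(M):.3e} Mpc^-3")
```

The exponential cutoff in nu = delta_c/sigma is what makes the abundance of massive halos so sensitive to the primordial power spectrum; a feature in the inflationary spectrum propagates directly into sigma(M) and hence into these number densities.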
Extending the scope of the halo formation code, with Matteo Braglia I have developed a code to estimate the formation of primordial black holes (PBHs) and their fractional contribution to the cold dark matter. The code also computes the density of the stochastic gravitational wave background across the frequencies that will be covered by proposed and upcoming gravitational wave experiments. The results of the publication JCAP 2008, 001 (2020) are based on this code.
Jointly with Dr. Stephen Appleby, I have developed a code to simulate the large-scale structure of the Universe. We solve for the 3D dark matter density field starting from an initial overdensity in a horizon-size box. We are presently benchmarking the code.
I have developed a code to estimate the continuum of observed quasar spectra in the redshifted Lyman-α absorption window. Continuum estimation is essential to measure the absorption of photons by neutral hydrogen clouds in the intergalactic medium. Correlations of the fluctuations in the absorbed flux can be used to constrain the background cosmological model as well as primordial physics. An initial application to BOSS-DR9 data showed promising results. I am presently examining the consistency of the continuum estimation against simulated spectra.
I prepared a numerical code (in Fortran 90) to calculate the reionization optical depth from a non-linear matter density field. My project work, Constraints on the Matter Power Spectra from the Lyman-α Forest, is based on this code.
Open these images in a new tab to view them at higher resolution. Clicking on an image redirects to the corresponding publication page.
BINGO example: fNL computed for inflaton potential with a step
PPS Code-Suite example: Features in the two field model for varying non-canonical coupling
PPS Code-Suite example: Angular power spectra from feature models compared against Planck 2015 data
PBH code example: Fraction of PBHs, fPBH, as a function of the mass of the formed PBHs in solar masses. Amplification of the primordial power spectrum at small scales results in the formation of PBHs. Observational upper limits are shown.
PBH code example: Relic energy density of gravitational waves computed directly using two field model Lagrangian
WWI: Power spectra from models within the WWI framework with large scale suppression and wiggles that are supported by Planck data
Poly-Reion: Recent constraints on reionization from Planck 2018 data using Poly-Reion model
Results from Reionization History Reconstruction (open image in new tab for better view)
We introduced the Whipped Inflation (Phys. Rev. Lett. 113 (2014) 7, 071301) and Wiggly Whipped Inflation (JCAP 08 (2014) 048) frameworks as models of inflation that can generate a wide range of primordial features supported by recent Planck CMB observations of temperature and polarization anisotropies. A scalar field rolling down from a steep potential to a nearly flat potential through a discontinuity can induce primordial scalar perturbations with large-scale suppression and wiggles. Apart from comparing these models with the Planck CMB data, we have made projections for proposed and upcoming CMB and LSS missions; see JCAP 02, 017 (2018), MNRAS 477 (2018) 2, 2503-2512 and MNRAS 496 (2020) 3, 3448-3468.
Using the free electron fractions at different redshifts as free parameters, the Poly-Reion model constructs the complete reionization history using polynomials. This construction provides an extremely flexible framework to search for the history of reionization as a function of redshift. Importantly, Poly-Reion allows only realistic reionization histories. We first proposed the model, with constraints on the history of reionization, in JCAP 11, 028 (2017). With a modification of flexible redshifts, this model was used in the Planck 2018 analysis of reionization. Our analysis of Planck 2015 data with Poly-Reion indicated a marginal preference for an extended history of reionization. We also found that the free electron fraction allowed by the data at redshifts above 15 is <0.25 at the 95.4% confidence limit in the case of the optimistic constraint. With the new Planck 2018 data, our analysis in JCAP 2009, 005 (2020) finds that early onsets of reionization are strongly disfavored. This model also helps to explore the degeneracies between reionization and other cosmological processes.
In the context of reionization there are multiple observations of different types that can constrain the history. The CMB constrains mainly the optical depth, which is the integral of the free electron fraction along the line of sight. Observations of the UV luminosity of high-redshift galaxies directly probe the sources of reionization. Lyman-α observations, on the other hand, estimate the neutral hydrogen fraction at different redshifts. We have developed a method for reconstructing the reionization history utilizing these datasets in a single free-form framework: we solve the ionization equations with the source and sink terms assumed to be free of form. Using the three datasets, we find the duration of reionization to span nearly 3 in redshift, with an uncertainty of 0.3 at the 95% confidence limit. The opacity of the Universe to CMB radiation, i.e. the optical depth τ, is estimated to be 0.051 with an uncertainty of 0.002 at 95% CL. Importantly, in this reconstruction we can reverse-engineer the recombination timescale (see the plots on the left), which puts a lower bound on the recombination timescale of >1 Gyr. The reconstruction methodology and constraints are presented in the letter Phys. Rev. Lett. 125, 071301 (2020).
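The link between a reionization history x_e(z) and the CMB optical depth can be sketched directly. The toy below uses a simple tanh history for hydrogen only (helium and the full free-form source/sink treatment of our reconstruction are omitted), with assumed Planck-like background parameters; it shows how τ follows from integrating the free electron fraction along the line of sight.

```python
import numpy as np
from scipy.integrate import quad

# Assumed parameters (illustrative, near Planck best-fit values)
H0 = 67.4 * 1e3 / 3.0857e22   # Hubble constant in s^-1
Om, OL = 0.315, 0.685         # matter and dark energy density parameters
sigma_T = 6.652e-29           # Thomson cross-section, m^2
c = 2.998e8                   # speed of light, m/s
n_H0 = 0.19                   # present hydrogen number density, m^-3
                              # (roughly from Ob h^2 = 0.0224, helium ignored)

def xe_tanh(z, z_re=7.7, dz=0.5):
    """Toy tanh reionization history: fully ionized below z_re."""
    return 0.5 * (1.0 + np.tanh((z_re - z) / dz))

def tau(z_re=7.7):
    """Optical depth tau = c sigma_T n_H0 * int xe(z) (1+z)^2 / H(z) dz."""
    integrand = lambda z: (xe_tanh(z, z_re) * (1 + z)**2
                           / (H0 * np.sqrt(Om * (1 + z)**3 + OL)))
    val, _ = quad(integrand, 0.0, 20.0)
    return c * sigma_T * n_H0 * val

print(tau(7.7))   # roughly the 0.05 level quoted in the text
```

Pushing z_re to higher redshift raises τ, which is why the CMB measurement of τ constrains when, but not in detail how, reionization happened; that degeneracy is what the multi-dataset free-form reconstruction breaks.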
In the paper JCAP 1307, 031 (2013) we modified and extended the scope of the improved Richardson-Lucy deconvolution algorithm for reconstructing the primordial power spectrum. The modified algorithm is optimized to work with binned and unbinned CMB angular power spectra in a combined analysis. I have developed a code based on the modified algorithm that is designed to work with Planck-like data, including foreground and gravitational lensing effects. Elaborate analyses with Planck data are published in JCAP 1411, 011 (2014). I have produced a CAMB and CosmoMC add-on of this modified code to enable parameter estimation with a free-form primordial power spectrum. The papers JCAP 1307, 031 (2013) and Phys. Rev. D 87, 123528 (2013) are based on this newly developed code and its add-on.
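The essence of Richardson-Lucy deconvolution is a multiplicative, positivity-preserving fixed-point iteration. The toy below is not the modified MRL algorithm of the paper: the true problem convolves the primordial spectrum P(k) with the radiative transport kernel to produce C_l, whereas here a Gaussian kernel and a synthetic spectrum with a localized feature stand in for both, purely to illustrate the iteration.

```python
import numpy as np

# Toy linear model: observable C[l] = sum_k G[l, k] P[k], with a
# Gaussian smoothing kernel standing in for the radiative transport kernel.
n = 80
idx = np.arange(n)
G = np.exp(-0.5 * ((idx[None, :] - idx[:, None]) / 4.0) ** 2)
G /= G.sum(axis=1, keepdims=True)     # each row integrates to one

# 'True' spectrum: flat with a localized bump (the feature to recover)
P_true = 1.0 + 0.5 * np.exp(-0.5 * ((idx - 40) / 3.0) ** 2)
C_obs = G @ P_true                    # noiseless synthetic data

# Richardson-Lucy iterations: multiplicative update keeps P positive
P = np.ones(n)                        # featureless initial guess
for _ in range(200):
    C_model = G @ P
    P *= (G.T @ (C_obs / C_model)) / G.sum(axis=0)

print(np.max(np.abs(P - P_true)))     # residual of the reconstruction
```

Because each update multiplies the current estimate by a ratio of data to model, an initially positive spectrum stays positive, which is one reason the method suits power spectrum reconstruction; the paper's modifications add binned/unbinned data handling and noise-aware convergence criteria on top of this core loop.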
In the article JCAP 1312, 035 (2013) we presented a method to find broad features in particular datasets using flexible bins. This technique enables us to locate possible broad physical features in the primordial spectrum with relatively few bins compared to other analyses. Applying this method to the CMB data from Planck, we showed that the spectral tilt can be constrained to be red only at scales beyond 0.01 Mpc^-1.
In Phys. Rev. Lett. 109, 121301 (2012) we introduced the possibility of constraining primordial non-Gaussianity using the 3D bispectrum of Lyman-α forest where the Lyman-α transmitted flux field is modeled as a biased tracer of the underlying matter distribution. In the next work JCAP 04 (2013) 002 we used auto and cross bispectrum of the Lyman-α forest and the redshifted 21-cm signal from the post-reionization epoch to constrain primordial non-Gaussianity.
We presented the first test of the isotropy of the Universe using Lyman-α forest data from high-redshift quasars (z>2, probing the signal from the matter-dominated epoch) in the SDSS-III BOSS-DR9 dataset, in JCAP 11 (2015) 012. We proposed a method that utilizes the probability distribution function (PDF) of the Lyman-α forest transmitted flux and uses the statistical moments of the PDF to trace the isotropy of the Universe as a function of time. Since the Lyman-α transmitted flux directly maps the neutral hydrogen distribution in the intergalactic medium, this method can test whether the distribution of neutral hydrogen in the Universe is consistent with isotropy.
MRL application: Samples of the reconstructed primordial power spectrum from Planck temperature data and the fit to the data w.r.t. the power-law model are in the first two plots. Constraints on the PPS at different scales are plotted in the third plot [JCAP 1411, 011 (2014)].
Test of isotropy: The BOSS-DR9 survey area in galactic coordinates and our selection of sky patches for the test of isotropy.