Figure 1: Projections of gas (top left), dark matter (top right), and stellar light (bottom centre) for a slice through the largest hydrodynamical simulation of MillenniumTNG at the present epoch. The slice is about 35 million light-years thick. The projections illustrate the vast range of physical scales in the simulation, from the full box size, about 2400 million light-years across, down to an individual spiral galaxy (final circular inset) with a radius of about 150 000 light-years. The underlying calculation is presently the largest high-resolution hydrodynamical simulation of galaxy formation, containing more than 160 billion resolution elements. © MPA
New computer simulations follow the formation of galaxies and the cosmic large-scale structure with unprecedented statistical precision
An international team of astrophysicists led by researchers from the Max Planck Institute for Astrophysics in Germany, Harvard University in the USA, and Durham University in the UK has presented an ambitious attempt to jointly simulate the formation of galaxies and cosmic large-scale structure throughout staggeringly large swaths of space. Their simulations now also take the ghostly neutrino particles into account and could help to constrain their mass. The first results of their “MillenniumTNG” project have just been published in a series of ten articles in the journal Monthly Notices of the Royal Astronomical Society. The new calculations help to subject the standard cosmological model to precision tests and to exploit the full power of upcoming cosmological observations.
Over the past decades, cosmologists have gotten used to the perplexing conjecture that the universe’s matter content is dominated by enigmatic dark matter and that an even stranger dark energy field acts as some kind of anti-gravity to accelerate the expansion of today’s cosmos. Ordinary baryonic matter makes up less than 5% of the cosmic mix, yet this source material forms the basis for the stars and planets of galaxies like our own Milky Way. This seemingly strange cosmological model is known as LCDM (Lambda Cold Dark Matter). It provides a stubbornly successful description of a large number of observations, ranging from the cosmic microwave background radiation – the residual heat left behind by the hot Big Bang – to the “cosmic web”, in which galaxies are arranged along an intricate network of dark matter filaments. However, the real physical nature of dark matter and dark energy is still not understood, prompting astrophysicists to search for cracks in the LCDM theory. Identifying tensions with observational data could lead to a better understanding of these fundamental puzzles about our Universe. Sensitive tests are required, and these need two things: powerful new observational data and more detailed predictions of what the LCDM model actually implies.
Scientists at the Max Planck Institute for Astrophysics (MPA), together with an international team of researchers at Harvard University and Durham University, as well as York University in Canada and the Donostia International Physics Center in Spain, have now managed to take a decisive step forward on the latter challenge. Building on their previous successes with the “Millennium” and “IllustrisTNG” projects, they developed a new suite of simulation models dubbed “MillenniumTNG”, which traces the physics of cosmic structure formation with considerably higher statistical accuracy than was possible with previous calculations.
Figure 2: Comparison of the neutrino (top) and dark matter (bottom) distributions on the backwards lightcone of a fiducial observer positioned at the centre of the two horizontal stripes. As cosmic expansion slows the neutrinos down at late times (small redshift/distance), they begin to cluster weakly around the largest concentrations of dark matter, as a comparison of the zoomed insets shows. This slightly increases the mass and the further growth rate of these largest structures. © MPA
Large simulations including new physical details
The team also included massive neutrinos in their simulations – for the first time in simulations big enough to allow meaningful cosmological mock observations. Previous cosmological simulations had usually omitted them for simplicity, because neutrinos make up at most 1-2% of the dark matter mass and because their nearly relativistic velocities mostly prevent them from clumping together. However, upcoming cosmological surveys (such as those of the recently launched Euclid satellite of the European Space Agency) will reach a precision that allows the associated percent-level effects to be detected. This raises the tantalizing prospect of constraining the neutrino mass itself, a profound open question in particle physics, so the stakes are high.
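To see why the effect is only at the percent level, the summed neutrino mass can be related to its share of the cosmic matter budget via the standard relation Ω_ν h² = Σm_ν / 93.14 eV. The following back-of-the-envelope sketch assumes an illustrative summed mass of 0.1 eV and a Planck-like matter density; neither value is quoted from the MillenniumTNG papers.

```python
# Back-of-the-envelope estimate of the neutrino share of the matter budget.
# Assumed inputs (illustrative, not taken from the MillenniumTNG papers):
SUM_M_NU_EV = 0.1     # summed neutrino mass in eV, the quantity surveys target
OMEGA_M_H2 = 0.143    # Planck-like total matter density parameter times h^2

# Standard relation for neutrinos that have become non-relativistic:
# Omega_nu * h^2 = sum(m_nu) / 93.14 eV
omega_nu_h2 = SUM_M_NU_EV / 93.14

f_nu = omega_nu_h2 / OMEGA_M_H2
print(f"Neutrino fraction of the matter density: {f_nu:.2%}")
# -> about 0.75%; masses of a few tenths of an eV give the 1-2% quoted above
```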
For the groundbreaking MillenniumTNG simulations, the researchers made efficient use of two extremely powerful supercomputers: the SuperMUC-NG machine at the Leibniz Supercomputing Centre (LRZ) in Garching and the Cosma8 machine hosted by Durham University on behalf of the UK’s DiRAC High-Performance Computing facility. More than 120 000 computer cores toiled away for nearly two months on SuperMUC-NG, using computing time awarded by the German Gauss Centre for Supercomputing (GCS), to produce the most comprehensive hydrodynamical simulation model to date. MillenniumTNG tracks the formation of about one hundred million galaxies in a region of the universe around 2400 million light-years across (see Figure 1). This calculation is about 15 times bigger than the previous record holder in this category, the TNG300 model of the IllustrisTNG project.
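The factor of about 15 follows directly from the cube of the ratio of the simulation box side lengths. A quick check, a sketch assuming the published comoving box sides of 500 Mpc/h (MillenniumTNG) and 205 Mpc/h (TNG300) and a Hubble parameter of h ≈ 0.68:

```python
# Volume comparison between the MillenniumTNG hydro box and TNG300.
MTNG_BOX = 500.0     # comoving box side in Mpc/h
TNG300_BOX = 205.0   # comoving box side in Mpc/h

# Simulated volume scales with the cube of the side length.
volume_ratio = (MTNG_BOX / TNG300_BOX) ** 3
print(f"Volume ratio MTNG / TNG300: {volume_ratio:.1f}")   # ~14.5, i.e. ~15x

# The side length in light-years (1 Mpc ~ 3.262 million light-years):
h = 0.68
side_mly = MTNG_BOX / h * 3.262
print(f"Box side: ~{side_mly:.0f} million light-years")    # ~2400
```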
Using Cosma8, the team computed an even bigger volume of the universe, filled with more than a trillion dark matter particles and more than 10 billion additional particles that track the massive neutrinos (see Figure 2). Even though this simulation did not follow the baryonic matter directly, its galaxy content can be accurately predicted with a semi-analytic model that is calibrated against the ‘full physics’ calculation of the project. This procedure yields a detailed distribution of galaxies and matter in a volume that, for the first time, is large enough to be representative of the universe as a whole, putting comparisons with upcoming observational surveys on a sound statistical basis.
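The project’s actual semi-analytic model follows galaxy formation physics along halo merger trees. As a much simpler illustration of the general idea of adding galaxies to a dark-matter-only catalogue, the toy sketch below uses a standard halo-occupation parametrization; all parameter values and the mock halo catalogue are made up for illustration and are not the project’s calibrated values.

```python
import numpy as np
from scipy.special import erf

# Toy illustration of "painting" galaxies onto a dark-matter-only halo
# catalogue. This is far simpler than the semi-analytic model actually
# used by MillenniumTNG; every number below is illustrative.

rng = np.random.default_rng(42)

# Hypothetical halo catalogue: 100 000 halo masses in solar masses
halo_mass = 10.0 ** rng.uniform(11.0, 15.0, size=100_000)

def mean_centrals(m, log_m_min=12.0, sigma=0.3):
    """Mean number of central galaxies: a smooth step in log halo mass."""
    return 0.5 * (1.0 + erf((np.log10(m) - log_m_min) / sigma))

def mean_satellites(m, m_cut=10**12.3, m_1=10**13.2, alpha=1.0):
    """Mean number of satellite galaxies: a power law above a cutoff mass."""
    return np.clip((m - m_cut) / m_1, 0.0, None) ** alpha

has_central = rng.random(halo_mass.size) < mean_centrals(halo_mass)
n_satellites = rng.poisson(mean_satellites(halo_mass)) * has_central

n_galaxies = has_central.sum() + n_satellites.sum()
print(f"Painted {n_galaxies} galaxies onto {halo_mass.size} halos")
```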
Figure 3: Galaxy distribution on the backwards lightcone in MillenniumTNG, where the galaxies are predicted with a sophisticated semi-analytic model on top of the dark matter backbone. Galaxies are shown down to Johnson apparent magnitude R < 23, in a thin wedge 180 degrees wide with an opening angle of 0.24 degrees, out to redshift z = 2. The galaxy positions are drawn as circles at their comoving coordinates in real space, using red for galaxies with rest-frame colour index B − R > 0.7 and blue otherwise. Real observations of the galaxy positions would additionally be perturbed by small shifts along the line of sight due to Doppler effects from the galaxies’ motions, an effect that can also easily be included in the models. The two circular insets show nested zooms with diameters of around 1.25 billion light-years and 125 million light-years, and fainter apparent magnitude limits of R < 25 and R < 28, respectively. © MPA
Theoretical predictions for cosmology
One of the studies examined the shapes of galaxies. Nearby galaxies have a subtle tendency to orient their shapes in similar directions instead of pointing randomly, an effect called “intrinsic galaxy alignments”. This poorly understood effect distorts inferences based on weak gravitational lensing, which creates its own statistical alignment signal. The MillenniumTNG project could, for the first time, measure intrinsic alignments with very high signal-to-noise directly from the shapes of the simulated galaxies, out to distances of several hundred million light-years. “Perhaps our determination of the intrinsic alignment of galaxy orientations can help to resolve the current discrepancy between the amplitude of matter clustering inferred from weak lensing and from the cosmic microwave background”, says PhD student Ana Maria Delgado, first author of this study by the MillenniumTNG team. Using these results, astronomers will be able to correct for this important systematic effect much better.
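Conceptually, an intrinsic-alignment measurement asks whether galaxy shapes point preferentially towards (or away from) their neighbours. The toy sketch below computes one simple such statistic, the mean of cos 2φ over galaxy pairs, for randomly oriented mock galaxies, so the expected signal here is zero; it is not the estimator used in the team’s paper, and all catalogue values are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy alignment statistic: for each close pair, phi is the angle between
# one galaxy's projected major axis and the direction to its neighbour.
# With random orientations, <cos 2*phi> is consistent with zero; in
# MillenniumTNG the simulated galaxy shapes yield a clear non-zero signal.

rng = np.random.default_rng(7)
n_gal = 5_000
pos = rng.uniform(0.0, 500.0, size=(n_gal, 2))   # projected positions (Mpc/h)
theta = rng.uniform(0.0, np.pi, size=n_gal)      # random major-axis angles

tree = cKDTree(pos)
pairs = tree.query_pairs(r=10.0, output_type="ndarray")  # pairs < 10 Mpc/h

sep = pos[pairs[:, 1]] - pos[pairs[:, 0]]                # separation vectors
phi = np.arctan2(sep[:, 1], sep[:, 0]) - theta[pairs[:, 0]]
signal = np.mean(np.cos(2.0 * phi))
print(f"{len(pairs)} pairs, alignment signal <cos 2phi> = {signal:+.4f}")
```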
Another timely result concerns the recent discovery with the James Webb Space Telescope of a population of very massive galaxies in the young universe. The masses of these galaxies are unexpectedly large for such a brief time after the Big Bang, seemingly defying theoretical expectations. Dr. Rahul Kannan analyzed the predictions of MillenniumTNG for this early epoch. While the simulations agree with the observations out to redshift z = 10 (when the universe was less than 500 million years old), he confirmed that the new JWST results at even higher redshift, if they hold up, will be in conflict with the simulation predictions. “Perhaps star formation is much more efficient shortly after the Big Bang than at later times, or maybe massive stars were formed in higher proportions back then, making these galaxies unusually bright”, explains Dr. Kannan.
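The age quoted for redshift z = 10 can be checked with a standard cosmology library. A minimal sketch using astropy’s built-in Planck 2018 parameters (the simulation adopts slightly different values, which barely change these numbers):

```python
# Age of the universe at the redshifts probed by JWST, from the
# Planck 2018 cosmology bundled with astropy.
from astropy.cosmology import Planck18

for z in (8, 10, 13):
    age = Planck18.age(z).to("Myr")
    print(f"z = {z:2d}: age of the universe ~ {age:.0f}")
# z = 10 corresponds to roughly 470 Myr, i.e. under 500 million years
```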
Other works in the team’s initial analysis focus on the clustering signals of galaxies. For example, MPA PhD student Monica Barrera produced extremely large and highly realistic mock catalogues of galaxies on the backwards “lightcone” of a fiducial observer (see Figure 3). In this case, galaxies that are more distant are also automatically younger, reflecting the travel time of the light reaching our telescopes. Using these virtual observations, she looked at the so-called baryonic acoustic oscillation (BAO) feature (which provides a cosmologically important standard ruler) in the projected two-point correlation function of galaxies. Her results showed that measuring these BAOs is a fairly tricky endeavour that can be significantly influenced by so-called cosmic variance effects – even when extremely large volumes are studied in observational surveys. While in simulations one can observe the modelled universe from different vantage points to recover the correct statistical ensemble average, this is unfortunately not readily possible for the real Universe. “The MillenniumTNG simulations are so big and contain so many galaxies, more than 1 billion in the biggest calculation, that it was really hard to study them”, says Monica Barrera. “Analysis scripts that work just fine for smaller simulations tend to take forever for MillenniumTNG.”
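To illustrate the kind of statistic involved, the sketch below measures a three-dimensional two-point correlation function with the standard Landy-Szalay estimator on an unclustered random point set, so ξ simply scatters around zero; the paper applies the projected version of the same machinery to the galaxy mocks and to far larger catalogues. Catalogue sizes and bins here are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

# Two-point correlation function via the Landy-Szalay estimator,
# xi = (DD - 2*DR + RR) / RR, with normalized pair counts. The BAO
# feature would appear as a bump in xi near ~105 Mpc/h for real galaxies.

rng = np.random.default_rng(1)
box = 1000.0                                    # illustrative box, Mpc/h
data = rng.uniform(0.0, box, size=(20_000, 3))  # stand-in "galaxies"
rand = rng.uniform(0.0, box, size=(40_000, 3))  # random comparison catalogue

edges = np.linspace(20.0, 150.0, 14)            # separation bins in Mpc/h

def pair_counts(a, b, edges):
    """Ordered pair counts per separation bin, from cumulative counts."""
    counts = cKDTree(a).count_neighbors(cKDTree(b), edges)
    return np.diff(counts).astype(float)

n_d, n_r = len(data), len(rand)
dd = pair_counts(data, data, edges) / (n_d * (n_d - 1))
rr = pair_counts(rand, rand, edges) / (n_r * (n_r - 1))
dr = pair_counts(data, rand, edges) / (n_d * n_r)

xi = (dd - 2.0 * dr + rr) / rr                  # Landy-Szalay estimator
print(np.round(xi, 4))                          # ~0 for an unclustered field
```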
Analyzing cosmological data
Contacts:
Volker Springel
tel:2195
vspringel@mpa-garching.mpg.de
Hannelore Hämmerle
Press officer
tel:3980
hanne@mpa-garching.mpg.de
More Information:
- Website of the MillenniumTNG project
- Gauss Centre for Supercomputing
- SuperMUC-NG at the Leibniz Supercomputing Centre
Original publications:
- MNRAS, July 2023
- MNRAS, July 2023
- MNRAS, submitted
- MNRAS, July 2023
- MNRAS, July 2023
- The MillenniumTNG Project: The large-scale clustering of galaxies, MNRAS, July 2023
- The MillenniumTNG Project: The impact of baryons and massive neutrinos on high-resolution weak gravitational lensing convergence maps