
AbacusSummit Simulations Will Help Extract Futuristic Data About the Universe

A recently released set of cosmological simulations is the largest ever produced, comprising nearly 60 trillion particles.

The AbacusSummit suite comprises hundreds of simulations of how gravity has shaped the distribution of dark matter throughout the universe. Here, a snapshot of one of the simulations is shown at various zoom scales: 10 billion light-years across, 1.2 billion light-years across and 100 million light-years across. The simulation replicates the large-scale structures of our universe, such as the cosmic web and colossal clusters of galaxies. Image Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda/Simons Foundation.

The simulation suite, called AbacusSummit, will be instrumental in extracting secrets of the universe from upcoming surveys of the cosmos, its creators predict. They describe AbacusSummit in several papers published in the October 25 issue of Monthly Notices of the Royal Astronomical Society.

AbacusSummit was created by scientists at the Flatiron Institute’s Center for Computational Astrophysics (CCA) in New York City and the Center for Astrophysics | Harvard & Smithsonian. Composed of over 160 simulations, it represents how gravitational attraction causes particles in a box-shaped universe to travel around. Such representations, known as N-body simulations, record the behavior of dark matter, which makes up the majority of the universe’s material and interacts solely via gravity.

This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined—though that’s a hard statement to be certain of.

Lehman Garrison, Lead Author of one of the new studies and Research Fellow, Center for Computational Astrophysics, Flatiron Institute

Garrison led the creation of the AbacusSummit simulations together with graduate student Nina Maksimova and astronomy professor Daniel Eisenstein, both of the Center for Astrophysics. The simulations were run on the U.S. Department of Energy’s Summit supercomputer at the Oak Ridge Leadership Computing Facility in Tennessee.

AbacusSummit will soon prove its worth, as several surveys will generate maps of the cosmos in unprecedented detail in the near future. These include the Nancy Grace Roman Space Telescope, the Dark Energy Spectroscopic Instrument, and the Euclid spacecraft. One goal of these big-budget missions is to improve estimates of the astrophysical and cosmological parameters that determine how the universe behaves and how it looks.

Researchers will arrive at those improved estimates by comparing the new observations with computer simulations of the universe run with different values for the various parameters, such as the properties of the dark energy pulling the universe apart. The improvements offered by the next-generation surveys demand correspondingly improved simulations, Garrison says.

The galaxy surveys are delivering tremendously detailed maps of the universe, and we need similarly ambitious simulations that cover a wide range of possible universes that we might live in. AbacusSummit is the first suite of such simulations that has the breadth and fidelity to compare to these amazing observations.

Lehman Garrison, Lead Author of one of the new studies and Research Fellow, Center for Computational Astrophysics, Flatiron Institute

The project was daunting. N-body calculations, which attempt to work out the motions of gravitationally interacting objects such as planets, have been among the foremost challenges in physics since the days of Isaac Newton. They are difficult because every object interacts with every other object, no matter how far apart they are, so adding more objects makes the number of interactions grow rapidly.

There is no general solution to the N-body problem for three or more massive bodies; the calculations available are only approximations. A common approach is to freeze time, compute the total force acting on each object, and then nudge each one according to the net force it experiences. Time is then advanced slightly, and the process repeats.
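The freeze-compute-nudge cycle described above can be sketched as a minimal direct-summation integrator. This is a toy illustration, not the Abacus code; the units, softening length, and first-order update scheme are arbitrary choices for the sketch:

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, G=1.0, soft=1e-3):
    """One 'freeze time, sum forces, nudge' step for N gravitating bodies.

    pos, vel : (N, 3) arrays of positions and velocities
    mass     : (N,) array of masses
    A small softening length avoids the singularity when two
    particles pass very close to each other.
    """
    # Pairwise separation vectors r_ij = pos_j - pos_i, shape (N, N, 3)
    diff = pos[None, :, :] - pos[:, None, :]
    dist2 = (diff ** 2).sum(axis=-1) + soft ** 2
    np.fill_diagonal(dist2, np.inf)  # a particle exerts no force on itself
    # Acceleration on particle i: G * sum_j m_j * r_ij / |r_ij|^3
    acc = G * (mass[None, :, None] * diff / dist2[:, :, None] ** 1.5).sum(axis=1)
    vel = vel + acc * dt  # nudge each particle by its net force
    pos = pos + vel * dt  # then push time forward slightly
    return pos, vel
```

Because every particle's force sums over every other particle, one step costs O(N²) work, which is exactly why codes like Abacus need smarter decompositions at trillion-particle scale.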

Using that approach, AbacusSummit handled its enormous number of particles thanks to clever code, a new numerical method, and a great deal of computing power. The Summit supercomputer was the fastest in the world at the time the team ran the calculations.

The researchers engineered their codebase, called Abacus, to take full advantage of Summit’s parallel processing power, by which many calculations can run simultaneously. Summit has an abundance of graphics processing units (GPUs), which excel at parallel processing.

Running N-body calculations with parallel processing requires careful algorithm design, because an entire simulation takes a substantial amount of memory to store. That means Abacus cannot simply hand copies of the simulation to different nodes of the supercomputer to work on. Instead, the code splits each simulation into a grid.

A preliminary calculation provides a good approximation of the effects of distant particles at any given point in the simulation. (Distant particles play a much smaller role than nearby ones.) Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of nearby particles.
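The near/far split can be illustrated schematically. The sketch below is a toy one-dimensional version, not Abacus's actual far-field machinery: particles in the same or adjacent cells interact exactly, while each distant cell is replaced by a single point at its center of mass. The cell count and softening are arbitrary choices for the sketch:

```python
import numpy as np

def grid_accel_1d(pos, mass, ncell=8, G=1.0, soft=1e-3):
    """Toy 1-D near/far force split on a regular grid.

    Particles in the same or adjacent cells interact exactly;
    every more distant cell is replaced by one point mass at its
    center of mass (a crude 'far field' approximation).
    """
    pos, mass = np.asarray(pos, float), np.asarray(mass, float)
    lo, hi = pos.min(), pos.max()
    cell = np.minimum(((pos - lo) / (hi - lo + 1e-12) * ncell).astype(int),
                      ncell - 1)

    # Precompute each cell's total mass and center of mass (the far field).
    m_cell = np.array([mass[cell == c].sum() for c in range(ncell)])
    com = np.array([np.average(pos[cell == c], weights=mass[cell == c])
                    if m_cell[c] > 0 else 0.0 for c in range(ncell)])

    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for c in range(ncell):
            if m_cell[c] == 0.0:
                continue
            if abs(c - cell[i]) <= 1:
                # Near field: exact pairwise sum over this cell's particles
                # (the self term has d = 0 and contributes nothing).
                d = pos[cell == c] - pos[i]
                acc[i] += (G * mass[cell == c] * d /
                           (d ** 2 + soft ** 2) ** 1.5).sum()
            else:
                # Far field: the whole cell acts as one point mass.
                d = com[c] - pos[i]
                acc[i] += G * m_cell[c] * d / (d ** 2 + soft ** 2) ** 1.5
    return acc
```

Because the far-field contribution of each distant cell depends only on cell-level summaries, groups of nearby cells can be processed independently, which is the property that makes this decomposition friendly to parallel hardware.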

For large simulations, the team found that Abacus’ approach offers a substantial improvement over other N-body codebases, which split simulations irregularly according to the particle distribution.

The even divisions used by AbacusSummit make better use of parallel processing, the scientists explain. Furthermore, the regularity of Abacus’ grid approach allows much of the distant-particle approximation to be computed before the simulation even begins.

Owing to its design, Abacus can update 70 million particles per second per node of the Summit supercomputer. (Each particle represents a clump of dark matter with 3 billion times the mass of the Sun.) The code can even analyze a simulation as it runs, searching for patches of dark matter indicative of the bright star-forming galaxies that are a focus of upcoming surveys.
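For a back-of-envelope sense of scale, the quoted update rate can be turned into node-time per sweep over the suite's roughly 60 trillion particles. This is illustrative arithmetic using only the two figures quoted in this article; real runs involve many nodes, many time steps, and overheads not counted here:

```python
rate = 70e6        # particles updated per second per Summit node (quoted above)
particles = 60e12  # approximate particle count across the whole suite

node_seconds = particles / rate  # node-seconds for one update of every particle
node_hours = node_seconds / 3600
print(f"{node_seconds:.3g} node-seconds, about {node_hours:.0f} node-hours per sweep")
```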

Our vision was to create this code to deliver the simulations that are needed for this particular new brand of galaxy survey. We wrote the code to do the simulations much faster and much more accurately than ever before.

Lehman Garrison, Lead Author of one of the new studies and Research Fellow, Center for Computational Astrophysics, Flatiron Institute

Eisenstein, who is a member of the Dark Energy Spectroscopic Instrument collaboration, which recently began its survey to map an unprecedented fraction of the universe, says he is eager to use Abacus in the future.

Cosmology is leaping forward because of the multidisciplinary fusion of spectacular observations and state-of-the-art computing. The coming decade promises to be a marvelous age in our study of the historical sweep of the universe.

Daniel Eisenstein, Astronomy Professor, Center for Astrophysics, Harvard & Smithsonian

The other co-creators of Abacus and AbacusSummit include Philip Pinto of the University of Arizona, Sihan Yuan of Stanford University, Sownak Bose of Durham University in England, and Center for Astrophysics researchers Thomas Satterthwaite, Boryana Hadzhiyska, and Douglas Ferrer. The simulations were run on the Summit supercomputer under an Advanced Scientific Computing Research Leadership Computing Challenge allocation.

Journal Reference:

Maksimova, N.A., et al. (2021) ABACUSSUMMIT: a massive set of high-accuracy, high-resolution N-body simulations. Monthly Notices of the Royal Astronomical Society. doi.org/10.1093/mnras/stab2484.

Source: https://www.simonsfoundation.org
