New AI Application Eliminates Major Barrier in Astrophysics

Machine learning now lets astrophysicists simulate vast, complex universes in a thousandth of the time required by traditional techniques.

Simulations of a 100-million-light-year-square region of space. The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled that low-resolution model to create the high-resolution simulation at right, which captures the same details as a conventional high-resolution model (middle) while requiring significantly fewer computational resources. Image Credit: Y. Li et al./Proceedings of the National Academy of Sciences, 2021.

The new technique could usher in an era of high-resolution cosmological simulations, its developers report in a study published online on May 4, 2021, in the Proceedings of the National Academy of Sciences.

At the moment, constraints on computation time usually mean we cannot simulate the universe at both high resolution and large volume. With our new technique, it’s possible to have both efficiently. In the future, these AI-based methods will become the norm for certain applications.

Yin Li, Study Lead Author and Astrophysicist, Flatiron Institute

The new technique, designed by Li and his collaborators, feeds a machine learning algorithm matched versions of a small region of space at both low and high resolution. The algorithm learns how to upscale the low-resolution versions to match the detail found in the high-resolution models.

Once the code is trained, it can take full-scale low-resolution models and generate “super-resolution” simulations containing up to 512 times as many particles. The process is akin to taking a blurry photograph and adding back the missing details, rendering it sharp and clear. This upscaling saves a considerable amount of time.
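
To put that factor in perspective: 512 times as many particles corresponds to an eightfold refinement along each of the three spatial dimensions, since 8³ = 512. The short Python sketch below illustrates the dimensional arithmetic; the per-dimension factor of 8 is inferred from the reported 512, and the trilinear interpolation is merely a stand-in for the trained network.

```python
import torch
import torch.nn.functional as F

# Refining a 3D field by 8x along each dimension yields 8**3 = 512x
# as many values, matching the reported particle increase. The
# interpolation here is a placeholder: the actual method uses a
# trained neural network, not trilinear interpolation.
coarse = torch.randn(1, 1, 16, 16, 16)     # toy low-resolution 3D field
fine = F.interpolate(coarse, scale_factor=8, mode="trilinear",
                     align_corners=False)  # shape (1, 1, 128, 128, 128)
print(fine.numel() // coarse.numel())      # 512
```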

For a region of the universe about 500 million light-years across containing 134 million particles, existing techniques would need 560 hours to produce a high-resolution simulation on a single processing core. With the new technique, the researchers need just 36 minutes.
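
Those two numbers imply a speedup of roughly 900-fold, in line with the “thousandth of the time” figure above. A quick check of the arithmetic, using only the timings reported in the study:

```python
# Back-of-the-envelope speedup implied by the reported timings:
# 560 hours on a single core versus 36 minutes with the new method.
conventional_minutes = 560 * 60  # 33,600 minutes
new_method_minutes = 36
print(conventional_minutes / new_method_minutes)  # ~933x faster
```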

The results were even more striking with more particles. For a universe 1,000 times as large, with 134 billion particles, the researchers' new technique took 16 hours on a single graphics processing unit. Present-day techniques would take so long that running them would not even be worthwhile without dedicated supercomputing resources, Li added.

Li is a joint research fellow at the Flatiron Institute's Center for Computational Astrophysics and Center for Computational Mathematics. He co-authored the study with Yueying Ni, Rupert Croft, and Tiziana Di Matteo of Carnegie Mellon University; Simeon Bird of the University of California, Riverside; and Yu Feng of the University of California, Berkeley.

Cosmological simulations are indispensable for astrophysics. Scientists use them to predict how the universe would appear in various scenarios, such as if the dark energy pulling the universe apart varied over time.

Telescope observations can then confirm whether the simulations' predictions match reality. Making testable predictions requires running simulations many times, so faster modeling would be a major advantage for the field.

Decreasing the time it takes to perform cosmological simulations “holds the potential of providing major advances in numerical cosmology and astrophysics. Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes,” stated Di Matteo.

So far, the new simulations consider only dark matter and the force of gravity. While that may seem like an oversimplification, gravity is by far the universe's dominant force at large scales, and dark matter makes up about 85% of all the “stuff” in the cosmos. The particles in the simulation are not literally dark matter particles; rather, they serve as trackers that show how bits of dark matter move through the universe.
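
To make the idea of gravity-driven tracer particles concrete, here is a deliberately tiny direct-summation N-body sketch in Python. It is not the production cosmology code used in the study (real simulations use expanding coordinates, periodic boxes, and fast approximate force solvers), and every value in it is an arbitrary toy number.

```python
import numpy as np

# Toy N-body integrator: tracer particles moved by gravity alone,
# in units where G and the particle mass are 1. A direct O(N^2)
# force sum like this is only illustrative; cosmological codes use
# far faster approximate solvers.
rng = np.random.default_rng(0)
n, dt, softening = 64, 0.01, 0.1
pos = rng.uniform(-1.0, 1.0, size=(n, 3))
vel = np.zeros((n, 3))

def accelerations(pos):
    diff = pos[None, :, :] - pos[:, None, :]       # pairwise separations
    dist2 = (diff ** 2).sum(axis=-1) + softening ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                  # no self-force
    return (diff * inv_r3[:, :, None]).sum(axis=1)

for _ in range(100):                               # leapfrog (kick-drift-kick)
    vel += 0.5 * dt * accelerations(pos)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos)
```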

The researchers' code uses neural networks to predict how gravity would move dark matter around over time. The networks take training data, run calculations on it, and compare the results to the expected outcome. With further training, the networks adapt and become more accurate.

The team used a particular approach called a generative adversarial network, which pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution versions; the other tries to distinguish those simulations from ones made by conventional techniques.

Both networks improve over time until, eventually, the simulation generator wins out, producing fast simulations that look just like the slow conventional ones.
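
For readers who want to see the shape of that tug-of-war in code, below is a generic conditional-GAN training loop in PyTorch. It is emphatically not the authors' model: the architecture, layer sizes, data shapes, and training schedule are all illustrative assumptions, with random tensors standing in for matched pairs of low- and high-resolution simulations.

```python
import torch
import torch.nn as nn

# Generator: upscales a low-resolution 3D field by 2x per dimension.
# (The study's 512x particle gain corresponds to 8x per dimension;
# 2x here just keeps the toy example small.)
generator = nn.Sequential(
    nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False),
    nn.Conv3d(16, 1, 3, padding=1),
)
# Discriminator: scores whether a high-resolution field looks "real".
discriminator = nn.Sequential(
    nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# Random stand-ins for matched low/high-resolution training pairs.
low_res = torch.randn(4, 1, 8, 8, 8)
high_res = torch.randn(4, 1, 16, 16, 16)
real, fake_label = torch.ones(4, 1), torch.zeros(4, 1)

for step in range(100):
    # 1) Teach the discriminator to separate real from generated fields.
    fake = generator(low_res).detach()
    d_loss = bce(discriminator(high_res), real) + \
             bce(discriminator(fake), fake_label)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Teach the generator to fool the discriminator.
    g_loss = bce(discriminator(generator(low_res)), real)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In the real pipeline the networks operate on particle fields from actual simulations rather than random tensors, and training runs far longer; the sketch only shows the adversarial structure.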

We couldn’t get it to work for two years, and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn’t tell which one was ‘real’ and which one was ‘fake’.

Yin Li, Study Lead Author and Astrophysicist, Flatiron Institute

Despite being trained only on small regions of space, the neural networks accurately replicated the large-scale structures that emerge only in enormous simulations.

The simulations do not capture everything, however. Because they include only gravity and dark matter, smaller-scale phenomena such as star formation, supernovae, and the effects of black holes are left out.

The team plans to extend its methods to include the forces responsible for these phenomena, and to run its neural networks “on the fly” alongside conventional simulations to improve accuracy.

We don’t know exactly how to do that yet, but we’re making progress.

Yin Li, Study Lead Author and Astrophysicist, Flatiron Institute

Journal Reference:

Li, Y., et al. (2021) AI-assisted superresolution cosmological simulations. Proceedings of the National Academy of Sciences. https://doi.org/10.1073/pnas.2022038118
