The universe has evolved over billions of years, but researchers have developed a way to create a complex simulated universe in less than a day. The technique, published in this week’s Proceedings of the National Academy of Sciences, brings together machine learning, high-performance computing and astrophysics, and will help usher in a new era of high-resolution cosmology simulations.
Cosmological simulations are an essential part of unraveling the many mysteries of the universe, including dark matter and dark energy. But until now, researchers faced a familiar trade-off: simulations could either focus on a small region at high resolution, or they could cover a large volume of the universe at low resolution.
Carnegie Mellon University physics professors Tiziana Di Matteo and Rupert Croft, Flatiron Institute research fellow Yin Li, Carnegie Mellon Ph.D. candidate Yueying Ni, University of California Riverside physics and astronomy professor Simeon Bird and the University of California Berkeley’s Yu Feng addressed this problem by applying a machine learning algorithm based on neural networks that upgrades a simulation from low resolution to super resolution.
“Cosmological simulations need to cover a large volume for cosmological studies, while also requiring high resolution to resolve small-scale galaxy formation physics, which incurs daunting computational challenges. Our technique can be used as a powerful and promising tool to match those two requirements simultaneously by modeling the small-scale galaxy formation physics in large cosmological volumes,” said Ni, who trained the model, built the pipeline for testing and validation, analyzed the data and created visualizations from the data.
The trained code can take full-scale, low-resolution models and generate super-resolution simulations that contain up to 512 times as many particles. For a region of the universe spanning roughly 500 million light-years and containing 134 million particles, existing methods would require 560 hours to churn out a high-resolution simulation using a single processing core. With the new approach, the researchers needed only 36 minutes.
The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large, containing 134 billion particles, the researchers’ new method took 16 hours on a single graphics processing unit. Using existing methods, a simulation of this size and resolution would take a dedicated supercomputer months to complete.
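As a back-of-the-envelope check, the reported figures are internally consistent: 512 times more particles corresponds to refining each of the three spatial dimensions by a factor of 8 (an assumption about how the factor decomposes, since the article only reports the total), and 560 hours versus 36 minutes is a speedup of roughly 900 times. A short illustrative sketch:

```python
# Back-of-the-envelope check of the figures reported in the article
# (illustrative arithmetic only, not the authors' code).
upscale_per_dim = 8                    # assumed 8x refinement per spatial dimension
particle_factor = upscale_per_dim**3   # -> 512 times more particles in 3-D

hours_existing = 560                   # single-core, existing methods
minutes_new = 36                       # single-core, new method
speedup = hours_existing * 60 / minutes_new

print(particle_factor)      # 512
print(round(speedup))       # ~933x faster
```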
Reducing the time it takes to run cosmological simulations “holds the potential of providing major advances in numerical cosmology and astrophysics,” Di Matteo said. “Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes.”
Scientists use cosmological simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations can then confirm whether the simulations’ predictions match reality.
“With our previous simulations, we showed that we could simulate the universe to discover new and interesting physics, but only at small or low-resolution scales,” Croft said. “By incorporating machine learning, the technology is able to catch up with our ideas.”
Di Matteo, Croft and Ni are part of Carnegie Mellon’s National Science Foundation (NSF) Planning Institute for Artificial Intelligence in Physics, which supported this work, and are members of Carnegie Mellon’s McWilliams Center for Cosmology.
“The universe is the largest data set — artificial intelligence is the key to understanding the universe and revealing new physics,” said Scott Dodelson, professor and head of the physics department at Carnegie Mellon University and director of the NSF Planning Institute.
“This research shows how the NSF Planning Institute for Artificial Intelligence will advance physics through artificial intelligence, machine learning, statistics and data science.”
“It is clear that AI is having a major impact on many areas of science, including physics and astronomy,” said James Shank, a program director in NSF’s Division of Physics. “Our AI planning institute program is working to accelerate discovery with AI. This new result is a good example of how AI is transforming cosmology.”
To create their new method, Ni and Li harnessed these fields to create a code that uses neural networks to predict how gravity moves dark matter around over time. The networks take training data, run calculations and compare the results to the expected outcome. With further training, the networks adapt and become more accurate.
The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other: one network generates candidate high-resolution simulations, while the other tries to distinguish them from genuine ones.
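As a rough illustration of the adversarial idea (not the authors' code, and far simpler than their networks), the sketch below pits a toy linear "generator" that upsamples a 1-D low-resolution field against a logistic "discriminator" that tries to tell generated fields from real ones. All sizes, data and learning rates here are invented for illustration:

```python
# Minimal GAN sketch in NumPy: a linear generator upsamples a low-resolution
# 1-D "field" to high resolution, while a logistic discriminator learns to
# tell generated fields from real ones. Illustrative toy example only.
import numpy as np

rng = np.random.default_rng(0)
LOW, HIGH = 4, 8                 # toy low-res and high-res field lengths

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

W_g = rng.normal(scale=0.1, size=(HIGH, LOW))  # generator weights
w_d = rng.normal(scale=0.1, size=HIGH)         # discriminator weights
b_d = 0.0

def generate(low):               # G(z): upsample a low-res field
    return W_g @ low

def discriminate(high):          # D(x): probability the field is "real"
    return sigmoid(w_d @ high + b_d)

lr = 0.05
for step in range(200):
    # A "real" high-res field and its low-res version (average pooling).
    real = rng.normal(size=HIGH)
    low = real.reshape(LOW, 2).mean(axis=1)
    fake = generate(low)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = discriminate(real), discriminate(fake)
    w_d -= lr * (-(1 - d_real) * real + d_fake * fake)
    b_d -= lr * (-(1 - d_real) + d_fake)

    # Generator update: push D(fake) toward 1 (fool the discriminator).
    d_fake = discriminate(fake)
    grad_fake = -(1 - d_fake) * w_d          # dL_G / d(fake)
    W_g -= lr * np.outer(grad_fake, low)     # chain rule through G

print(generate(np.ones(LOW)).shape)          # (8,)
```

In the real method, both networks improve over many rounds of this competition until the generator's high-resolution fields become difficult to distinguish from conventionally simulated ones.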