World's 2nd Fastest Supercomputer Runs Largest-Ever Simulation of Universe

The world's second-fastest supercomputer, Frontier, has reportedly carried out the most extensive simulation of the universe ever created. The project, led by Salman Habib, Director of the Computational Science Division at Argonne National Laboratory, was undertaken to test models of cosmological hydrodynamics. The simulation was developed using the Hardware/Hybrid Accelerated Cosmology Code (HACC), which has been adapted to run on some of the most advanced supercomputers available.

As per an AMD press release, Frontier is capable of up to 1.1 exaFLOPS, or 1.1 quintillion floating-point operations per second. The system integrates 9,472 AMD CPUs and 37,888 AMD GPUs, making it one of the most advanced machines globally. Reports indicate that this capability was surpassed only recently by El Capitan, a supercomputer at Lawrence Livermore National Laboratory that achieved 1.742 exaFLOPS.
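For a sense of scale, the reported figures can be restated with a few lines of arithmetic; the constants below come straight from the numbers above, not from any independent measurement:

```python
# Peak throughput as reported for each system (1 exaFLOPS = 1e18 floating-point ops/sec).
FRONTIER_EXAFLOPS = 1.1
EL_CAPITAN_EXAFLOPS = 1.742

# 1.1 exaFLOPS expressed in raw operations per second.
ops_per_second = FRONTIER_EXAFLOPS * 1e18
print(f"Frontier: {ops_per_second:.2e} ops/sec")  # 1.10e+18, i.e. 1.1 quintillion

# Relative peak throughput of the two machines.
print(f"El Capitan / Frontier: {EL_CAPITAN_EXAFLOPS / FRONTIER_EXAFLOPS:.2f}x")  # ~1.58x
```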

Development of Cosmological Simulations

The HACC code, originally developed over a decade ago, simulates the evolution of the universe. It was previously deployed on less powerful systems such as Titan and Summit, where the simulations focused primarily on gravitational forces. Frontier, however, enabled the inclusion of additional physics such as hot gas, star formation, and black hole activity. Bronson Messer, Director of Science at the Oak Ridge Leadership Computing Facility, remarked in a statement that the inclusion of baryons and dynamic physics marked a significant advancement in the realism of these simulations.
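HACC itself is a large production code whose methods are not reproduced here, but the gravity-only simulations mentioned above are built on the same core idea as any N-body code: compute the gravitational pull on each particle, then step positions and velocities forward in time. The Python below is a purely illustrative sketch of that loop, using a direct-summation force calculation and a leapfrog integrator; all names and values are hypothetical, and real codes like HACC use far faster particle-mesh and tree algorithms at cosmological scales:

```python
import numpy as np

G = 1.0  # gravitational constant in simulation units (illustrative choice)

def accelerations(pos, mass, softening=1e-2):
    """Direct-summation gravitational acceleration on each particle.
    O(N^2) pairwise sums; production codes use much faster approximations."""
    diff = pos[None, :, :] - pos[:, None, :]       # pairwise separation vectors
    dist2 = (diff ** 2).sum(-1) + softening ** 2   # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                  # exclude self-interaction
    return G * (diff * inv_d3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog update, a standard N-body integrator."""
    vel += 0.5 * dt * accelerations(pos, mass)  # half kick
    pos += dt * vel                             # drift
    vel += 0.5 * dt * accelerations(pos, mass)  # half kick
    return pos, vel

# Toy run: 100 random particles evolved for a few steps.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, (100, 3))
vel = np.zeros((100, 3))
mass = np.ones(100) / 100
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=1e-3)
```

Roughly speaking, the hydrodynamic physics Frontier made possible, such as gas, star formation, and black hole activity, is layered on top of a gravity solver of this kind, which is part of why those richer simulations are so much more expensive to compute.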

Applications and Scientific Implications

As per reports, the simulations will be made available to the scientific community to test and refine cosmological models, including questions surrounding dark matter, dark energy, and alternative theories of gravity. The research falls under the Department of Energy's ExaSky project, part of the agency's $1.8 billion exascale computing initiative supporting astrophysical research.

The study's findings are expected to be compared with data from large-scale astronomical surveys, such as those conducted by the Vera C. Rubin Observatory, to identify which models best align with observed phenomena.