Credit: Colby Earles, ORNL
At the heart of some of the smallest and densest stars in the universe lies nuclear matter that might exist in never-before-observed exotic phases. Neutron stars, which form when the cores of massive stars collapse in a luminous supernova explosion, are thought to contain matter at energies greater than what can be achieved in particle accelerator experiments, such as the ones at the Large Hadron Collider and the Relativistic Heavy Ion Collider.
Although scientists cannot recreate these extreme conditions on Earth, they can use neutron stars as ready-made laboratories to better understand exotic matter. Simulating neutron stars, many of which are only about 12.5 miles in diameter yet pack around 1.4 to 2 times the mass of our sun, can provide insight into the matter that might exist in their interiors and give clues as to how it behaves at such densities.
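To put those numbers in perspective, here is a back-of-the-envelope sketch (using only the figures quoted above and assuming a uniform sphere, which is of course a simplification) of the mean density such an object implies:

```python
import math

# Figures quoted above (illustrative values only)
solar_mass_kg = 1.989e30          # mass of the sun in kilograms
mass_kg = 1.4 * solar_mass_kg     # a ~1.4 solar-mass neutron star
diameter_m = 12.5 * 1609.34       # 12.5 miles converted to meters
radius_m = diameter_m / 2.0

# Mean density of a uniform sphere: rho = M / ((4/3) * pi * R^3)
volume_m3 = (4.0 / 3.0) * math.pi * radius_m**3
density_kg_m3 = mass_kg / volume_m3

print(f"Mean density: {density_kg_m3:.2e} kg/m^3")
# Roughly 6.5e17 kg/m^3 -- hundreds of trillions of times denser than water.
```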
A team of nuclear astrophysicists led by Michael Zingale at Stony Brook University is using the Oak Ridge Leadership Computing Facility's (OLCF's) IBM AC922 Summit, the nation's fastest supercomputer, to model a neutron star phenomenon called an X-ray burst—a thermonuclear explosion that occurs on the surface of a neutron star when its gravitational field pulls a sufficiently large amount of matter off a nearby star. Now, the team has modeled a 2D X-ray burst flame moving across the surface of a neutron star to determine how the flame acts under different conditions. Simulating this astrophysical phenomenon provides scientists with data that can help them better measure the radii of neutron stars, a value that is crucial to studying the physics in the interior of neutron stars. The results were published in the Astrophysical Journal.
"Astronomers can use X-ray bursts to measure the radius of a neutron star, which is a challenge because it's so small," Zingale said. "If we know the radius, we can determine a neutron star's properties and understand the matter that lives at its center. Our simulations will help connect the physics of the X-ray burst flame burning to observations."
The group found that different initial models and physics led to different results. In the next phase of the project, the team plans to run one large 3D simulation based on the results from the study to obtain a more accurate picture of the X-ray burst phenomenon.
Switching physics
Neutron star simulations require a massive amount of physics input and therefore a massive amount of computing power. Even on Summit, researchers can only afford to model a small portion of the neutron star surface.
To accurately capture the flame's behavior, Zingale's team used Summit to model the flame for various properties of the underlying neutron star. The simulations were completed under an allocation of computing time awarded through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The team varied surface temperatures and rotation rates, using these as proxies for different accretion rates—or how quickly the star gains mass as it pulls in additional matter from a nearby star.
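The published paper spells out the exact runs; purely as an illustration of how such a parameter study can be organized, the sketch below loops over a grid of hypothetical crust temperatures and rotation periods. The values and the launch_simulation helper are placeholders, not the team's actual setup.

```python
from itertools import product

# Hypothetical parameter values -- placeholders, not the values used in the study
crust_temperatures_K = [1e8, 3e8]        # cooler vs. hotter underlying crust
rotation_periods_ms = [1.0, 2.0, 5.0]    # spin periods standing in for accretion rates

def launch_simulation(temperature_K, period_ms):
    """Placeholder for submitting one 2D flame run with the given conditions."""
    print(f"Queueing run: T_crust = {temperature_K:.1e} K, period = {period_ms} ms")

# One run per combination of crust temperature and rotation rate
for temperature_K, period_ms in product(crust_temperatures_K, rotation_periods_ms):
    launch_simulation(temperature_K, period_ms)
```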
Alice Harpole, a postdoctoral researcher at Stony Brook University and lead author on the paper, suggested that the team model a hotter crust, leading to unexpected results.
"One of the most exciting results from this project was what we saw when we varied the temperature of the crust in our simulations," Harpole said. "In our previous work, we used a cooler crust. I thought it might make a difference to use a hotter crust, but actually seeing the difference that the increased temperature produced was very interesting."
Massive computing, more complexity
The team modeled the X-ray burst flame phenomenon on the OLCF's Summit at the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL). Nicole Ford, an intern in the Science Undergraduate Laboratory Internship Program at Lawrence Berkeley National Laboratory (LBNL), ran complementary simulations on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC). The OLCF and NERSC are DOE Office of Science user facilities located at ORNL and LBNL, respectively.
With simulations of 9,216 grid cells in the horizontal direction and 1,536 cells in the vertical direction, the effort required a massive amount of computing power. After the team completed the simulations, team members tapped the OLCF's Rhea system to analyze and plot their results.
On Summit, the team used the Castro code—a simulation code for explosive astrophysical phenomena built on the Adaptive Mesh Refinement for Exascale (AMReX) library—which allowed team members to achieve varying resolutions in different parts of the grid. AMReX is one of the libraries being developed by the Exascale Computing Project, an effort to adapt scientific applications to run on DOE's upcoming exascale systems, including the OLCF's Frontier. Exascale systems will be capable of computing in the exaflops range, or more than 10¹⁸ calculations per second.
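Castro's mesh handling lives in AMReX's C++ framework, so the snippet below is only a conceptual sketch, in Python, of the idea behind adaptive mesh refinement: flag the cells where the solution changes rapidly (here, a steep gradient in a toy temperature field) and refine only those, rather than the whole domain. The field, threshold, and refinement criterion are illustrative assumptions, not Castro's actual tagging logic.

```python
import numpy as np

def flag_cells_for_refinement(field, threshold):
    """Return a boolean mask marking cells whose local gradient exceeds threshold.

    This mirrors the basic idea of adaptive mesh refinement: concentrate
    resolution where the solution varies rapidly (e.g., at the burning front)
    instead of refining the entire grid uniformly.
    """
    grad_y, grad_x = np.gradient(field)          # finite-difference gradients
    return np.hypot(grad_x, grad_y) > threshold  # gradient magnitude test

# Toy 2D "temperature" field with a sharp front -- illustration only,
# not the team's initial conditions or resolution.
ny, nx = 96, 144
x = np.linspace(0.0, 1.0, nx)
temperature = np.tile(1.0 / (1.0 + np.exp(-80.0 * (x - 0.5))), (ny, 1))

refine = flag_cells_for_refinement(temperature, threshold=0.05)
print(f"{refine.sum()} of {refine.size} cells flagged for a finer grid level")
```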
AMReX provides a framework for parallelization on supercomputers, but Castro wasn't always capable of taking advantage of the GPUs that make Summit so attractive for scientific research. The team attended OLCF-hosted hackathons at Brookhaven National Laboratory and ORNL to get help with porting the code to Summit's GPUs.
"The hackathons were incredibly useful to us in understanding how we could leverage Summit's GPUs for this effort," Zingale said. "When we transitioned from CPUs to GPUs, our code ran 10 times faster. This allowed us to make less approximations and perform more physically realistic and longer simulations."
The team said that the upcoming 3D simulation they plan to run will not only require GPUs—it will eat up nearly all of the team's INCITE time for the entire year.
"We need to get every ounce of performance we can," Zingale said. "Luckily, we have learned from these 2D simulations what we need to do for our 3D simulation, so we are prepared for our next big endeavor."
by Rachel McDowell, Oak Ridge National Laboratory
Provided by Oak Ridge National Laboratory