Abstract
Our ability to predict the structure and evolution of stars is in part limited by complex, 3D hydrodynamic processes such as convective boundary mixing. Hydrodynamic simulations help us understand the dynamics of stellar convection and convective boundaries. However, the codes used to compute such simulations are usually tested on extremely simple problems, and the reliability and reproducibility of their predictions for turbulent flows are unclear. We define a test problem involving turbulent convection in a plane-parallel box, which leads to mass entrainment from, and internal-wave generation in, a stably stratified layer. We compare the outputs from the codes FLASH, MUSIC, PPMSTAR, PROMPI, and SLH, which have been widely employed to study hydrodynamic problems in stellar interiors. The convection is dominated by the largest scales that fit into the simulation box. All time-averaged profiles of velocity components, fluctuation amplitudes, and fluxes of enthalpy and kinetic energy are within ≲3σ of the mean of all simulations on a given grid (128³ and 256³ grid cells), where σ describes the statistical variation due to the flow's time dependence. They also agree well with a 512³ reference run. The 128³ and 256³ simulations agree within 9% and 4%, respectively, on the total mass entrained into the convective layer. The entrainment rate appears to be set by the amount of energy that can be converted to work in our setup, and the details of the small-scale flows in the boundary layer seem to be largely irrelevant. Our results lend credence to hydrodynamic simulations of flows in stellar interiors. We provide in electronic form all outputs of our simulations as well as all information needed to reproduce or extend our study.
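The σ-based comparison quoted above can be illustrated with a short, hypothetical analysis sketch. The snippet below is not the published analysis pipeline; the function names, the σ estimate, and the synthetic stand-in data are all illustrative assumptions. It time-averages each code's horizontally averaged profile over many snapshots, estimates σ from the temporal fluctuations, and reports each code's maximum deviation from the cross-code mean in units of σ.

```python
import numpy as np

def time_average(profiles):
    """Time-averaged profile and a simple sigma estimate per grid cell.

    `profiles` has shape (n_snapshots, n_cells). The sigma used here is the
    standard error of the temporal mean; a careful analysis would also account
    for correlations between successive snapshots.
    """
    mean = profiles.mean(axis=0)
    sigma = profiles.std(axis=0, ddof=1) / np.sqrt(profiles.shape[0])
    return mean, sigma

def compare_codes(data):
    """Maximum deviation of each code's mean profile from the cross-code mean, in units of sigma."""
    means, sigmas = {}, {}
    for code, profiles in data.items():
        means[code], sigmas[code] = time_average(profiles)
    ensemble_mean = np.mean(list(means.values()), axis=0)
    return {code: float(np.max(np.abs(means[code] - ensemble_mean) / sigmas[code]))
            for code in data}

# Synthetic stand-in data for two of the five codes (illustration only).
rng = np.random.default_rng(0)
fake = {code: 1.0 + 0.05 * rng.standard_normal((100, 128)) for code in ("FLASH", "SLH")}
print(compare_codes(fake))  # deviations of order a few sigma at most indicate agreement
```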
Original language | English (US) |
---|---|
Article number | A193 |
Journal | Astronomy and Astrophysics |
Volume | 659 |
DOIs | |
State | Published - Mar 1 2022 |
Bibliographical note
Funding Information:
Acknowledgements. P. V. F. E. was supported by the US Department of Energy through the Los Alamos National Laboratory (LANL). LANL is operated by Triad National Security, LLC, for the National Nuclear Security Administration of the US Department of Energy (Contract No. 89233218CNA000001). This work has been assigned a document release number LA-UR-21-25840. R. A., J. H., L. H., G. L., and F. K. R. acknowledge support by the Klaus Tschira Foundation. The work of F. K. R. is supported by the German Research Foundation (DFG) through the grant RO 3676/3-1. This work is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 – 390900948 (the Heidelberg STRUCTURES Excellence Cluster). The authors gratefully acknowledge the Gauss Centre for Supercomputing e.V. (www.gauss-centre.eu) for funding this project by providing computing time through the John von Neumann Institute for Computing (NIC) on the GCS Supercomputer JUWELS (Jülich Supercomputing Centre 2019) at Jülich Supercomputing Centre (JSC). F. H. acknowledges funding through an NSERC Discovery Grant. This work has benefitted from, and was motivated in part by, research performed as part of the JINA Center for the Evolution of the Elements (NSF Grant No. PHY-1430152). P. R. W. acknowledges support from NSF grants 1814181, 2032010, and PHY-1430152, as well as computing support through NSF's Frontera computing system at TACC. The software used in this work was in part developed by the DOE NNSA-ASC OASCR FLASH Center at the University of Chicago. This work is partly supported by the ERC grant No. 787361-COBOM and the consolidated STFC grant ST/R000395/1. The authors would like to acknowledge the use of the University of Exeter High-Performance Computing (HPC) facility ISCA. This work used the DiRAC Data Intensive service at Leicester, operated by the University of Leicester IT Services, which forms part of the STFC DiRAC HPC Facility. The equipment was funded by BEIS capital funding via STFC capital grants ST/K000373/1 and ST/R002363/1 and STFC DiRAC Operations grant ST/R001014/1. R. H. acknowledges support from the World Premier International Research Centre Initiative (WPI Initiative), MEXT, Japan, and the IReNA AccelNet Network of Networks, supported by the National Science Foundation under Grant No. OISE-1927130. This article is based upon work from the ChETEC COST Action (CA16117), supported by COST (European Cooperation in Science and Technology). This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 101008324 (ChETEC-INFRA). This work used the DiRAC@Durham facility managed by the Institute for Computational Cosmology on behalf of the STFC DiRAC HPC Facility (www.dirac.ac.uk). The equipment was funded by BEIS capital funding via STFC capital grants ST/P002293/1 and ST/R002371/1, Durham University, and STFC operations grant ST/R000832/1. This work also used the DiRAC Data Centric system at Durham University, operated by the Institute for Computational Cosmology on behalf of the STFC DiRAC HPC Facility. This equipment was funded by BIS National e-Infrastructure capital grant ST/K00042X/1, STFC capital grants ST/H008519/1 and ST/K00087X/1, STFC DiRAC Operations grant ST/K003267/1, and Durham University. DiRAC is part of the National e-Infrastructure. S. W. C. acknowledges federal funding from the Australian Research Council through a Future Fellowship (FT160100046) and Discovery Project (DP190102431). This work was supported by computational resources provided by the Australian Government through NCI via the National Computational Merit Allocation Scheme (project ew6), and resources provided by the Pawsey Supercomputing Centre, which is funded by the Australian Government and the Government of Western Australia. We thank the anonymous referee for constructive comments that improved this paper.
Keywords
- Convection
- Hydrodynamics
- Methods: numerical
- Stars: interiors
- Turbulence