On the universality of supersonic turbulence
...or "The world's largest supersonic turbulence simulation"
Federrath, C., 2013, Monthly Notices of the Royal Astronomical Society, 436, 1245 [ ADS link ] [ PDF ]
Compressible turbulence shapes the structure of the interstellar medium of our Galaxy and likely also played an important role during structure formation in the early Universe. The density probability density function (PDF) and the power spectrum of such compressible, supersonic turbulence are the key ingredients for theories of star formation. However, both the PDF and the spectrum are still a matter of debate, because theoretical predictions are limited and simulations of supersonic turbulence require enormous resolutions to capture the inertial-range scaling. To advance our limited knowledge of compressible turbulence, we here present and analyse the world's largest simulations of supersonic turbulence. We compare hydrodynamic models with numerical resolutions of 256^3-4096^3 mesh points and with two distinct driving mechanisms, solenoidal (divergence-free) driving and compressive (curl-free) driving. We find convergence of the density PDF, with compressive driving exhibiting a much wider and more intermittent density distribution than solenoidal driving. Analysing the power spectrum of the turbulence, we find a pure velocity scaling close to Burgers turbulence, with P(v) ~ k^(-2) for both driving modes in our hydrodynamical simulations at Mach 17. The spectrum of the density-weighted velocity rho^(1/3)v, however, does not provide the previously suggested universal scaling for supersonic turbulence. We find that the power spectrum P(rho^(1/3)v) scales with wavenumber as k^(-1.74) for solenoidal driving, close to incompressible Kolmogorov turbulence, k^(-5/3), but is significantly steeper, k^(-2.10), for compressive driving. We show that this is consistent with a recent theoretical model for compressible turbulence that predicts P(rho^(1/3)v) ~ k^(-19/9) in the presence of a strong div(v) component, as is produced by compressive driving and remains remarkably constant throughout the supersonic turbulent cascade.
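The shell-averaged power spectra discussed above can be estimated from gridded simulation output by Fourier-transforming the field and summing its power in spherical wavenumber shells. A minimal sketch in Python/NumPy is given below; the function name, the synthetic random fields, and all parameters are illustrative assumptions, not the analysis pipeline used in the paper:

```python
import numpy as np

def shell_averaged_spectrum(field):
    """Shell-averaged power spectrum P(k) of a 3D field.

    `field` has shape (ncomp, N, N, N); the power of all components
    is summed, then binned into spherical shells of integer |k|.
    """
    N = field.shape[-1]
    power = np.zeros((N, N, N))
    for comp in field:
        fhat = np.fft.fftn(comp) / comp.size   # normalised so sum|fhat|^2 = <comp^2>
        power += np.abs(fhat)**2
    # Integer wavenumbers on the FFT grid and their magnitude
    k1d = np.fft.fftfreq(N) * N
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing='ij')
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)
    kbin = np.rint(kmag).astype(int)
    # Sum power in shells k - 1/2 < |k| <= k + 1/2
    P = np.bincount(kbin.ravel(), weights=power.ravel())
    return np.arange(len(P)), P

# Illustrative synthetic fields (NOT simulation data): a log-normal-like
# density and a random velocity, combined into rho^(1/3) v
N = 64
rng = np.random.default_rng(42)
rho = np.exp(rng.normal(0.0, 0.5, (N, N, N)))
v = rng.normal(0.0, 1.0, (3, N, N, N))
w = rho**(1.0 / 3.0) * v
k, P = shell_averaged_spectrum(w)
```

With the normalisation chosen here, the total power sums to the mean squared field value (Parseval's theorem), which is a useful sanity check; fitting a power law to P(k) over the inertial range would then give the spectral slopes (e.g. -1.74 or -2.10) quoted above.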
This paper has been selected as the SAO/NASA ADS article of the year 2013.
Mach 17 turbulence with 4096^3 grid cells
Shows a gas density slice in Mach 17 turbulence with solenoidal driving (left) and compressive driving (right).
[ Federrath_4096_dens_slice_xy.mp4, 26MB high-res mp4 ]
Shows a vorticity slice in Mach 17 turbulence with solenoidal driving (left) and compressive driving (right).
[ Federrath_4096_vort_slice_xy.mp4, 122MB high-res mp4 ]
Shows the projected gas density in Mach 17 turbulence with solenoidal driving (left) and compressive driving (right).
[ Federrath_4096_coldens_xy.mp4, 29MB high-res mp4 ]
Shows the projected vorticity in Mach 17 turbulence with solenoidal driving (left) and compressive driving (right).
[ Federrath_4096_colvort_xy.mp4, 87MB high-res mp4 ]
Each of the two simulations was run with a numerical grid resolution of 4096^3 points, which is currently the world's largest data set of supersonic turbulence (an equivalent resolution had so far only been reached for incompressible turbulence, by Kaneda et al. 2003). Each simulation ran for about 44,000 time steps on 32,768 compute cores in parallel on the GCS HPC system SuperMUC at the Leibniz Rechenzentrum in Garching, consuming about 7.2 million CPU hours altogether. Each run produced 115TB of data. For access to the simulation data, please contact the author.
C.F. thanks Hussein Aluie, Supratik Banerjee, Sebastien Galtier, Ralf Klessen, Lukas Konstandin, Alexei Kritsuk, Paolo Padoan, Alessandro Romeo, and Rahul Shetty for stimulating discussions. An independent Bayesian analysis of the Fourier spectra by Lukas Konstandin is greatly appreciated. This work was supported by a Discovery Projects Fellowship from the Australian Research Council (grant DP110102191). The simulations consumed computing time on SuperMUC at the Leibniz Rechenzentrum (grant pr32lo) and on JUROPA at the Forschungszentrum Jülich (grant hhd20). The Gauss Centre for Supercomputing is gratefully acknowledged. The software used here was in part developed by the DOE-supported ASC/Alliance Center for Astrophysical Thermonuclear Flashes at the University of Chicago.