Simulation of the Gravitational Mixing on GPU

Gravitational mixing induced by the Rayleigh-Taylor instability arises in a system of two fluids or gases of different densities when the acceleration is directed from the denser material toward the less dense one. In this case, the amplitude of small perturbations of the contact boundary grows over time, drawing new flow regions into the mixing (Rayleigh, Proc. of the London Math. Soc., 14, 1883; Taylor G I, Proc. of the R. Soc. of London, A201, 1950). Numerical calculation of such problems requires methods that can fully describe the discontinuous nature of the flow variables. One such method is Godunov's method (Godunov S K, Mat. Sb. (N.S.), 47 (89), 3, 1959), which is widely used and is based on solving the Riemann problem to compute the fluxes through the cell edges. At the same time, the exact solution of the Riemann problem is known to be quite expensive in terms of computing resources. However, on massively parallel architectures such as GPUs, significant acceleration can be achieved owing to the large number of computational processes, which allows the calculations to be performed much faster. As part of this work, two versions of a parallel algorithm for calculating the mixing were implemented, and their efficiency and speedup were estimated.
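The abstract does not include code; as a minimal illustrative sketch of the Godunov flux-based update it describes (not the authors' implementation, which solves the gas-dynamics equations on GPU), the scheme can be shown for the simplest case, linear advection, where the Riemann problem at each cell edge has the exact upwind solution:

```python
import numpy as np

def godunov_advection(u0, a, dx, dt, steps):
    """First-order Godunov scheme for u_t + a*u_x = 0 with periodic boundaries.

    For linear advection, the Riemann problem at each cell edge has the
    exact solution: the interface flux is a*u taken from the upwind side.
    """
    u = u0.copy()
    c = a * dt / dx  # CFL number; stability requires |c| <= 1
    assert abs(c) <= 1.0
    for _ in range(steps):
        # Exact Riemann solution at edge i-1/2: take the upwind state.
        left = np.roll(u, 1)                  # state left of edge i-1/2
        flux = a * (left if a > 0 else u)     # flux at edge i-1/2
        # Conservative update: u_i -= dt/dx * (F_{i+1/2} - F_{i-1/2})
        u -= dt / dx * (np.roll(flux, -1) - flux)
    return u

# Example: a square pulse advected to the right at speed a = 1.
n, a = 200, 1.0
dx = 1.0 / n
dt = 0.5 * dx / abs(a)          # CFL = 0.5
u0 = np.zeros(n)
u0[20:40] = 1.0
u = godunov_advection(u0, a, dx, dt, steps=100)
```

In a GPU version, each interface flux (and each cell update) is independent of the others within a time step, which is what makes this scheme map naturally onto many parallel threads, as the abstract notes.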

About the authors

P A Kuchugov

Keldysh Institute of Applied Mathematics


N D Shuvalov

Moscow Institute of Physics and Technology


A M Kazennov

Moscow Institute of Physics and Technology







Copyright (c) 2014 Kuchugov P.A., Shuvalov N.D., Kazennov A.M.

This work is licensed under a Creative Commons Attribution 4.0 International License.
