# Presentation

A Framework for Massively Parallel Simulation of FSI Problems

Session: PhD Forum Posters

Event Type: PhD Forum

Pre-Recorded

Time: Monday, June 22nd, 7pm - 7:37pm

Location: Applaus

Description: We present a massively parallel partitioned simulation environment for fluid-structure interaction (FSI) problems. In a partitioned approach, the FSI problem is divided into two sub-problems, each of which is solved individually. A coupling method must be used to account for the interaction of the domains. This approach provides great flexibility and robustness, as it allows using well-adapted single-physics solvers and numerical methods for each sub-problem. However, it introduces new challenges that must be addressed carefully to achieve stable and accurate numerical results and to maintain the scalability of the solvers for the coupled problem. For instance, a coupling tool is required to handle the inter-solver data communication, the boundary data mapping, and the acceleration of the equation coupling at the interface. In addition, an inter-solver load-balancing technique is necessary to distribute the computational resources (e.g., CPUs) efficiently among the solvers.

In this work, the fluid and structural equations are solved using two instances of TermoFluids [1], an in-house parallel solver. For the fluid flow, the Arbitrary Lagrangian-Eulerian form of the Navier-Stokes equations is solved on an unstructured grid using a finite-volume discretization and second-order numerical schemes. For the structural problem, the nonlinear elastodynamics equations are solved on an unstructured grid, likewise using a finite-volume method and second-order numerical schemes. A semi-implicit FSI coupling strategy is followed, which segregates the fluid pressure term and couples it strongly to the structure, while the remaining fluid terms and the geometrical nonlinearities are only loosely coupled.
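The structure of such a semi-implicit time step can be sketched as follows. This is a toy scalar sketch, not the actual TermoFluids implementation: all functions below are hypothetical stand-ins, chosen only to show which parts are iterated to convergence within a step (pressure and structure) and which are updated just once per step (the remaining fluid terms):

```python
# Schematic semi-implicit partitioned FSI time step (toy scalar model).
# The fluid pressure is strongly coupled to the structure via an inner
# iteration, while the convective/viscous fluid terms are loosely
# coupled, i.e. evaluated only once per time step.

def fluid_explicit_terms(u, dt):
    # stand-in for the loosely coupled convective/viscous update
    return u + dt * (-5.0 * u)

def fluid_pressure(u_star, d):
    # stand-in pressure solve from tentative velocity and interface displacement
    return 0.8 * u_star - 0.3 * d

def structure_solve(p, dt):
    # stand-in structural response to the interface pressure load
    return 0.5 * dt * p

def semi_implicit_step(u, d, dt, tol=1e-10, max_iters=50):
    u_star = fluid_explicit_terms(u, dt)   # loosely coupled: once per step
    p = fluid_pressure(u_star, d)
    for _ in range(max_iters):             # strong pressure-structure coupling
        p = fluid_pressure(u_star, d)
        d_new = structure_solve(p, dt)
        converged = abs(d_new - d) < tol
        d = d_new
        if converged:
            break
    return u_star + dt * p, d

u, d = 1.0, 0.0
for _ in range(10):
    u, d = semi_implicit_step(u, d, dt=0.01)
print(round(u, 6), round(d, 6))
```

In the real solver the inner loop acts on interface fields rather than scalars, but the separation of loosely and strongly coupled terms is the same.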

For the code coupling, the preCICE library [2] is used. preCICE uses a fully parallel point-to-point communication scheme to exchange data between the solvers. It is also equipped with robust and advanced quasi-Newton methods to accelerate the coupling iterations between the solvers.
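The benefit of quasi-Newton coupling acceleration can be illustrated with a minimal scalar sketch. The fixed-point map `H` below is a hypothetical stand-in for one fluid-then-structure pass; preCICE's actual IQN-ILS method generalizes this secant idea to interface vectors via a least-squares multi-secant approximation:

```python
# Toy scalar illustration: solve the partitioned fixed-point problem
# d = H(d) with interface residual R(d) = H(d) - d, comparing plain
# underrelaxation against a secant-type quasi-Newton update.

def H(d):
    # stand-in for one fluid-then-structure pass (fixed point at d = 10)
    return 0.9 * d + 1.0

def plain_underrelaxation(d0, omega=0.5, tol=1e-10):
    d, iters = d0, 0
    while abs(H(d) - d) > tol:
        d = d + omega * (H(d) - d)   # constant relaxation factor
        iters += 1
    return d, iters

def quasi_newton(d0, tol=1e-10):
    d_prev, r_prev = d0, H(d0) - d0
    d = d0 + 0.1 * r_prev            # one relaxed step to build a secant
    iters = 1
    while abs(H(d) - d) > tol:
        r = H(d) - d
        slope = (r - r_prev) / (d - d_prev)  # secant estimate of dR/dd
        d_prev, r_prev = d, r
        d = d - r / slope                    # quasi-Newton update
        iters += 1
    return d, iters

print(plain_underrelaxation(0.0))   # converges slowly, many iterations
print(quasi_newton(0.0))            # converges in a handful of iterations
```

For this linear toy map the secant update is exact after one correction; for real FSI interfaces the multi-secant least-squares variant retains much of this advantage over plain underrelaxation.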

In addition, a machine-learning-based scheme is used for inter-solver load balancing. We use machine learning to train a performance model for each solver, which is later used in an integer optimization problem to calculate the optimal core distribution between the solvers [3]. This method minimizes the solvers' waiting time and can significantly increase the performance of the framework.
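The core-distribution step can be sketched as follows. The timing models below are hypothetical analytic curves standing in for the machine-learned per-solver performance models of [3]; the optimization simply picks the split that minimizes the slower solver's time per step, which is what drives the inter-solver waiting time:

```python
# Toy model-based inter-solver load balancing: given N total cores,
# choose the fluid/structure core split that minimizes the time of
# the slower solver (and hence the waiting time of the faster one).

def fluid_time(cores):
    # hypothetical fluid-solver timing model: large work, good scaling
    return 5.0 + 3000.0 / cores

def structure_time(cores):
    # hypothetical structure-solver timing model: smaller, scales less well
    return 2.0 + 400.0 / cores

def best_split(total_cores):
    best = None
    for n_fluid in range(1, total_cores):
        n_struct = total_cores - n_fluid
        step_time = max(fluid_time(n_fluid), structure_time(n_struct))
        if best is None or step_time < best[2]:
            best = (n_fluid, n_struct, step_time)
    return best

n_f, n_s, t = best_split(128)
print(n_f, n_s, round(t, 3))
```

The actual method solves an integer optimization problem over the trained models; this exhaustive search over one split is only meant to show the objective being minimized.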

In summary, our main contributions are combining the semi-implicit coupling strategy with an advanced quasi-Newton method, adaptive inter-code load balancing, and the demonstration of the efficiency of these two aspects for a large-scale real-world application. To demonstrate the parallel performance of the coupled framework, strong-scalability measurements are presented for a patient-specific aorta test case. The measurements show very good scalability up to 11,520 cores, with a parallel efficiency of 85%.

References:

[1] O. Lehmkuhl et al., "A new parallel unstructured CFD code for the simulation of turbulent industrial problems on low cost PC cluster," in: Parallel Computational Fluid Dynamics 2007, Springer, 2009, pp. 275-282.

[2] H.-J. Bungartz et al., "preCICE - a fully parallel library for multi-physics surface coupling," Computers & Fluids 141 (2016).

[3] A. Totounferoush et al., "A new load balancing approach for coupled multi-physics simulations," in: 2019 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), IEEE, 2019.

