During the development of blood-contacting medical devices such as heart support pumps, the estimation of their blood damage potential is a key criterion. Reynolds-averaged Navier–Stokes (RANS) models are often used as the basis for modeling blood damage in turbulent flows. To predict blood damage by turbulence-induced stresses that are not resolved in RANS, a stress formulation that represents the corresponding scales is required. Here, we compare two commonly employed stress formulations: a scalar stress representation that uses Reynolds stresses as a surrogate for unresolved fluid stresses, and an effective stress formulation based on energy dissipation.
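The two formulations can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the von Mises-type tensor reduction, the dissipation relation sigma = sqrt(rho * mu * eps), and the blood property values are assumptions chosen as typical examples from the blood damage literature.

```python
import numpy as np

# Typical blood properties (assumed values, not from the source)
RHO = 1050.0   # density [kg/m^3]
MU = 3.5e-3    # dynamic viscosity [Pa s]

def scalar_stress(tau):
    """Von Mises-type scalar reduction of a 3x3 stress tensor [Pa],
    e.g. the Reynolds stress tensor used as a surrogate for
    unresolved fluid stresses."""
    s11, s22, s33 = tau[0, 0], tau[1, 1], tau[2, 2]
    s12, s13, s23 = tau[0, 1], tau[0, 2], tau[1, 2]
    return np.sqrt(
        (1.0 / 6.0) * ((s11 - s22) ** 2 + (s22 - s33) ** 2 + (s33 - s11) ** 2)
        + s12 ** 2 + s13 ** 2 + s23 ** 2
    )

def effective_stress(eps, rho=RHO, mu=MU):
    """Dissipation-based effective stress [Pa] from the turbulent
    dissipation rate eps [m^2/s^3]: sigma = sqrt(rho * mu * eps)."""
    return np.sqrt(rho * mu * eps)

# Example: a pure-shear stress state with one off-diagonal component
tau = np.zeros((3, 3))
tau[0, 1] = 1.0          # 1 Pa shear
print(scalar_stress(tau))       # -> 1.0
print(effective_stress(100.0))  # effective stress at eps = 100 m^2/s^3
```

In a RANS post-processing step, `eps` would be taken from the closure model's dissipation field, while the Reynolds-stress tensor would come from the modeled turbulent stresses; the comparison in this work concerns which of these two reductions better approximates the scale-resolving (LES) reference.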
We conducted simulations of the CentriMag blood pump using unsteady RANS with three different closure models and a Large Eddy Simulation (LES) as reference. We implemented both stress representations in all models and compared the resulting total stress distributions in Eulerian and Lagrangian frameworks. The Reynolds-stress-based approach significantly overestimated the contribution of unresolved stresses in RANS, with differences between closure models reaching several orders of magnitude. With the dissipation-based approach, the total stresses predicted with RANS deviated by about 50% from the LES reference, making it substantially more accurate than the Reynolds-stress-based approach. Compared with considering only resolved stresses, the dissipation-based approach compensates for the underprediction of viscous stresses, particularly in the core flow, and improves the overall accuracy of the total stress estimates.
Our results suggest that the Reynolds-stress-based formulation is unreliable for estimating scalar stresses in RANS simulations, while the dissipation-based approach provides an accuracy improvement over simply neglecting unresolved stresses. We conclude that the dissipation-based inclusion of unresolved stresses should be the preferred choice for blood damage modeling with RANS.