Professors Fabio Nobile and Riccardo Rossi explain why computer simulations are commonly used today in many fields of physics and engineering
The exceptional increase in computing power over the past decades has opened up unprecedented possibilities for using computer models and computer simulations as quantitative tools for the prediction and design of complex systems.
Computer simulations are commonly used nowadays in many fields of physics and engineering: for example, in the design of new molecules or new materials, to improve the aerodynamics of vehicles, to monitor the health of civil structures, or for weather forecasting, to name just a few applications.
These simulations are based on mathematical equations derived from physical principles, such as the conservation of mass, momentum, and energy, and they find only approximate solutions to these equations, with an accuracy that depends on the spatial and temporal resolution at which the underlying physical quantities are described. Modern supercomputers allow us to achieve resolutions unimaginable just a few years ago. With that, however, come challenges in fully harnessing the computing power of these incredible machines. Another dimension of computational science that has emerged over the past two decades is the need to include the effects of uncertainties in computer models and to explore what-if scenarios in order to make predictions and designs more robust.
This is precisely the purpose of the European H2020 ExaQUte (EXAscale Quantification of Uncertainties for Technology and Science Simulation) project, which involves several institutions across Europe. (1) The project targets applications in civil engineering and aims to quantify the effects of wind action on the mechanical response of high-rise buildings. Geometric parameters such as taper and torsion greatly influence the aerodynamic performance of buildings, but architectural considerations are also paramount for the “success” of a building, which means that a consensus must be reached between aesthetic and technical needs.
Simulation tools are extremely useful for modeling the randomness of wind conditions (direction, intensity of wind gusts, etc.) and for improving the shape of buildings, making them more resistant to extreme wind conditions.
Uncertainty quantification and robust design
Simulating the flow around a high-rise building at high resolution under strong wind conditions has a very high computational cost, on the order of hours of computation on a supercomputer. To properly quantify the uncertainty of the loads on the structure due to the variability of wind conditions, it is necessary to run many simulations exploring many different wind scenarios, an approach commonly referred to as “Monte Carlo”.
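As a toy illustration of the Monte Carlo approach, the sketch below averages a load over many randomly drawn wind scenarios; the sample mean converges to the expected load as the number of scenarios grows. Everything here is an invented stand-in, not the project's code: in particular, `simulate_load` replaces the expensive flow simulation with a cheap formula.

```python
import random

def simulate_load(wind_speed, wind_direction):
    # Hypothetical stand-in for an expensive flow simulation:
    # returns a scalar load on the structure for one wind scenario.
    return wind_speed ** 2 * (1.0 + 0.1 * abs(wind_direction))

def monte_carlo_load(n_samples, seed=0):
    # Draw many random wind scenarios and average the resulting loads.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        speed = rng.gauss(20.0, 5.0)        # random gust intensity (m/s)
        direction = rng.uniform(-1.0, 1.0)  # random direction parameter
        total += simulate_load(speed, direction)
    return total / n_samples  # sample mean of the load
```

In a real study each call to the simulator would take hours, which is why the number of samples, not the arithmetic, dominates the cost.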
A comprehensive high-resolution uncertainty quantification analysis, let alone a robust design optimization process, is out of reach even with modern computing resources and remains a major challenge.
The key technology exploited in the ExaQUte project to meet this challenge is to optimally combine simulations performed at different resolutions. The idea is that low-resolution simulations, although not precise enough for design purposes, can still provide useful information about the uncertainties in the load. Most of the uncertainty can therefore be captured by “cheap” low-resolution simulations, with the result then corrected by running only a few “expensive” simulations at higher resolutions. This approach is known as Multilevel or Multifidelity Monte Carlo, and it was realized by developing an algorithm that gradually “learns”, from the results of already completed simulations, the optimal resolution levels to use as well as the optimal allocation of resources between those levels. In particular, low-resolution simulations can be obtained by aggressively reducing the spatial resolution in “uninteresting” regions (mesh adaptivity), by shortening the time horizon of the simulation, or even by switching to simpler mathematical models that neglect certain physical processes.
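The coarse-plus-correction idea can be made concrete with a minimal sketch. The solver below is an invented stand-in whose bias halves at each level; the numbers and function names are illustrative assumptions, not the project's actual algorithm, which also learns the sample allocation adaptively.

```python
import random

def simulate(level, scenario):
    # Hypothetical stand-in for a flow solver: `level` sets the
    # resolution, `scenario` encodes one random wind condition.
    # Higher levels have smaller discretization bias but would,
    # in reality, be far more expensive to run.
    exact = 100.0 * (1.0 + 0.3 * scenario)  # "true" load for this scenario
    bias = 20.0 * 0.5 ** level              # error halves with each level
    return exact + bias

def mlmc_estimate(samples_per_level, seed=0):
    # Multilevel Monte Carlo telescoping sum:
    #   E[Q_L] = E[Q_0] + sum over l of E[Q_l - Q_(l-1)].
    # Many samples go to the cheap coarse level; the fine-level
    # corrections need only a few samples because evaluating the
    # SAME scenario on both levels makes Q_l - Q_(l-1) have small
    # variance.
    rng = random.Random(seed)
    estimate = 0.0
    for level, n_samples in enumerate(samples_per_level):
        acc = 0.0
        for _ in range(n_samples):
            s = rng.random()  # one random wind scenario
            if level == 0:
                acc += simulate(0, s)
            else:
                acc += simulate(level, s) - simulate(level - 1, s)
        estimate += acc / n_samples
    return estimate
```

A decreasing allocation such as `mlmc_estimate([2000, 200, 20])` reflects the intended economics: thousands of cheap coarse runs, a handful of expensive fine ones.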
Towards exascale computing
Modern supercomputers are collections of hundreds of thousands of interconnected CPUs (central processing units) and GPUs (graphics processing units) capable of performing up to 10^15 operations per second (petascale computing). The next generation of supercomputers, expected in the next few years, will be up to 1,000 times faster (exascale computing). This is a great opportunity for computational models and simulations and, at the same time, a great challenge, since appropriate algorithms must be developed that can efficiently exploit the available computational resources (a field called “high performance computing”, or HPC). The challenge lies in the difficulty of coordinating the calculations carried out by the different compute units and of preventing some of them from sitting idle while waiting for others to complete their tasks.
A major effort in the ExaQUte project has been devoted to developing algorithms that fully exploit the parallelism in multilevel/multifidelity Monte Carlo simulations for robust wind engineering design, preparing them for the era of exascale computing.
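The reason Monte Carlo parallelizes so well is that each scenario is an independent task, so no compute unit waits on another. At single-node scale this can be sketched with a thread pool; this is a toy Python stand-in (real exascale workflows rely on distributed tasking and message passing, not a thread pool), and `run_scenario` is an invented placeholder for one simulation.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_scenario(seed):
    # Stand-in for one independent wind simulation; each task
    # uses its own seeded generator so workers never share state.
    rng = random.Random(seed)
    speed = rng.gauss(20.0, 5.0)  # random gust intensity (m/s)
    return speed ** 2             # hypothetical load for this scenario

def parallel_monte_carlo(n_samples, n_workers=4):
    # Monte Carlo samples are "embarrassingly parallel": the pool
    # hands each worker an independent scenario, so adding workers
    # speeds up the sweep without any coordination between tasks.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        loads = list(pool.map(run_scenario, range(n_samples)))
    return sum(loads) / n_samples
```

Because each scenario is seeded by its index, the estimate is identical regardless of how many workers run it, which makes the parallel code easy to verify against a serial run.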
Opportunities and challenges in wind engineering
The rapid increase in computing power, combined with recent and future advances in computational methods and supercomputing for wind simulation and uncertainty quantification, offers unparalleled capabilities, with the potential to partially replace expensive wind tunnel experiments and to improve aerodynamic prediction and wind engineering design.
(1) ExaQUte (EXAscale Quantification of Uncertainties for Technology and Science Simulation), grant agreement 800898, http://exaqute.eu/
ExaQUte has received funding from the European Union’s HORIZON 2020 research program under grant agreement n° 800898.
Attention: This is a commercial profile
© 2019. This work is licensed under CC BY-NC-ND.