Smoothing out our notion of turbulence

Speed read
  • Turbulence is present in most fluids and relevant to many environmental and industrial applications
  • Simulations of turbulent flows generate an excess of data
  • Improved computational speed and reduced cost give researchers freedom to experiment

Turbulence is what makes you clutch your seat and contemplate your mortality 36,000 feet above the ground as the flimsy tin can you’re flying in bounces violently.

Turbulence is also present in ocean currents and the lava discharged from a volcano. It’s active in the smoke from a chimney, oil churning through pipelines and, yes, the air around aircraft wings.

Turbulence in five dimensions. The MinoTauro cluster at the Barcelona Supercomputing Center (BSC) helped scientists model the evolution of eddies containing energy at four different scales. Courtesy José Cardesa.

In fact, the flow of most fluids is turbulent, including the movement of blood in our arteries.

Since turbulence is so prevalent, its study has many industrial and environmental applications.

Scientists model turbulence to improve vehicle design, diagnose atherosclerosis, build safer bridges, and reduce air pollution.

Caused by excess energy that can’t be absorbed by a fluid’s viscosity, turbulent flow is by nature irregular and therefore hard to predict. The speed of the fluid at any point constantly fluctuates in both magnitude and direction, presenting researchers with a long-standing challenge.
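A standard way to make that statement precise, though the article never spells it out, is the Reynolds number, the ratio of inertial to viscous forces in a flow:

$$ \mathrm{Re} = \frac{U L}{\nu} $$

Here U is a characteristic speed, L a characteristic length, and ν the fluid's kinematic viscosity. When Re is small, viscosity soaks up disturbances and the flow stays smooth; when it is large, as it is for air streaming over a wing, the excess energy feeds the irregular motion we call turbulence.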

The reduced cost of technology has made it possible for us to play with these datasets ~ José Cardesa

But new research has validated a theory first proposed in the mid-twentieth century that was too mathematically complex to confirm until recently.

The need to follow fluid lumps in time, space, and scale results in equations that generate too much information: Even now, only a small part of the flow will fit in a computer simulation.

José Cardesa turned to the Barcelona Supercomputing Center (BSC) to create accurate models of turbulent flow. Pictured here is the MareNostrum 4 at the BSC, allocated via the Partnership for Advanced Computing in Europe (PRACE) (http://www.prace-ri.eu). Courtesy BSC.

Scientists use models to make up the missing part. But if those models are wrong, then the simulation is also wrong and no longer represents the flow it’s attempting to simulate.
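To make "making up the missing part" concrete: a standard textbook approach (the article doesn't say which closure any particular code uses, so this is a generic example, not Cardesa's method) is large-eddy simulation, in which the computer resolves only a filtered velocity field and the unresolved small scales enter through a subgrid stress that must be modeled, classically with the Smagorinsky eddy-viscosity closure:

$$ \tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j, \qquad \tau_{ij} - \tfrac{1}{3}\tau_{kk}\,\delta_{ij} \approx -2\,(C_s \Delta)^2\,\lvert\bar{S}\rvert\,\bar{S}_{ij} $$

Here the overbars denote resolved (filtered) quantities, Δ is the filter width, S̄ᵢⱼ is the resolved strain-rate tensor, and Cₛ is an empirical constant. If this modeled stress misrepresents what the missing small scales are doing, the resolved flow drifts away from the real one.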

Recent research by José Cardesa, an aeronautical engineer in Javier Jiménez’s Fluid Dynamics Group at Universidad Politécnica de Madrid (UPM), attempts to gain new insights into the physics behind turbulent flows and reduce the gaps between simulated flows and the flows around real devices.

“A main source of discrepancy between computer-modeled flows and the flow around a real airplane is given by the poor performance of the models,” says Cardesa.

An underlying simplicity

In the 1940s, mathematician Andrey Kolmogorov proposed that turbulence occurs in a cascade.

A turbulent flow contains whirls of many different sizes. According to Kolmogorov, energy is transferred from the large whirls to smaller and more numerous whirls, rather than dispersing to farther distances.
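The quantitative heart of that claim, for readers who want it, is Kolmogorov's 1941 prediction for the "inertial range" of eddy sizes, between the largest whirls and the tiniest ones wiped out by viscosity. There, the distribution of kinetic energy across scales depends only on the rate ε at which energy cascades downward:

$$ E(k) = C\,\varepsilon^{2/3}\,k^{-5/3} $$

where k is the wavenumber (roughly the inverse of an eddy's size) and C is an empirical constant of roughly 1.5. This is the standard K41 result, not a formula specific to Cardesa's study.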

But, Cardesa says, the chaotic behavior of a fluid makes it hard to observe any trend with the naked eye.

Hoping to track individual eddy structures and determine if a recurrent behavior is at work in how turbulence spreads, Cardesa and his colleagues at UPM simulated a turbulent flow using the MinoTauro cluster at the Barcelona Supercomputing Center.

The code was run in parallel on 32 NVIDIA Tesla M2090 cards, using a hybrid CUDA-MPI code developed by Alberto Vela-Martin. The simulation took almost three months to complete and resulted in over one hundred terabytes of compressed data.
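As an illustration of what "hybrid CUDA-MPI" means in practice, here is a minimal sketch of the general pattern, not Vela-Martin's code: each MPI rank drives one GPU, advances its own slab of a periodic domain on that GPU, and swaps halo values with its neighbours every time step. The toy smoothing kernel, the grid size, and the four-GPUs-per-node assumption are all invented for the example.

```cuda
// Minimal sketch of the hybrid CUDA-MPI pattern: one MPI rank per GPU,
// each rank advances its slab of the periodic domain and exchanges
// halo values with its neighbours every step. Illustrative only --
// the "diffuse" kernel stands in for a real Navier-Stokes update.
#include <mpi.h>
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>
#include <utility>

// Toy update kernel: simple smoothing along the decomposed direction.
__global__ void diffuse(float* out, const float* in, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i > 0 && i < n - 1)
        out[i] = 0.5f * in[i] + 0.25f * (in[i - 1] + in[i + 1]);
}

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    cudaSetDevice(rank % 4);               // assume up to 4 GPUs per node

    const int nLocal = 1 << 20;            // points owned by this rank (+2 halo)
    std::vector<float> host(nLocal + 2, 1.0f);
    float *dIn, *dOut;
    cudaMalloc(&dIn,  (nLocal + 2) * sizeof(float));
    cudaMalloc(&dOut, (nLocal + 2) * sizeof(float));
    cudaMemcpy(dIn, host.data(), (nLocal + 2) * sizeof(float),
               cudaMemcpyHostToDevice);

    int left  = (rank - 1 + size) % size;  // periodic neighbours
    int right = (rank + 1) % size;

    for (int step = 0; step < 100; ++step) {
        // Copy boundary values back to the host and swap halos over MPI.
        float sendL, sendR, recvL, recvR;
        cudaMemcpy(&sendL, dIn + 1,      sizeof(float), cudaMemcpyDeviceToHost);
        cudaMemcpy(&sendR, dIn + nLocal, sizeof(float), cudaMemcpyDeviceToHost);
        MPI_Sendrecv(&sendL, 1, MPI_FLOAT, left,  0,
                     &recvR, 1, MPI_FLOAT, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&sendR, 1, MPI_FLOAT, right, 1,
                     &recvL, 1, MPI_FLOAT, left,  1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        cudaMemcpy(dIn,              &recvL, sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dIn + nLocal + 1, &recvR, sizeof(float), cudaMemcpyHostToDevice);

        // Advance the local slab on the GPU.
        diffuse<<<(nLocal + 2 + 255) / 256, 256>>>(dOut, dIn, nLocal + 2);
        cudaDeviceSynchronize();
        std::swap(dIn, dOut);
    }

    if (rank == 0) printf("ran %d ranks for 100 steps\n", size);
    cudaFree(dIn); cudaFree(dOut);
    MPI_Finalize();
    return 0;
}
```

Compiled with nvcc against an MPI installation and launched with one rank per GPU, this reproduces the basic division of labour: MPI moves data between devices, CUDA does the number-crunching on each one.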

Turbulence is all around us: It's in the air we breathe, the water that flows by our cities, and in the magma that moves the earth. Courtesy USGS; Ninara; Pontla. CC BY-NC-ND 2.0 (https://creativecommons.org/licenses/by-nc-nd/2.0/legalcode); CC BY 2.0 (https://creativecommons.org/licenses/by/2.0/legalcode).

Progress in analyzing the stored simulation data was initially slow, until Cardesa adjusted the code so it would fit within the 48 GB of RAM available on a single cluster node. He could then run the analysis independently on twelve different nodes and complete the task within just one month.
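The workflow amounts to an embarrassingly parallel post-processing pass: the stored flow is cut into independent chunks, and every node runs the same analysis program on its own chunk, with no communication between nodes. A minimal sketch of that driver logic might look like the following; the chunk count, snapshot count, and placeholder analysis step are assumptions for illustration, not Cardesa's actual code.

```cpp
// Sketch of an embarrassingly parallel post-processing driver: the saved
// simulation is split into independent ranges of snapshots, and each
// cluster node runs this same executable on its own range.
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 3) {
        fprintf(stderr, "usage: %s <chunk_id> <num_chunks>\n", argv[0]);
        return 1;
    }
    const int chunk        = atoi(argv[1]);  // which node am I? 0..num_chunks-1
    const int numChunks    = atoi(argv[2]);  // e.g. 12 independent nodes
    const int numSnapshots = 1200;           // assumed total snapshot count

    // Each node gets a contiguous, non-overlapping slice of snapshots,
    // sized so the working set fits in that node's RAM.
    const int perChunk = (numSnapshots + numChunks - 1) / numChunks;
    const int first = chunk * perChunk;
    const int last  = (first + perChunk < numSnapshots) ? first + perChunk
                                                        : numSnapshots;

    for (int s = first; s < last; ++s) {
        // Placeholder for loading snapshot s and tracking the eddies in it.
        printf("node %d: processing snapshot %d\n", chunk, s);
    }
    return 0;
}
```

Launched twelve times with chunk IDs 0 through 11, for example through a batch scheduler's job array, each instance fits in one node's memory and all twelve run at once.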

Their results validated Kolmogorov’s theory, revealing an underlying simplicity in the apparently random motion of turbulent wind or water. The next step may be to try to understand the cause of the trend Cardesa has detected or to implement the new insights into flow simulation software.

Cardesa’s work has benefited from advances in computational speed and storage capacity. He points out that his work would have been possible about ten years ago, but only at a cost that would have demanded a ‘heroic’ computational effort.

“The reduced cost of technology has made it possible for us to play with these datasets,” says Cardesa. “This is an extremely useful situation to be in when doing fundamental research and throwing all our efforts at an unsolved problem.”

This article was originally published on ScienceNode.org. Read the original article.
