Research News: Larios streamlines turbulence models

If you have ever looked up at the clouds and marveled at their seemingly unlimited complexity, or watched the cream in a cup of coffee spiral into ever more intricate patterns until it diffuses into a brownish haze, then you have glimpsed the strange and chaotic world of turbulence. Turbulence is nearly ubiquitous in science as well as everyday life, appearing in diverse areas such as weather prediction, aerospace technology, and blood flow in the heart. Understanding, quantifying, and predicting turbulence have become extremely important, yet elusive goals in science and engineering.

On a mathematical level, turbulent fluids (including air and gases) are thought to be governed by the Navier-Stokes equations, a system of partial differential equations (PDEs) that is simple to write down, but so difficult to work with that, in 2000, the Clay Mathematics Institute offered a $1 million prize for showing mathematically whether the solutions remain physically realistic for all time. This problem has been open since the foundational work of Jean Leray in the 1930s. Nevertheless, the effort to find accurate, practical tools for predicting turbulence continues. The research of Assistant Professor Adam Larios has helped to further this effort.
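
For readers curious what “simple to write down” means, one standard form of the incompressible Navier-Stokes equations is

    \partial_t u + (u \cdot \nabla) u - \nu \Delta u + \nabla p = f, \qquad \nabla \cdot u = 0,

where u is the fluid velocity, p is the pressure, \nu is the viscosity, and f represents external forces. The first equation is essentially Newton's second law applied to the fluid, and the second enforces incompressibility; nearly all of the difficulty comes from the nonlinear term (u \cdot \nabla) u, which couples motions at every scale to one another.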

To describe Dr. Larios’ work, we need to understand the notion of a dynamical system. A dynamical system consists of a starting condition and some laws that describe how things change in time. Picture a group of dancers just before the dance starts, frozen in their starting positions. In a dynamical system, this is known as the initial condition. Suddenly, the music starts, the scene comes to life, and the dance progresses according to its internal choreography. This choreography is analogous to the governing laws of the dynamical system. The Navier-Stokes equations form a dynamical system governing the motion of fluids: given the initial state of a fluid, the equations choreograph its motion from then on.
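
In symbols, a dynamical system pairs an evolution law with an initial condition,

    \frac{du}{dt} = F(u(t)), \qquad u(0) = u_0,

where the initial state u_0 plays the role of the dancers’ starting positions and F encodes the choreography. The Navier-Stokes equations are of this general type, with u the velocity field of the fluid.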

Imagine trying to predict how a particular dance would progress, given only the starting positions of the dancers and a description of the choreography. For a slow, simple dance, we might have some luck, but to bring the analogy closer to turbulence, imagine a fast, highly chaotic dance involving billions of dancers. Also imagine that the choreography is only a list of local rules for how dancers should interact with their nearest neighbors. Such a dance might take on wildly complex, large-scale patterns that would be nearly impossible to predict and highly dependent on the starting positions. In turbulent fluids, the local rules are given by the Navier-Stokes equations, and the large-scale patterns are things like hurricanes, solar storms, or rapidly varying ocean currents.

Although the Navier-Stokes equations are hard enough on their own, there is an even more fundamental difficulty: in real life, one typically does not fully know the initial condition. For example, in weather prediction, the current state of the weather is measured at locations spaced apart by roughly 1 kilometer on average, but the Navier-Stokes equations require initial data at every location in space, down to roughly millimeter scales. One option is to interpolate the data, e.g., by assuming the data vary linearly between weather stations, but such a scheme introduces a tremendous amount of error into the initial condition, and that error then grows exponentially fast in time due to the system’s underlying chaos. Such an idea was tried by the mathematician Lewis Fry Richardson in 1922, in the first-ever attempt at numerical weather forecasting. Richardson’s careful calculations were off by two orders of magnitude for a six-hour forecast.
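
To see how quickly a tiny error in the initial condition can take over, here is a small illustrative Python sketch. It uses the Lorenz-63 system, a classic three-variable stand-in for chaotic dynamics (not the Navier-Stokes equations themselves), and runs two copies of the same simulation whose starting points differ by one part in a hundred million:

    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # Right-hand side of the Lorenz-63 equations, a standard chaotic toy model.
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def rk4_step(s, dt):
        # One classical fourth-order Runge-Kutta time step.
        k1 = lorenz(s)
        k2 = lorenz(s + 0.5 * dt * k1)
        k3 = lorenz(s + 0.5 * dt * k2)
        k4 = lorenz(s + dt * k3)
        return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    dt = 0.001
    truth = np.array([1.0, 1.0, 1.0])
    guess = truth + np.array([1e-8, 0.0, 0.0])  # a tiny "interpolation" error

    for n in range(25001):
        if n % 5000 == 0:
            print(f"t = {n * dt:5.1f}   error = {np.linalg.norm(truth - guess):.2e}")
        truth, guess = rk4_step(truth, dt), rk4_step(guess, dt)

The gap between the two runs grows roughly exponentially until it is as large as the solution itself, the same phenomenon that doomed Richardson’s hand calculation.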

To get around these difficulties, modern researchers use a class of techniques known as data assimilation. Data assimilation eliminates the need for complete initial data; instead, it incorporates incoming measurements into a running simulation by asking the simulated solution to strike a balance between following the rules of the dynamical system and staying close to the observed data. It is as if, in trying to predict the outcome of our billion chaotic dancers, we had livestreaming video cameras set up in several locations to get an idea of the current state of the dance as it progressed. These data could be fed into a computer running a simulation that knows the local rules but was started with inaccurate initial data.
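
Here is a minimal sketch of that idea, again using the illustrative Lorenz-63 toy model rather than the full Navier-Stokes equations. The “simulation” below starts from badly wrong initial data, but at every step it is gently pulled toward observations of just one of the three variables of the “truth.” (The pull strength mu and the choice to observe only x are illustrative choices, not taken from the work described here.)

    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def rk4_step(f, s, dt):
        # One fourth-order Runge-Kutta step for a generic right-hand side f.
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    mu, dt = 50.0, 0.001                   # mu = strength of the pull toward the data
    truth = np.array([1.0, 1.0, 1.0])      # the "real" state (unknown in practice)
    sim = np.array([20.0, -20.0, 40.0])    # simulation started from wrong initial data

    for n in range(20001):
        if n % 4000 == 0:
            print(f"t = {n * dt:4.1f}   error = {np.linalg.norm(truth - sim):.2e}")
        x_obs = truth[0]                   # the only quantity we get to observe
        nudged = lambda s: lorenz(s) + mu * np.array([x_obs - s[0], 0.0, 0.0])
        truth, sim = rk4_step(lorenz, truth, dt), rk4_step(nudged, sim, dt)

Despite never seeing y or z, and never knowing the true initial condition, the simulation locks onto the true trajectory: the error decays rapidly instead of growing.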

Classic data assimilation is based on a set of techniques known as the Kalman filter. However, the Kalman filter is computationally expensive and limited in the settings to which it can be applied. In a 2014 paper by A. Azouani, E. Olson, and E. Titi, a new approach was proposed (now called the AOT algorithm). The idea abandons the expensive statistical methods of the Kalman filter and instead uses a feedback-control term at the PDE level. This new approach is far less expensive and was mathematically proven to force the simulation to converge to the true solution exponentially quickly in time. The paper set off a storm in the research world, with over 30 papers based on the AOT algorithm appearing in the last four years.
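
In rough terms (a schematic description, not a quotation of the paper), the AOT algorithm runs a copy of the model equations with an added feedback term that continuously pulls the simulated solution toward the observations. For the Navier-Stokes equations, the nudged system looks like

    \partial_t v + (v \cdot \nabla) v - \nu \Delta v + \nabla q = f + \mu \left( I_h(u) - I_h(v) \right), \qquad \nabla \cdot v = 0,

where v is the simulated velocity, u is the true (unknown) velocity, which is seen only through its measurements, I_h is an interpolation operator built from data at spatial resolution h, and \mu > 0 sets the strength of the feedback. The Azouani-Olson-Titi theorem says that if the observations are spaced closely enough and \mu is chosen appropriately, then v converges to u exponentially fast in time, regardless of how the simulation was initialized.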

Larios’ recent research has proposed several modifications to the AOT algorithm. The first was a nonlinear version of the algorithm, which resulted in super-exponential convergence rates. In 2017, Larios was awarded an individual investigator grant by the National Science Foundation to develop nonlinear data assimilation. This research involves tools from functional analysis and topology as well as large-scale tests on the supercomputers at UNL’s Holland Computing Center.
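
To give a flavor of the nonlinear idea (a schematic sketch of the general principle only; the precise terms in Larios’ papers may differ), the linear feedback \mu ( I_h(u) - I_h(v) ) can be replaced by a nonlinear one such as

    \mu \left( I_h(u) - I_h(v) \right) \left| I_h(u) - I_h(v) \right|^{\gamma - 1}, \qquad 0 < \gamma < 1,

which keeps the corrective pull comparatively strong as the error shrinks. A simple comparison of ordinary differential equations shows why this can beat exponential convergence: a quantity obeying dE/dt = -\mu E decays like e^{-\mu t}, while one obeying dE/dt = -\mu E^\gamma with \gamma < 1 reaches zero in finite time.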

Recently, along with his Ph.D. student Collin Victor, Larios developed an AOT-type method for moving measurement devices, allowing measurements to come from sensors attached to satellites, drones, or moving vehicles. He showed that allowing the sensors to move can result in an order-of-magnitude reduction in the number of sensors required, which could drastically reduce equipment costs for scientists. In addition, along with his Ph.D. student Elizabeth Carlson and Dr. Joshua Hudson at the Johns Hopkins University Applied Physics Laboratory, Larios is developing AOT-based techniques for estimating the underlying parameters of the dynamical system on the fly. These latest innovations promise more robust simulations of turbulence.