A fundamental theme throughout this blog series has been partial differential equations (PDEs). The reason for this is that PDEs are a language through which we can describe the world around us, from the smallest scales to the very largest. Many PDEs, moreover, involve time derivatives. This makes sense, as phenomena such as fluid flows, beam vibrations, and gravitational waves change with time. In engineering, we call physical systems that change with time dynamical systems. Oftentimes, the governing PDEs for these systems are very complex. Think of a turbulent flow of high-speed air passing through a jet engine. Even with modern computing hardware, it is practically impossible to simulate this flow with the full Navier-Stokes equations. Therefore, there is a strong interest within the engineering community in deriving reduced-order models (ROMs) that are interpretable, generalizable, and capture the dominant features of complicated governing equations. These ROMs take the form of a small system of ordinary differential equations (ODEs) formulated in terms of abstract variables derived from physically measurable quantities.
The goal of this blog post is to explain one very promising method for finding ROMs for dynamical systems that are essentially nonlinear. According to a 2022 publication from ETH Zurich, an essentially nonlinear, or nonlinearizable, dynamical system is one whose governing equations are nonlinear and cannot be made linear by any coordinate transformation. Many systems fall into this category because they have two or more isolated, coexisting stationary states (where a stationary state could be a fixed point, limit cycle, quasiperiodic torus, or chaotic attractor). The main idea of the ROM method that we will discuss today is to (i) linearize a dynamical system around a hyperbolic fixed point and compute the eigenspaces of the linearized operator, (ii) find the unique smoothest nonlinear continuation of the slowest-decaying linear eigenspaces (this continuation is called a spectral submanifold (SSM)), (iii) embed the SSM in a space of observables, (iv) find a system of polynomial ODEs that describes the reduced dynamics on the embedded SSM, and (v) use the derived system of ODEs to predict natural system behavior and the response to forcing.
So let U be an open subset of ℝ^n, and consider a dynamical system described by the ODE
dx/dt = Ax + f0(x) + ε f1(x, Ωt; ε),
where ε > 0 is a small parameter, A ∈ ℝ^(n×n), and f0 = O(|x|^2). Moreover, the nonlinear functions f0: U → ℝ^n and f1: U × 𝕋^l → ℝ^n are both r-times continuously differentiable, and f1 is quasiperiodic in time. Notationally, 𝕋^l denotes the l-dimensional torus.
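To make this setup concrete, here is a minimal sketch in Python of a system in exactly this form: a lightly damped oscillator with a cubic nonlinearity and weak single-frequency forcing. The matrix, the nonlinearity, and every parameter value below are my own illustrative choices rather than anything from the literature.

import numpy as np
from scipy.integrate import solve_ivp

# Toy instance of dx/dt = A x + f0(x) + eps * f1(x, Omega t):
# a lightly damped oscillator with a cubic nonlinearity, written in first-order form.
A = np.array([[0.0, 1.0],
              [-1.0, -0.02]])        # linear part: unit stiffness, small damping

def f0(x):
    # nonlinearity that vanishes at least quadratically near x = 0
    return np.array([0.0, -0.5 * x[0] ** 3])

def f1(x, phase):
    # weak time-periodic forcing (a single frequency, so the torus is T^1)
    return np.array([0.0, np.cos(phase)])

eps, Omega = 0.01, 1.1

def rhs(t, x):
    return A @ x + f0(x) + eps * f1(x, Omega * t)

sol = solve_ivp(rhs, (0.0, 200.0), [0.5, 0.0], max_step=0.05)
print(sol.y[:, -1])                  # state after most of the transient has decayed

Here f0 satisfies the requirement f0 = O(|x|^2), since a cubic term vanishes at least quadratically at the origin, and the forcing f1 enters only at order ε.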
Since ε is small, the time-dependent forcing term is only a small perturbation, so the autonomous part of the equation dominates the dynamics. We can therefore analyze the equation with ε = 0 and still reasonably expect our results to carry over to cases where ε > 0 is small.
If ε = 0, and x = 0 is a fixed point, our dynamics in a small neighborhood of the origin are locally described by the linear system
dx/dt = Ax.
When we have such a setup, we can determine the stability of our system by computing the spectrum (eigenvalues) of A. Counting multiplicity, A has n complex eigenvalues {λj : j = 1, 2, …, n}, which we can order according to the size of their real parts. In this scenario, we have
Re λn ≤ Re λn-1 ≤ … ≤ Re λ1.
Assume here that all eigenvalues have nonzero real part (i.e., x = 0 is a hyperbolic fixed point). With this assumption, the Hartman-Grobman theorem guarantees that, near the origin, the flow of the full nonlinear system is topologically conjugate to the flow of its linearization, so the linear operator A faithfully captures the local dynamics. For each j, also let vj be the eigenvector (or generalized eigenvector) with eigenvalue λj, and set
Ej = span(vj).
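Numerically, this bookkeeping is a one-liner. The sketch below builds a small, purely illustrative 4-state matrix, computes its eigenvalues and eigenvectors with NumPy, and sorts them so that the slowest-decaying modes come first.

import numpy as np

# Purely illustrative 4-state operator: two damped oscillators, one slow and one fast.
A = np.array([[-0.01,  1.0,  0.0,  0.0],
              [-1.0,  -0.01, 0.0,  0.0],
              [ 0.0,   0.0, -0.5,  3.0],
              [ 0.0,   0.0, -3.0, -0.5]])

eigvals, eigvecs = np.linalg.eig(A)

# Order the spectrum so that Re(lambda_1) >= Re(lambda_2) >= ... >= Re(lambda_n),
# i.e. the slowest-decaying eigenvalues (and their eigenvectors vj) come first.
order = np.argsort(-eigvals.real)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print(eigvals)           # -0.01 +/- 1j first, then -0.5 +/- 3j
print(eigvecs[:, 0])     # v1, which spans the slowest subspace E1 = span(v1)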
Next, define a spectral subspace of ℝ^n to be the direct sum
Ej1,…, jq = Ej1 ⊕ … ⊕ Ejq.
These are called invariant subspaces because any trajectory of the linear system that begins in a spectral subspace remains in it. Now, for k = 1, 2, …, n, put
Ek = E1,…, k.
Then E1 ⊂ E2 ⊂ … ⊂ En is a chain of nested subspaces that contain dynamics decaying increasingly quickly. Assuming that all eigenvalues have negative real parts, E1 is the spectral subspace where trajectories decay the most slowly, E2 is the spectral subspace with the second-slowest-decaying trajectories, and so on. If our system were linear, then all of this would be magnificent. We could accurately predict the behavior of
dx/dt = Ax
by projecting the dynamics onto the slowest-decaying eigenvectors. This would yield a formidable reduced-order model. However, because our system is nonlinear, we have to be more careful and find a method to extract a ROM that “respects” the nonlinearities.
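Before moving on, here is what that linear projection would look like in code, using a made-up three-state matrix: the sketch projects the initial condition onto the slowest eigendirection, evolves only that scalar mode, and compares the result with the exact solution of the full linear system.

import numpy as np
from scipy.linalg import expm

# Made-up stable linear system with one slow and two fast decay directions.
A = np.array([[-0.1,  0.2,  0.0],
              [ 0.0, -2.0,  0.5],
              [ 0.0,  0.0, -5.0]])

eigvals, V = np.linalg.eig(A)
order = np.argsort(-eigvals.real)           # slowest-decaying mode first
eigvals, V = eigvals[order], V[:, order]
W = np.linalg.inv(V)                        # rows are the left eigenvectors

# Linear ROM: keep only the slowest mode. Project the initial condition onto E1,
# evolve the scalar equation dq/dt = lambda_1 q, and lift back to the full space.
x0 = np.array([1.0, 1.0, 1.0])
t = 10.0
q0 = W[0] @ x0
x_rom = V[:, 0] * q0 * np.exp(eigvals[0] * t)

x_full = expm(A * t) @ x0                   # exact solution of the full linear system
print(x_full)
print(x_rom)                                # nearly identical once the fast modes die out

The nonlinear analogue of this projection is exactly what the spectral submanifold will provide.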
Pick a spectral subspace E = Ej1,…, jq. Assuming that E is “nice enough,” one can show that the flat subspace E has infinitely many curved continuations on which trajectories of the full nonlinear equation evolve. All of these manifolds are tangent to E at the fixed point x = 0, have the same dimension as E, and inherit the quasiperiodic time dependence of f1. Furthermore, exactly one of these manifolds is smoother than all of the others. We will call this manifold the spectral submanifold (SSM) of E, and denote it by W(E, Ωt; ε). If f0 and f1 are analytic, then W(E, Ωt; ε) is analytic, meaning that we can approximate our SSM arbitrarily well by polynomials. This fact has practical utility, as it allows computers to construct SSMs using basic polynomial building-block functions.
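A tiny hand-built example (mine, not from the ETH Zurich papers) makes the phrase "curved continuation of E" tangible. For dx/dt = -x, dy/dt = -3y + x^2, the slow eigenspace E1 is the x-axis, and the parabola y = x^2 is an invariant manifold tangent to E1 at the origin; it plays the role of the SSM for this toy system. The sketch below checks the invariance numerically.

import numpy as np
from scipy.integrate import solve_ivp

# Toy system: dx/dt = -x, dy/dt = -3y + x^2.
# The slow eigenspace E1 is the x-axis (eigenvalue -1); the graph y = x^2 is its
# curved, invariant continuation, tangent to E1 at the origin.
def rhs(t, z):
    x, y = z
    return [-x, -3.0 * y + x ** 2]

x0 = 0.8
sol = solve_ivp(rhs, (0.0, 5.0), [x0, x0 ** 2], max_step=0.01)  # start on y = x^2

x, y = sol.y
print(np.max(np.abs(y - x ** 2)))   # remains ~0 (up to integration error): the trajectory never leaves y = x^2

One can also check by hand that d/dt (y - x^2) = -3(y - x^2), so trajectories that start off the parabola collapse onto it at the fast rate e^(-3t), while the slow dynamics dx/dt = -x play out on the manifold itself.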
While all of this is groovy, we might not have access to the full n-dimensional state space in which the SSM lives. Therefore, we must embed the SSM in a coordinate system of observables that we can measure with laboratory equipment. There are two strategies for such an embedding. The first makes use of the Whitney embedding theorem, which states that almost all smooth observable vectors y(x) ∈ ℝ^p provide an embedding of a compact subset C of the d-dimensional spectral submanifold W(E, Ωt; ε) into the observable space as long as p > 2(d + l). Recall here that l is the dimension of the torus describing the quasiperiodic behavior of our forcing function f1. However, there are many cases where we might not have access to more than 2(d + l) independent observables. In these cases, we leverage Takens' delay embedding theorem, which states that if s(t) is a scalar quantity sampled at a uniform interval ∆t, then the observable vector
y(t) = ( s(t), s(t + ∆t), …, s(t + (p-1)∆t) ) ∈ ℝ^p
embeds W(E, Ωt; ε) into ℝ^p.
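The delay-embedding recipe is simple enough to write down directly. Below is a short sketch that builds the observable vectors y(t) from a synthetic scalar signal; the signal itself and the choice p = 5 are purely illustrative.

import numpy as np

def delay_embed(s, p):
    # Stack p delayed copies of the uniformly sampled scalar series s, so each row
    # is one observable vector y(t) = (s(t), s(t + dt), ..., s(t + (p-1) dt)).
    n = len(s) - (p - 1)
    return np.column_stack([s[i : i + n] for i in range(p)])

# Synthetic measurement: a decaying oscillation sampled every dt = 0.05.
dt = 0.05
t = np.arange(0.0, 20.0, dt)
s = np.exp(-0.05 * t) * np.cos(2.0 * np.pi * t)

Y = delay_embed(s, p=5)
print(Y.shape)        # (396, 5): 396 observable vectors, each living in R^5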
So suppose now that we have successfully embedded our SSM into ℝ^p, and let us denote this embedded manifold by M0. Now we want to use experimental laboratory data to learn the reduced-order dynamics on M0. Specifically, we want to approximate the dynamics on our SSM by an extended normal form, which is a system of polynomial ODEs that captures the dynamics in a sufficiently large region around the fixed point x = 0. Through a least-squares minimization procedure, we can obtain the best-fit normal form involving polynomials of a user-specified maximum degree. For a two-dimensional SSM written in polar coordinates (ρ, θ), such a normal form might look something like
dρ/dt = α0ρ + βρ^3
dθ/dt = ω0 + γρ^2.
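As a cartoon of that least-squares step, the sketch below generates synthetic (ρ, θ) data from known coefficients, estimates the time derivatives by finite differences, and recovers α0, β, ω0, γ by regressing those derivatives on the polynomial libraries {ρ, ρ^3} and {1, ρ^2}. This is a generic polynomial regression meant only to illustrate the idea, not the exact fitting procedure used in the SSM literature, and all coefficient values are invented.

import numpy as np
from scipy.integrate import solve_ivp

# "True" normal-form coefficients used only to generate synthetic data.
alpha0, beta, omega0, gamma = -0.05, -0.4, 2.0, 0.3

def normal_form(t, z):
    rho, theta = z
    return [alpha0 * rho + beta * rho ** 3, omega0 + gamma * rho ** 2]

dt = 0.01
t_eval = np.arange(0.0, 30.0, dt)
sol = solve_ivp(normal_form, (0.0, 30.0), [1.0, 0.0], t_eval=t_eval, max_step=dt)
rho, theta = sol.y

# Finite-difference estimates of d(rho)/dt and d(theta)/dt from the "measurements".
drho = np.gradient(rho, dt)
dtheta = np.gradient(theta, dt)

# Least-squares fit against the user-chosen polynomial libraries.
coef_rho, *_ = np.linalg.lstsq(np.column_stack([rho, rho ** 3]), drho, rcond=None)
coef_theta, *_ = np.linalg.lstsq(np.column_stack([np.ones_like(rho), rho ** 2]), dtheta, rcond=None)

print(coef_rho)    # approximately [alpha0, beta]
print(coef_theta)  # approximately [omega0, gamma]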
These normal-form equations are, at last, the ROM that we are looking for. Such reduced-order models are powerful for a number of reasons. For example, the embedded SSM of the unforced equation with ε = 0 lies close to the SSM of the forced equation with small ε > 0. In this sense, a model derived from unforced data can predict forced responses. These predictions, in turn, can be used to design controllers that suppress disturbances in a system.
While the applications of reduced-order models are plentiful, my personal interest lies in fluid dynamics and flow control. I am currently getting started on a project that uses SSMs to derive a reduced-order model describing the fluidic pinball problem. In this setting, a uniform 2D flow with horizontal velocity U∞ impinges on three equally spaced cylinders. Depending on the Reynolds number, a range of different oscillatory wake behaviors arises. My ultimate goal is to leverage a ROM for this phenomenon to design a flow control strategy that returns disturbed, asymmetric flows to their steady, symmetric behavior. In future blog posts, I am excited to explore this project in greater detail. For now, I wish all of my readers a wonderful fall break. Please take care.