(Joint work with Guillaume Bal)
In a well-known paradigm of long-distance wave propagation in random media, the wave field is described as a random superposition of plane waves with Gaussian-distributed amplitudes and uniformly distributed phases. Such a wave field is then circularly symmetric complex normal (Gaussian), with independent zero-mean normal real and imaginary parts.
Evolution of a Gaussian beam through simulated turbulence
As a consequence, the wave energy density is distributed according to an exponential law, and the scintillation index, defined as the variance of the energy density normalized by its squared mean, is unity. Wave fields at nearby points are also often observed to be essentially independent. This regime provides a model for the speckle formation observed in many experiments of classical wave propagation in heterogeneous media. While fairly intuitive and well accepted in the physics literature, this conjecture is not fully supported by a mathematical derivation. The main objective of the following paper is to provide a complete derivation, in the diffusive regime, of the scintillation scaling for an Itô-Schrödinger paraxial wave model.
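The two statistical signatures above can be checked directly; here is a minimal numerical sketch (illustrative only, assuming nothing beyond a circularly symmetric complex Gaussian amplitude) verifying the exponential law for the intensity and a unit scintillation index:

```python
import numpy as np

# Sample a circularly symmetric complex Gaussian field value:
# independent zero-mean normal real and imaginary parts.
rng = np.random.default_rng(0)
n = 1_000_000
u = rng.normal(0.0, 1.0 / np.sqrt(2), size=n)
v = rng.normal(0.0, 1.0 / np.sqrt(2), size=n)
field = u + 1j * v                     # normalized so E|field|^2 = 1

intensity = np.abs(field) ** 2         # wave energy density

# Scintillation index: variance of the energy density
# normalized by its squared mean.
si = intensity.var() / intensity.mean() ** 2
print(f"scintillation index ~ {si:.3f}")      # close to 1

# Exponential law: P(I > t) = exp(-t) when E[I] = 1.
t = 2.0
print(np.mean(intensity > t), np.exp(-t))     # the two agree closely
```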
Complex Gaussianity of long-distance random wave processes (2024) (arXiv)
A major technical advantage of the Itô-Schrödinger model is the availability of closed-form partial differential equations for the statistical moments of the random wave process. The main technical component of our derivation revisits these partial differential equations for arbitrary-order statistical moments. In particular, we show that in Fourier variables, the solutions to these equations with arbitrary initial measures of bounded variation can be written, up to a negligible component, as a functional of the first and second statistical moments, as for complex Gaussian variables. This control allows us to pass to the diffusive (long-propagation-distance) regime and, after inverse Fourier transformation, to obtain error estimates in the physical variables in the uniform sense. These elements allow us to show convergence of the finite-dimensional distributions of the random wave process to a circular Gaussian limit. We also obtain stochastic continuity and tightness results showing that the random wave process converges in distribution to its limit as distributions over spaces of Hölder-continuous functions.
Limiting behaviour of time-averaged scintillation in the diffusive regime. Increasing time averaging and reducing coherence length both clearly favour a reduction in scintillation.
Inverse problem theory is often studied in an idealized infinite-dimensional setting. Through the lens of PDE-constrained optimization, well-posedness theory for the PDE suggests unique reconstruction of the parameter function attaining the zero-loss property of the mismatch function, provided an infinite amount of data. Unfortunately, this is not the case in practice, where we are limited to a finite number of measurements for experimental or economic reasons. Consequently, one must compromise the inference goal to a discrete approximation of the unknown smooth function.
What is the reconstruction power of a fixed number of data observations? How many parameters can one reconstruct? Here we describe a probabilistic approach and spell out the interplay between the observation size (r) and the number of parameters to be uniquely identified (m). The technical pillar is a random sketching strategy, in which matrix concentration inequalities and sampling theory are heavily employed. By analyzing the randomly sub-sampled Hessian matrix, we obtain a well-conditioned reconstruction problem with high probability. Our main theory is validated in numerical experiments, on both synthetic data and data from an elliptic inverse problem. The empirical performance shows that, given suitable sampling quality, the well-conditioning of the sketched Hessian is certified with high probability.
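The sketching idea can be illustrated on a toy example (a synthetic Gaussian "Jacobian", not the paper's setup; all names and sizes below are hypothetical): sub-sampling rows yields a sub-sampled Hessian whose condition number stays moderate once the sketch size r sufficiently exceeds the parameter count m.

```python
import numpy as np

# Toy illustration of random sketching: r observations vs m parameters.
rng = np.random.default_rng(1)
n_obs, m_params = 2000, 20               # full data size, unknowns
J = rng.normal(size=(n_obs, m_params))   # stand-in Jacobian

def sketched_condition(J, r, rng):
    """Condition number of the Hessian built from r randomly sampled rows."""
    rows = rng.choice(J.shape[0], size=r, replace=False)
    H = J[rows].T @ J[rows]              # sub-sampled (Gauss-Newton) Hessian
    return np.linalg.cond(H)

full = np.linalg.cond(J.T @ J)           # condition number with all data
for r in (50, 200, 1000):
    print(r, sketched_condition(J, r, rng))
print("full:", full)
```

As r grows toward n_obs, the sketched condition number concentrates around the full one, which is the qualitative content of the matrix concentration bounds mentioned above.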
(Joint work with Ruhui Jin, Qin Li and Sam Stechmann)
Unique reconstruction for discretized inverse problems: a random sketching approach (2024) (arXiv)
The ab initio model for heat propagation is the phonon transport equation, a Boltzmann-like kinetic equation. When two materials are put side by side, the heat that propagates from one material to the other experiences thermal boundary resistance. Mathematically, it is represented by the reflection coefficient of the phonon transport equation on the interface of the two materials. This coefficient takes different values at different phonon frequencies and between different material pairs. In experiments, scientists measure the surface temperature of one material to infer the reflection coefficient as a function of phonon frequency. In this article, we formulate this inverse problem in an optimization framework and apply the stochastic gradient descent (SGD) method to find the optimal solution. We furthermore prove the maximum principle and show the Lipschitz continuity of the Fréchet derivative. These properties allow us to justify the application of SGD in this setup.
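A minimal SGD loop of the kind described can be sketched as follows, with a stand-in linear forward map in place of the actual phonon transport solve (all names, sizes, and the forward map here are hypothetical): each iteration uses one randomly chosen measurement, and the frequency-dependent coefficient is kept in its physical range.

```python
import numpy as np

# Toy SGD reconstruction of a frequency-dependent coefficient eta(omega)
# in [0, 1] from noisy scalar measurements through a linear map G.
rng = np.random.default_rng(0)
n_freq, n_data = 16, 400
G = rng.uniform(0.5, 1.5, size=(n_data, n_freq))     # stand-in forward map
eta_true = rng.uniform(0.2, 0.8, size=n_freq)
data = G @ eta_true + 0.01 * rng.normal(size=n_data) # noisy measurements

eta = np.full(n_freq, 0.5)           # initial guess
lr = 1e-3                            # step size
for _ in range(50_000):
    i = rng.integers(n_data)         # one randomly chosen measurement
    resid = G[i] @ eta - data[i]     # mismatch for that measurement
    eta -= lr * resid * G[i]         # stochastic gradient step
    np.clip(eta, 0.0, 1.0, out=eta)  # enforce the physical bound

print(np.max(np.abs(eta - eta_true)))   # small reconstruction error
```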
(Joint work with Irene Gamba and Qin Li)
Reconstructing the thermal phonon transmission coefficient at solid interfaces in the phonon transport equation (SIAM J. Appl. Math., 82(1), 194–220 (2022)) (Journal) (arXiv)
The classical description of heat conductance is the simple heat equation, a parabolic equation when time is present, or an elliptic one in the steady state. The derivation of the heat equation is based on Fourier's law, which states that the heat flux depends proportionally on the temperature gradient, so a larger temperature gradient leads to a stronger heat flux. This law is an observational fact and is not derived from first principles. In modern physics, derivations from first principles reveal that Fourier's law may not be accurate. The underlying physical model for heat propagation should instead be the phonon transport equation, which describes the dynamics of phonons, the microscopic quanta that propagate heat energy.
This phonon transport equation, built upon the Wigner transform from quantum mechanics and hence on first principles, is a mesoscopic description that follows a statistical mechanics derivation. Based on this phonon transport equation, at the correct scaling, the heat equation is rediscovered as the associated macroscopic limit when the system is “large” enough that the mesoscopic fluctuations can be ignored. This partially justifies the validity of the heat equation that has traditionally been used as a model equation. We study this asymptotic relation between the phonon transport equation and the diffusive heat equation, paying special attention to boundary effects. When no physical boundary is present, the derivation of one equation from the other is rather straightforward. When a physical boundary is present, however, boundary layers emerge adjacent to the boundary. They damp the mesoscopic fluctuations close to the physical boundary that are inconsistent with the limiting equation. At the macroscopic limit, this means the boundary condition for the limiting diffusion equation needs to be fine-tuned to reflect this perturbation. We derive a limiting equation that achieves second-order convergence and provide a numerical recipe for computing the Robin coefficients.
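The flavor of the resulting macroscopic problem can be illustrated with a generic steady diffusion solve under Robin boundary conditions (the coefficient c below is hypothetical and is not the Robin coefficient derived in the paper):

```python
import numpy as np

# Steady diffusion u'' = 0 on (0, 1) with Robin conditions
# u - c u' = a at x = 0 and u + c u' = b at x = 1,
# discretized by centered differences plus one-sided boundary stencils.
c, a, b = 0.1, 0.0, 1.0
n = 200
h = 1.0 / n
A = np.zeros((n + 1, n + 1))
rhs = np.zeros(n + 1)

for i in range(1, n):                     # interior: second difference = 0
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A[0, 0], A[0, 1] = 1.0 + c / h, -c / h    # u0 - c (u1 - u0)/h = a
rhs[0] = a
A[n, n], A[n, n - 1] = 1.0 + c / h, -c / h  # un + c (un - u_{n-1})/h = b
rhs[n] = b
u = np.linalg.solve(A, rhs)

# The exact solution is linear: u(x) = a + (b - a)(x + c)/(1 + 2c),
# which the scheme reproduces (one-sided differences are exact on lines).
x = np.linspace(0.0, 1.0, n + 1)
exact = a + (b - a) * (x + c) / (1 + 2 * c)
print(np.max(np.abs(u - exact)))
```

Note how the Robin coefficient shifts the boundary values away from the Dirichlet values a and b, which is the macroscopic footprint of a boundary layer.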
(Joint work with Qin Li and Weiran Sun)
In free-space optical communications and other applications, it is desirable to design optical beams that have reduced or even minimal scintillation. However, the optimization problem for minimizing scintillation is challenging, and few optimal solutions have been found. Here we investigate the general optimization problem of minimizing scintillation and formulate it as a convex optimization problem. We find that the scintillation-minimizing beam is incoherent light with low intensity at the receiver. We find this result in both analytical and numerical solutions, including cases where the mutual intensity function 𝐽 is non-parameterized and allowed to be general. A modified objective function, also convex, is introduced to balance scintillation against intensity. Utilizing machine learning algorithms (especially a randomized SVD solver) that exploit low-rank features, we can reduce both memory and computational cost in finding the optimal mutual intensity function.
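The low-rank compression step can be sketched with a generic randomized SVD on a synthetic low-rank matrix (hypothetical sizes; this is not the paper's solver): a random sketch captures the range of a low-rank, mutual-intensity-like matrix, after which the SVD is computed on a small compressed matrix.

```python
import numpy as np

# Randomized SVD of a low-rank symmetric PSD stand-in for J.
rng = np.random.default_rng(0)
n, rank = 500, 8
B = rng.normal(size=(n, rank))
J = B @ B.T                            # symmetric PSD, exact rank 8

k = 12                                 # sketch size slightly above the rank
Omega = rng.normal(size=(n, k))        # random test matrix
Q, _ = np.linalg.qr(J @ Omega)         # orthonormal basis for range(J)
U_s, s, Vt_s = np.linalg.svd(Q.T @ J @ Q)   # SVD of a small k x k matrix

approx = Q @ U_s @ np.diag(s) @ Vt_s @ Q.T  # lift back to n x n
err = np.linalg.norm(J - approx) / np.linalg.norm(J)
print(err)                             # tiny, since rank(J) <= sketch size
```

Only the thin factors Q (n x k) and the k x k core need to be stored, which is the memory saving referred to above.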
(Joint work with Qin Li and Sam Stechmann)
Scintillation minimization vs intensity maximization in optimal beams (Optics Letters 48 (15), 3865-3868 (2023)) (Journal) (Supplement) (arXiv) (Code)
In applications such as free-space optical communication, a signal is often recovered after propagation through a turbulent medium. In this setting, it is common to assume that only limited information is known about the turbulent medium, such as a space- and time-averaged statistic (e.g., a root-mean-square value), without information about the state of the spatial variations. It would be helpful to characterize the state of the turbulent medium itself, with its spatial variations and evolution in time described. Here, we propose to investigate the use of data assimilation techniques for this purpose.
A computational setting based on the paraxial wave equation is used, and the extended Kalman filter conducts data assimilation from intensity measurements. To reduce computational cost, the evolution of the turbulent medium is modeled as a stochastic process which, following some past studies, contains only a small number of Fourier wavelengths for its spatial variations. The results show that the spatial and temporal variations of the medium are recovered accurately in many cases, although in some cases the recovery error is larger over certain time windows.
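The structure of such a filter can be sketched with a toy extended Kalman filter (not the paper's implementation; the scalar state, noise levels, and the quadratic observation map are all illustrative stand-ins): a slowly varying state is tracked from nonlinear "intensity" measurements by linearizing the observation map at the current estimate.

```python
import numpy as np

# Toy EKF: state x follows a random walk; we observe y = x^2 + noise.
rng = np.random.default_rng(3)
q, r = 0.001, 0.05                 # process / measurement noise variances
x_true, x_est, P = 2.0, 1.5, 1.0   # true state, estimate, error variance

errs = []
for _ in range(200):
    # stochastic evolution of the medium (random-walk model)
    x_true += np.sqrt(q) * rng.normal()
    y = x_true**2 + np.sqrt(r) * rng.normal()   # intensity measurement

    # EKF predict (identity dynamics), then update with linearized h(x) = x^2
    P += q
    H = 2.0 * x_est                # dh/dx evaluated at the estimate
    K = P * H / (H * P * H + r)    # Kalman gain
    x_est += K * (y - x_est**2)    # innovation uses the full nonlinear h
    P *= 1.0 - K * H
    errs.append(abs(x_est - x_true))

print(np.mean(errs[50:]))          # small tracking error after warm-up
```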
(Joint work with Qin Li and Sam Stechmann)