Research

Robust Design Optimization with Scientific Computing

The design of engineering systems is typically driven by stringent reliability and performance requirements. Conventional deterministic design approaches account for uncertainty through safety factors or worst-case scenarios, which can yield overly conservative or insufficient designs. This deficiency can be overcome by incorporating probabilistic methods into the design process, as is done in robust design optimization (by minimizing performance variation) and reliability-based design optimization (by constraining the failure probability). The main intrinsic challenges, however, are the numerical simulation/modeling and the computational complexity of these approaches when applied to complex physical systems, e.g., systems undergoing fluid-structure interaction (FSI) or the advanced manufacturing of structures. My research is primarily focused on developing systematic frameworks for design optimization and uncertainty quantification of such complex systems using computational mathematics.

I. Engineering Applications

  • Topology Optimization under Manufacturing Uncertainty

Topology optimization has emerged as a powerful tool for designing a vast variety of high-performance structures, from medical implants and prosthetics to jet engine components. While topology optimization has received considerable attention in the past two decades, the majority of existing works assume deterministic parameters and conditions in the optimization process. In practice, however, the performance of a structure varies due to inherent uncertainty in parameters such as loading, boundary conditions, and geometry. I have developed a systematic approach for topology optimization under uncertainty that accounts for spatially varying manufacturing errors. Such errors are manifested, for example, as under- and over-etching in the fabrication of MEMS, which alters the desired mechanical properties of the system. The approach models the spatial variability with a random field in conjunction with an efficient non-intrusive Polynomial Chaos Expansion (PCE). The non-intrusive PCE implementation is significantly accelerated via parallel processing owing to its embarrassingly parallel nature. The optimized designs obtained from this approach are shown to be more robust and reliable than those obtained from conventional deterministic optimization.
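To illustrate the non-intrusive PCE idea in a single stochastic dimension, the sketch below propagates a Gaussian manufacturing-error variable through a toy response function via Gauss-Hermite quadrature. The response exp(0.1·z) and all numerical values are illustrative stand-ins, not the actual MEMS model.

```python
import math
import numpy as np

def pce_coefficients(response, order=4, n_quad=10):
    """Non-intrusive PCE in one Gaussian variable via Gauss-Hermite quadrature."""
    # Probabilists' Gauss-Hermite nodes/weights (weight function exp(-x^2/2))
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the standard normal pdf
    samples = response(nodes)                 # "embarrassingly parallel" model evaluations
    coeffs = []
    for k in range(order + 1):
        he_k = np.polynomial.hermite_e.HermiteE.basis(k)(nodes)
        # c_k = E[f(Z) He_k(Z)] / E[He_k(Z)^2], with E[He_k^2] = k!
        coeffs.append(np.sum(weights * samples * he_k) / math.factorial(k))
    return np.array(coeffs)

# Toy response to a manufacturing-error variable Z ~ N(0, 1) (illustrative only)
c = pce_coefficients(lambda z: np.exp(0.1 * z))
mean = c[0]                                                        # PCE mean
var = sum(math.factorial(k) * c[k]**2 for k in range(1, c.size))   # PCE variance
```

For exp(a·Z) the exact statistics are available in closed form, which makes this toy response a convenient check of both the quadrature and the PCE truncation.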


Nominal design of a solid domain with circular holes

Distortion of the hole shapes due to spatially varying manufacturing errors

Optimized designs affected by different realizations of manufacturing uncertainty. The variation in bar thicknesses, indicating the geometric imperfection, is apparent.

Mean (left) and standard deviation (right) of element volume fractions. As expected, the standard deviation is more significant on the boundaries.




Reliability Based Topology Optimization





Robust Topology Optimization


  • Parametric Topology Optimization with Multi-resolution Finite Element Models

Multi-resolution finite element models have been used in a number of studies to enhance the computational efficiency of topology optimization. These multi-resolution approaches, however, have been explored within a deterministic framework, i.e., with a limited number of deterministic simulations across different mesh resolutions. In this work we adopt a different perspective and use coarse and fine finite element meshes within a parametric/stochastic framework. We use the inexpensive low-resolution model to traverse the parameter space and use that information to predict the stochastic response and sensitivity of the expensive high-resolution model. In this way the stochastic analysis is performed primarily with the low-resolution model, which drastically reduces the computational cost. Our method is non-intrusive, i.e., it is implemented with minimal modification to existing topology optimization codes. We present our approach in the context of generic density-based topology optimization; however, it is similarly applicable to level-set-based methods. We also provide error bounds for the bi-fidelity construction of the compliance and its sensitivity, which serve as a certificate of convergence for our parametric topology optimization approach. Our implementation, which extends the well-known "Efficient topology optimization in MATLAB using 88 lines of code," is available upon request.
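A minimal illustration of the bi-fidelity idea (a toy additive-correction variant, not the exact construction used in this work): correct a cheap low-resolution model with a handful of expensive high-resolution samples. Both models and the parameter range below are invented for the example.

```python
import numpy as np

def model_hi(p):
    """Stand-in for the expensive high-resolution FE model."""
    return np.sin(p) + 0.05 * p**2

def model_lo(p):
    """Stand-in for the cheap low-resolution model: biased but well correlated."""
    return np.sin(p)

# Many inexpensive low-resolution evaluations traverse the parameter space
p_all = np.linspace(0.0, 2.0, 200)
f_lo = model_lo(p_all)

# Only a handful of expensive high-resolution evaluations
p_hi = np.array([0.0, 1.0, 2.0])
delta = model_hi(p_hi) - model_lo(p_hi)   # discrepancy at the HF samples

# Fit a low-order polynomial to the discrepancy and correct the LF model everywhere
correction = np.polynomial.Polynomial.fit(p_hi, delta, deg=2)
f_bifi = f_lo + correction(p_all)
err = np.max(np.abs(f_bifi - model_hi(p_all)))
```

Here the discrepancy happens to be exactly quadratic, so three high-resolution samples recover the high-resolution response everywhere; in practice the error bounds mentioned above play the role of this exactness.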


The left animation uses a single-resolution 100 by 100 mesh for optimization, whereas the right animation relies mostly on the coarse 10 by 10 mesh and is therefore remarkably faster. The single-resolution optimization on the 10 by 10 mesh, shown in the right figure above, is uninformative by itself, yet this coarse model significantly accelerates the computation when used in the multi-resolution framework.


Parametric multi-fidelity optimization of a heat sink with the temperature source modeled as a random field. Each finite element node in this square domain is a heat source, and the FE solver solves the Laplace equation on low- and high-resolution meshes. The center of the left edge is a heat sink. Again, the design on the left is obtained from a limited number of high-resolution simulations together with several inexpensive low-resolution simulations, which are not informative or usable by themselves, as shown in the right image.

  • Stress-based Topology Optimization under Uncertainty with Noisy Gaussian Process

Gaussian Processes (GPs) are powerful tools for high-dimensional data analysis. We use multi-resolution finite element models within a GP framework to accommodate the high-dimensional parameter spaces that arise from scatter in material properties or geometric imperfections. We minimize the p-norm of the von Mises stress and compute its sensitivity via adjoint analysis, which, unlike compliance minimization, is non-trivial: the adjoint analysis requires a separate high-fidelity FE solve. The GP surrogate accelerates the computation of the parametric stress and its sensitivity. A GP is also a well-established tool for modeling noise in observed data. High-fidelity finite element models remain susceptible to numerical/modeling errors, and this type of error can be effectively modeled as noise in our GP framework to quantify its effect on the final design. This result helps the designer assess the allowable tolerance in modeling error such that the noisy simulations still produce a design almost identical to the one obtained with "true" simulations. We provide a computable error estimate that quantifies the discrepancy between the noisy and true simulations.
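The role of the noise term can be seen in a minimal GP regression sketch in plain NumPy (RBF kernel; the data, lengthscale, and noise level are illustrative, not our FE setting): the observation noise enters the diagonal of the training covariance matrix, and the posterior mean smooths it out.

```python
import numpy as np

def rbf_kernel(xa, xb, lengthscale=0.15):
    """Squared-exponential (RBF) covariance between two 1-D point sets."""
    d = xa[:, None] - xb[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 20)
sigma_n = 0.05  # noise level, assumed known (illustrative)
y_train = np.sin(2 * np.pi * x_train) + sigma_n * rng.standard_normal(x_train.size)

# The noise variance sigma_n^2 appears on the diagonal of the training covariance
K = rbf_kernel(x_train, x_train) + sigma_n**2 * np.eye(x_train.size)
x_test = np.linspace(0.0, 1.0, 50)
mu = rbf_kernel(x_test, x_train) @ np.linalg.solve(K, y_train)  # posterior mean
```

Increasing sigma_n both regularizes the linear solve and widens the tolerance for discrepancies between the data and the surrogate, which is exactly the mechanism used above to model simulation error.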

Stress-based topology optimization: low-resolution (left) and high-resolution (right) optimization. Note that the bars in the lower half are thicker, owing to the asymmetric loading.




Stress-based topology optimization via a GP with a multi-resolution kernel: optimization with unpolluted simulation data (left), which is almost identical to the high-resolution optimization shown in the top-right figure, and optimization with highly noisy simulation data (right), which converges to a different design.

  • Shape Optimization under Uncertainty for Rotor Blades of Horizontal Axis Wind Turbines

Renewable energy sources deliver power with minimal impact on the environment. Among them, wind energy is the fastest growing. This rapid growth calls for the development of larger-scale wind turbines, which are prone to more significant mechanical failures. Indeed, wind turbines operate in a hostile environment and must be designed for uncertain loads owing to the nature of the wind. I have developed a design optimization under uncertainty framework that integrates a reduced-order turbulent wind model, an aeroelastic model of the wind turbine based on the Blade Element Method (BEM), and a reduced-order finite element model of the blade with efficient design sensitivity analysis under uncertainty. I used this framework to determine the optimal blade shape that maximizes the generated power subject to reliability-based and robust design constraints on the allowable compliance. The uncertainty propagation employs Polynomial Chaos Expansion, which facilitates the calculation of the output response given input uncertainties such as wind and material properties.
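Because the extracted power scales with the cube of the wind speed, even modest wind uncertainty produces a skewed power distribution. The sketch below propagates a Gaussian wind speed through the standard power relation P = ½ρACpv³ with Gauss-Hermite quadrature; the rotor area, power coefficient, and wind statistics are illustrative numbers, not the turbine studied here.

```python
import numpy as np

rho, area, cp = 1.225, 5000.0, 0.45  # air density [kg/m^3], rotor area [m^2], power coeff. (assumed)

def power(v):
    """Standard wind power relation P = 0.5 * rho * A * Cp * v^3."""
    return 0.5 * rho * area * cp * v**3

# Uncertain wind speed v ~ Normal(10, 1) m/s (illustrative), via Gauss-Hermite quadrature
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the standard normal pdf
v = 10.0 + 1.0 * nodes
p = power(v)
mean_p = np.sum(weights * p)
std_p = np.sqrt(np.sum(weights * p**2) - mean_p**2)
```

For a Gaussian wind speed the moments of v³ are known in closed form (E[v³] = μ³ + 3μσ²), so the quadrature, which is exact for polynomials of this degree, can be verified directly.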




A mapping from a 3D model of a wind turbine blade to a reduced order finite element model with random parameters under fluctuating wind forces.


Our framework combines a reduced-order finite element model, aeroelastic computations, and gradient-based optimization, providing a systematic and computationally tractable design platform for such a complex engineering system.

II. Computational Methods Development

  • Gradient-Based Design Optimization under Uncertainty via Stochastic Expansion Methods

Optimization under uncertainty problems involve two levels of variation, namely variation in the design parameters and variation in the uncertain parameters, which adds to the computational complexity of the problem. The reliability and robustness measures are often estimated via Monte Carlo analysis or a Taylor expansion of the response function. The Monte Carlo approach is computationally prohibitive for large-scale problems. Approximations based on Taylor expansion are computationally efficient, but they are less accurate than estimates based on Polynomial Chaos Expansion. Unfortunately, the polynomial chaos approach itself involves significant computational cost, and embedding PCE in the usual population-based optimization algorithms increases this expense further. As part of my postdoctoral research, I have focused on efficient computational methods for design optimization under uncertainty, with an emphasis on the optimization algorithms. To that end, I have derived design sensitivity expressions with PCE. This work, which serves as a systematic methodology for evaluating design sensitivities in the presence of uncertainty, enables the use of efficient gradient-based optimization algorithms for both reliability-based and robust design optimization.
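The flavor of these sensitivity expressions can be conveyed with a one-variable sketch: for a robust objective E[g] + κ·Std[g], the design gradient follows from differentiating the response statistics under the quadrature rule. The performance function g, its analytic derivative, and κ below are toy choices, not the derivations in the actual work.

```python
import numpy as np

def g(d, xi):
    """Toy performance function of design d and uncertain variable xi (illustrative)."""
    return (d - 2.0) ** 2 + d * xi

def dg_dd(d, xi):
    """Analytic design sensitivity of the toy performance function."""
    return 2.0 * (d - 2.0) + xi

# Gauss-Hermite quadrature for xi ~ N(0, 1)
nodes, weights = np.polynomial.hermite_e.hermegauss(10)
weights = weights / np.sqrt(2.0 * np.pi)

def robust_grad(d, kappa=1.0):
    """Gradient of E[g] + kappa * Std[g] with respect to the design variable d."""
    gv, sv = g(d, nodes), dg_dd(d, nodes)
    mean = np.sum(weights * gv)
    std = np.sqrt(np.sum(weights * (gv - mean) ** 2))
    dmean = np.sum(weights * sv)                        # d E[g] / dd
    dstd = np.sum(weights * (gv - mean) * sv) / std     # d Std[g] / dd
    return dmean + kappa * dstd

grad = robust_grad(1.5)
```

For this toy g the robust objective is (d−2)² + κ|d|, so its gradient is 2(d−2) + κ·sign(d), which the quadrature-based expressions reproduce exactly.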





Polynomial Chaos Expansion (PCE) facilitates the characterization of the failure region and the estimation of the failure probability and its sensitivity, thus enabling the use of efficient gradient-based optimizers for design under uncertainty.




PCE with a limited number of simulations (e.g., fewer than 100) predicts the tail of the limit-state distribution as accurately as a Monte Carlo (MC) analysis with a large number of simulations (e.g., 10^5).

  • Numerical Quadrature in Multiple Dimensions for Design Optimization and Uncertainty Quantification

Building surrogate models requires numerical integration, often in multiple dimensions. Such integration requires many function evaluations, i.e., evaluations at quadrature nodes. Borrowing ideas from structural shape optimization, we have developed a novel quadrature rule, called designed quadrature, that optimizes the positions and weights of the nodes and therefore achieves the same accuracy with far fewer nodes than competing methods such as sparse grids. Applying this new rule to a topology optimization under manufacturing uncertainty problem reduces the computational cost by more than half.

We solve nonlinear moment-matching conditions that enforce exact integration on polynomial subspaces, subject to geometric constraints on the nodes and weights. We use a penalty method to handle the geometric constraints and numerical regularization to address ill-conditioning. Indeed, our quadrature method can constrain nodal locations to awkward geometries, as shown in the video below, where we design the nodes to mimic the "U" shape of the University of Utah logo. We have also designed quadrature rules in up to 100 dimensions (with a hyperbolic cross index set), which take less than an hour on a personal desktop.
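A minimal one-dimensional sketch of the moment-matching core (omitting the geometric penalty terms and regularization of the full method): solve for three nodes and weights that integrate polynomials up to degree five exactly against the uniform weight on [-1, 1], using Newton iterations on the moment residuals. With these sizes the unique solution is the three-point Gauss-Legendre rule, which makes a convenient check.

```python
import numpy as np

n, p = 3, 5  # 3 nodes, enforce exactness up to polynomial degree 5
# Moments of the uniform weight function 1 on [-1, 1]: 2/(k+1) for even k, 0 for odd k
moments = np.array([2.0 / (k + 1) if k % 2 == 0 else 0.0 for k in range(p + 1)])

def residual(z):
    """Moment-matching residual: sum_i w_i x_i^k - m_k for k = 0..p."""
    x, w = z[:n], z[n:]
    return np.array([np.sum(w * x**k) for k in range(p + 1)]) - moments

def jacobian(z):
    """Analytic Jacobian of the residual w.r.t. nodes and weights."""
    x, w = z[:n], z[n:]
    J = np.zeros((p + 1, 2 * n))
    for k in range(p + 1):
        if k > 0:
            J[k, :n] = w * k * x ** (k - 1)  # derivative w.r.t. node positions
        J[k, n:] = x**k                      # derivative w.r.t. weights
    return J

# Initial guess: spread nodes, equal weights summing to the total measure (= 2)
z = np.concatenate([np.array([-0.8, 0.1, 0.8]), np.full(n, 2.0 / n)])
for _ in range(50):  # Newton iterations (lstsq handles the square system robustly)
    z -= np.linalg.lstsq(jacobian(z), residual(z), rcond=None)[0]

x_opt = np.sort(z[:n])
```

The full designed-quadrature method augments these residuals with penalty terms that keep the nodes inside a prescribed geometry and the weights positive.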




Ensemble of designed quadrature rules for the uniform weight associated with Legendre polynomials in two dimensions (d = 2) with polynomial degree p = 2. Each three-point nodal configuration has its nodes connected with blue lines, forming a triangle. Left: randomly generated initial guesses provided to the algorithm. Right: converged designed quadrature rules.




Designed quadrature for the uniform weight and d = p = 2 with a "U" shape indicating the University of Utah logo.


  • Convergence Acceleration of Polynomial Chaos Solutions via Sequence Transformation

Significant computational challenges are associated with the polynomial chaos approach for problems with high-dimensional parameter spaces. Efficiency and error analysis of polynomial chaos solutions are subjects of much active research. The focus of many of these works has been on adapting the approximating stochastic bases to the problem at hand by relying on various error estimates. I have addressed this challenge by developing a novel methodology for convergence acceleration of polynomial-chaos-based stochastic Galerkin solutions that does not require modifications to the approximation basis. Rather, I apply nonlinear sequence transformations to the pre-computed approximations to increase their accuracy beyond existing methods.
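The effect is easy to demonstrate on a scalar sequence. Below, the classical Shanks transform (the e1 referenced in the figure) is applied to partial sums of the alternating harmonic series, a standard slowly converging example; this series is a stand-in for the polynomial chaos coefficient sequences treated in the work.

```python
import math

def shanks(seq):
    """Shanks transform: accelerates sequences with geometrically decaying error."""
    return [(seq[i + 1] * seq[i - 1] - seq[i] ** 2)
            / (seq[i + 1] - 2.0 * seq[i] + seq[i - 1])
            for i in range(1, len(seq) - 1)]

# Partial sums of ln 2 = 1 - 1/2 + 1/3 - 1/4 + ...
partial, total = [], 0.0
for k in range(1, 12):
    total += (-1.0) ** (k + 1) / k
    partial.append(total)

once = shanks(partial)   # e1
twice = shanks(once)     # e1^2 (iterated transform)
```

Eleven raw partial sums are still off by a few percent, while two applications of the transform reach roughly single-precision accuracy from the same data, i.e., with no additional series terms.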



The Shanks (e1 and e1^2) and Levin (v1) sequence transformations accelerate the convergence of the polynomial chaos coefficients. For a target relative error, the computational effort required by these transformations is less than that of PCE alone; similarly, for a fixed computational effort, these transformations yield a smaller relative error.

  • Identification of Discontinuous Nonlinear Systems via a Multivariate Padé Approach

In addition to uncertainty, nonlinear effects must be considered in the characterization of complex physical systems. This is especially true in system identification research for damage detection, condition assessment, control, and health monitoring of engineering systems. I have investigated the identification of nonlinear systems that exhibit highly nonlinear or discontinuous responses. Such systems preclude the use of standard polynomial representations in response surface methods. To remedy this, I use multi-dimensional Padé-Legendre representations (rational representations built from Legendre polynomials) to better characterize the nonlinear systems, with applications in the damage detection, condition assessment, and health monitoring of nonlinear uncertain dynamic systems.
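A one-dimensional sketch of why a rational representation helps: fit a steep, near-discontinuous response with a polynomial and with a rational function of comparable complexity. This uses plain least squares with monomials rather than the Padé-Legendre construction, and tanh(10x) is an invented stand-in for the discontinuous system response.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 400)
f = np.tanh(10.0 * x)  # steep response: near-discontinuity at x = 0

# Degree-10 polynomial least-squares fit: oscillates near the steep region
poly = np.polynomial.Polynomial.fit(x, f, deg=10)
err_poly = np.max(np.abs(poly(x) - f))

# Linearized rational least squares: P(x)/Q(x) with deg P = deg Q = 5, Q's constant term = 1.
# Minimize |P(x_i) - f(x_i) Q(x_i)| over the samples, which is linear in the coefficients.
m = 5
A = np.hstack([x[:, None] ** np.arange(m + 1),
               -f[:, None] * x[:, None] ** np.arange(1, m + 1)])
c = np.linalg.lstsq(A, f, rcond=None)[0]
num = np.polynomial.Polynomial(c[:m + 1])
den = np.polynomial.Polynomial(np.concatenate([[1.0], c[m + 1:]]))
err_rational = np.max(np.abs(num(x) / den(x) - f))
```

The rational fit can place poles just off the real axis, mirroring the complex poles of tanh, which a polynomial cannot; this is the same mechanism that suppresses the Gibbs oscillations in the Padé-Legendre setting.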




The response surface predicted by standard orthogonal polynomials exhibits spurious oscillations known as the Gibbs phenomenon (left); the Padé rational representation mitigates the Gibbs effect in the prediction of the same response surface (right).