Advances in Computational Mathematics

Vol. 45, no. 5–6, pp. 2969–3019

DOI: 10.1007/s10444-019-09725-6

Date of publication: 2019-12-01

Abstract:

We propose a probabilistic approach to reducing the cost of classical projection-based model order reduction methods for parameter-dependent linear equations. A reduced order model is here approximated from its random sketch, which is a set of low-dimensional random projections of the reduced approximation space and the spaces of associated residuals. This approach exploits the fact that the residuals associated with approximations in low-dimensional spaces are themselves contained in low-dimensional spaces. We provide conditions on the dimension of the random sketch for the resulting reduced order model to be quasi-optimal with high probability. Our approach can be used to reduce both complexity and memory requirements. The provided algorithms are well suited for any modern computational environment. All major operations, except solving linear systems of equations, are embarrassingly parallel. Our version of proper orthogonal decomposition can be computed on multiple workstations with a communication cost independent of the dimension of the full order model. The reduced order model can even be constructed in a so-called streaming environment, i.e., under extreme memory constraints. In addition, we provide an efficient way of estimating the error of the reduced order model, which is not only cheaper than the classical approach but also less sensitive to round-off errors. Finally, the methodology is validated on benchmark problems.
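The central mechanism described above, replacing a high-dimensional minimal-residual problem by a small projected one, can be illustrated with a minimal NumPy sketch. Everything here (the toy affine operator `A0 + mu * A1`, the problem sizes, the rescaled Gaussian sketching matrix `Theta`) is an illustrative assumption, not the paper's actual benchmark:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, k = 1000, 10, 300  # full dim, reduced dim, sketch dim (illustrative)

# Toy affine parameter-dependent operator A(mu) = A0 + mu * A1 (hypothetical).
A0 = np.diag(np.linspace(1.0, 10.0, n))
A1 = np.diag(np.linspace(0.0, 1.0, n))
b = rng.standard_normal(n)

# Reduced approximation space: an orthonormal basis U of dimension r.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Random sketching matrix: a rescaled Gaussian, an oblivious l2-embedding.
Theta = rng.standard_normal((k, n)) / np.sqrt(k)

def solve_minres(mu, sketch=False):
    """Minimal-residual reduced solution; optionally from the random sketch."""
    AU = (A0 + mu * A1) @ U  # spans the (low-dimensional) residual space
    if sketch:
        # Small k x r least-squares problem: only sketched quantities needed.
        a, *_ = np.linalg.lstsq(Theta @ AU, Theta @ b, rcond=None)
    else:
        a, *_ = np.linalg.lstsq(AU, b, rcond=None)
    return a

mu = 0.5
a_full = solve_minres(mu)
a_sk = solve_minres(mu, sketch=True)
res = lambda a: np.linalg.norm((A0 + mu * A1) @ (U @ a) - b)
# Quasi-optimality: the sketched residual is close to the true minimum.
print(res(a_sk) / res(a_full))
```

Because the residual space has dimension at most r + 1, a sketch of size k only mildly larger than r already preserves residual norms up to a small distortion with high probability, which is what makes the sketched solve quasi-optimal.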

Abstract:

Solutions to high-dimensional parameter-dependent problems are in great demand in contemporary applied science and engineering. Standard approximation methods for parametric equations can require computational resources that are exponential in the dimension of the parameter space, a phenomenon typically referred to as the curse of dimensionality. To break the curse of dimensionality, one has to appeal to nonlinear methods that exploit the structure of the solution map, such as projection-based model order reduction methods. This thesis proposes novel methods based on randomized linear algebra to enhance the efficiency and stability of projection-based model order reduction methods for solving parameter-dependent equations. Our methodology relies on random projections (or random sketching). Instead of operating with high-dimensional vectors, we first efficiently project them into a low-dimensional space. The reduced model is then constructed, efficiently and in a numerically stable way, from the projections of the reduced approximation space and the spaces of associated residuals. Our approach enables drastic computational savings on essentially any modern computational architecture. For instance, it can reduce the number of flops and the memory consumption, and improve the efficiency of the data flow (characterized by scalability or communication costs). It can be employed to improve the efficiency and numerical stability of classical Galerkin and minimal residual methods. It can also be used for efficient error estimation and for post-processing of the solution of the reduced order model. Furthermore, random sketching makes computationally feasible a dictionary-based approximation method, where for each parameter value the solution is approximated in a subspace with a basis selected from a dictionary of vectors.
We also address the efficient construction (using random sketching) of parameter-dependent preconditioners that can be used to improve the quality of Galerkin projections or to provide effective error certification for problems with ill-conditioned operators. For all proposed methods we provide precise conditions on the random sketch that guarantee accurate and stable estimates with a user-specified probability of success. A priori estimates for choosing the sizes of the random matrices are provided, as well as a more effective adaptive procedure based on a posteriori estimates.
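A priori conditions of the kind mentioned above typically take the form of a lower bound on the number of rows of the sketch in terms of the subspace dimension, the tolerated distortion, and the failure probability. The helper below is a generic illustration of such a bound for a Gaussian sketch; the constant `C` is a placeholder, not a constant from the thesis, and sharper values exist in the literature:

```python
import math

def gaussian_sketch_size(d, eps, delta, C=8.0):
    """Illustrative a priori bound on the number of rows k of a Gaussian
    sketch acting as an (eps, delta) oblivious l2-subspace embedding for
    d-dimensional subspaces: k >= C * (d + log(1/delta)) / eps**2.
    The constant C is a hypothetical placeholder."""
    return math.ceil(C * (d + math.log(1.0 / delta)) / eps**2)

# e.g., embed an 11-dimensional residual space with distortion 0.5
# and failure probability 1%:
k = gaussian_sketch_size(d=11, eps=0.5, delta=0.01)
print(k)  # -> 500
```

Note the bound is independent of the full-order dimension n and depends only logarithmically on the failure probability, which is why such sketches remain cheap even for very large models; the adaptive a posteriori procedure mentioned above can then refine this conservative a priori choice.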