Reduced bases have been introduced for the approximation of parametrized PDEs in applications where many online queries are required. Their numerical efficiency for such problems has been theoretically confirmed in Binev et al. (SIAM J. Math. Anal. 43 (2011) 1457–1472) and DeVore et al. (Constructive Approximation 37 (2013) 455–466), where it is shown that the reduced basis space V_n of dimension n, constructed by a certain greedy strategy, has approximation error comparable to that of the optimal space associated to the Kolmogorov n-width of the solution manifold. The greedy construction of the reduced basis space is performed in an offline stage which requires at each step a maximization of the current error over the parameter space. For the purpose of numerical computation, this maximization is performed over a finite training set obtained through a discretization of the parameter domain. Guaranteeing a final approximation error ε for the space generated by the greedy algorithm requires in principle that the snapshots associated to this training set constitute an approximation net for the solution manifold with accuracy of order ε. Hence, the size of the training set is the ε-covering number of the solution manifold M, and this covering number typically behaves like exp(Cε^{−1/s}) for some C > 0 when the solution manifold has n-width decay O(n^{−s}). Thus, the sheer size of the training set prohibits implementation of the algorithm when ε is small. The main result of this paper shows that, if one is willing to accept results which hold with high probability, rather than with certainty, then for a large class of relevant problems one may replace the fine discretization by a random training set of size polynomial in ε^{−1}. Our proof of this fact is established by using inverse inequalities for polynomials in high dimensions.
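As an illustration of the offline stage described in the abstract, the following minimal Python sketch (not taken from the paper) runs a strong greedy selection over a random training set. The parametric solution map solve, the training size M, and the tolerance eps are illustrative stand-ins, and the error at each step is measured by exact orthogonal projection onto the current reduced space rather than by the a posteriori surrogate one would use in practice.

```python
# Minimal sketch of greedy reduced-basis selection over a RANDOM training set.
# Everything below (solve, M, eps, the number of greedy steps) is illustrative.
import numpy as np

rng = np.random.default_rng(0)

N = 200      # dimension of the "truth" discretization space
d = 8        # number of parameters
M = 1000     # size of the random training set (polynomial in 1/eps, not a fine grid)
eps = 1e-6   # target accuracy

# Hypothetical smooth parametric solution map y -> u(y) in R^N,
# standing in for a parametric PDE solver.
A = rng.standard_normal((d, N))
def solve(y):
    # surrogate "PDE solve" at parameter y in [-1, 1]^d; analytic in y
    return 1.0 / (2.0 + 0.5 * np.cos(A.T @ y))

# Random training set drawn uniformly from the parameter domain [-1, 1]^d.
train = rng.uniform(-1.0, 1.0, size=(M, d))
snapshots = np.stack([solve(y) for y in train])          # shape (M, N)

# Greedy loop: at each step add the training snapshot that is worst
# approximated by the current reduced space, then orthonormalize.
basis = np.zeros((0, N))                                 # orthonormal rows
for n in range(30):
    residuals = snapshots - snapshots @ basis.T @ basis  # projection errors
    errs = np.linalg.norm(residuals, axis=1)
    k = int(np.argmax(errs))
    print(f"n = {n:2d}, max training error = {errs[k]:.2e}")
    if errs[k] < eps:
        break
    # Gram-Schmidt step: append the normalized residual of the selected snapshot.
    basis = np.vstack([basis, residuals[k] / errs[k]])
```

In the setting of the paper, the point is that the random training set need only have size polynomial in ε^{−1} for the greedy space built this way to retain, with high probability, near-optimal approximation rates.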
Keywords: Reduced bases, performance bounds, random sampling, entropy numbers, Kolmogorov n-widths, sparse high-dimensional polynomial approximation
Cohen, Albert; Dahmen, Wolfgang; DeVore, Ronald; Nichols, James. Reduced Basis Greedy Selection Using Random Training Sets. ESAIM: Mathematical Modelling and Numerical Analysis, Volume 54 (2020) no. 5, pp. 1509-1524. doi: 10.1051/m2an/2020004. http://www.numdam.org/articles/10.1051/m2an/2020004/
M. Bachmayr, A. Cohen and G. Migliorati, Sparse polynomial approximation of parametric elliptic PDEs. Part I: affine coefficients. ESAIM: M2AN 51 (2017) 321–339. | DOI | Numdam | MR | Zbl
P. Binev, A. Cohen, W. Dahmen, R. DeVore, G. Petrova and P. Wojtaszczyk, Convergence rates for greedy algorithms in reduced basis methods. SIAM J. Math. Anal. 43 (2011) 1457–1472. | DOI | MR | Zbl
A. Buffa, Y. Maday, A.T. Patera, C. Prud'homme and G. Turinici, A priori convergence of the Greedy algorithm for the parametrized reduced basis method. ESAIM: M2AN 46 (2012) 595–603. | DOI | Numdam | MR | Zbl
A. Chkifa, A. Cohen, G. Migliorati, F. Nobile and R. Tempone, Discrete least squares polynomial approximation with random evaluations – application to parametric and stochastic elliptic PDEs. ESAIM: M2AN 49 (2015) 815–837. | DOI | Numdam | MR | Zbl
A. Chkifa, A. Cohen and C. Schwab, Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs. J. Math. Pures Appl. 103 (2015) 400–428. | DOI | MR | Zbl
A. Cohen and R. DeVore, Approximation of high-dimensional parametric PDEs. Acta Numer. 24 (2015) 1–159. | DOI | MR | Zbl
A. Cohen, R. DeVore and C. Schwab, Analytic regularity and polynomial approximation of parametric and stochastic PDEs. Anal. Appl. 9 (2011) 11–47. | DOI | MR | Zbl
A. Cohen and G. Migliorati, Multivariate approximation in downward closed polynomial spaces. In: Contemporary Computational Mathematics – A Celebration of the 80th Birthday of Ian Sloan. Springer, Cham (2018) 233–282. | DOI | MR | Zbl
R. DeVore, G. Petrova and P. Wojtaszczyk, Greedy algorithms for reduced bases in Banach spaces. Constr. Approx. 37 (2013) 455–466. | DOI | MR | Zbl
Y. Maday, A.T. Patera and G. Turinici, A priori convergence theory for reduced-basis approximations of single-parametric elliptic partial differential equations. J. Sci. Comput. 17 (2002) 437–446. | DOI | MR | Zbl
Y. Maday, A.T. Patera and G. Turinici, Global a priori convergence theory for reduced-basis approximations of single-parameter symmetric coercive elliptic partial differential equations. C. R. Acad. Sci. Paris Sér. I Math. 335 (2002) 289–294. | DOI | MR | Zbl
K. Veroy, C. Prud'homme, D.V. Rovas and A.T. Patera, A posteriori error bounds for reduced-basis approximation of parametrized noncoercive and nonlinear elliptic partial differential equations. In: Proc. 16th AIAA Computational Fluid Dynamics Conference (2003).
G. Pisier, The Volume of Convex Bodies and Banach Space Geometry. Cambridge University Press (1989). | DOI | MR | Zbl
G. Rozza, D.B.P. Huynh and A.T. Patera, Reduced basis approximation and a posteriori error estimation for affinely parametrized elliptic coercive partial differential equations: application to transport and continuum mechanics. Arch. Comput. Methods Eng. 15 (2008) 229–275. | DOI | MR | Zbl
S. Sen, Reduced-basis approximation and a posteriori error estimation for many-parameter heat conduction problems. Numer. Heat Transfer B-Fund. 54 (2008) 369–389. | DOI
V.N. Temlyakov, The Marcinkiewicz-type discretization theorems. Constr. Approx. 48 (2018) 337–369. | DOI | MR | Zbl