We construct and discuss a functional equation with a contraction property. The solutions are real univariate polynomials. The series solving the natural fixed-point iterations have an immediate interpretation in terms of neural networks with recursive structure and controlled accuracy.
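To make the fixed-point/network correspondence concrete, here is a minimal Python sketch of a closely related, classical construction: the Takagi-type series approximating x^2 on [0, 1] with ReLU hat functions (Yarotsky's construction, in the spirit of the Takagi-function literature this article builds on). It is illustrative only, not the article's own functional equation, and the names hat and takagi_approx are ours. Each iteration reuses the previous layer's output, so the truncated series is exactly a recursive ReLU network whose error contracts by a factor of 1/4 per added layer.

import numpy as np

def hat(x):
    # Hat function g on [0, 1], realizable with two ReLU units:
    # g(x) = 2*relu(x) - 4*relu(x - 1/2) for x in [0, 1].
    return 2.0 * np.maximum(x, 0.0) - 4.0 * np.maximum(x - 0.5, 0.0)

def takagi_approx(x, m):
    # m-term Takagi-type series f_m(x) = x - sum_{s=1}^m g^{o s}(x) / 4^s.
    # f_m is the piecewise-linear interpolant of x**2 on the grid k / 2^m,
    # so |f_m(x) - x**2| <= 4**-(m+1): the iteration is a contraction.
    f, g = x.copy(), x.copy()
    for s in range(1, m + 1):
        g = hat(g)               # g^{o s}: one more hidden ReLU layer
        f = f - g / 4.0**s       # correction shrinks by 1/4 per layer
    return f

x = np.linspace(0.0, 1.0, 1001)
for m in (2, 4, 6):
    err = np.max(np.abs(takagi_approx(x, m) - x**2))
    print(f"m={m}: max error {err:.2e}, bound {4.0**-(m+1):.2e}")

Because each layer is built from the previous one, depth grows linearly while accuracy improves geometrically; this is the kind of controlled-accuracy recursion the abstract refers to.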
Després, Bruno; Ancellin, Matthieu. A functional equation with polynomial solutions and application to Neural Networks. Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072. doi: 10.5802/crmath.124. http://www.numdam.org/articles/10.5802/crmath.124/