Neural Ordinary Differential Equations (Neural ODEs, or NODEs) constitute a key class of models in modern machine learning. Introduced in 2018 as a continuous-time analogue of ResNets, NODEs offer a natural framework for studying deep neural networks through tools from mathematical analysis and control theory. Since their introduction, a wide variety of NODE architectures has been proposed, each exhibiting distinct modeling capabilities and limitations. In this talk, I present a taxonomy of several classical NODE models. Starting from standard (time-dependent) NODEs, I then consider semi-autonomous and fully autonomous architectures. These models are compared from a controllability perspective, highlighting their classification and trajectory-tracking properties.
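The continuous-time analogue of ResNets mentioned above rests on a standard observation (not specific to this talk): a residual block x_{k+1} = x_k + h·f(x_k) is exactly one forward-Euler step of the ODE dx/dt = f(x(t)). A minimal sketch with a toy vector field (the function `f` and all parameter choices below are illustrative assumptions, not the models discussed in the talk):

```python
import numpy as np

def f(x):
    # Toy vector field; in a Neural ODE this would be a trained network.
    return np.tanh(x)

def resnet_forward(x0, depth, h):
    """A stack of residual blocks: x_{k+1} = x_k + h * f(x_k)."""
    x = x0
    for _ in range(depth):
        x = x + h * f(x)   # residual connection == explicit Euler update
    return x

def node_forward(x0, T, steps):
    """Forward-Euler integration of dx/dt = f(x) over [0, T]."""
    return resnet_forward(x0, steps, T / steps)

x0 = np.array([1.0, -0.5])
# A 100-block ResNet with step size h = 0.01 computes the same map as
# Euler integration of the ODE on [0, 1] with 100 steps.
deep_net = resnet_forward(x0, depth=100, h=0.01)
ode_flow = node_forward(x0, T=1.0, steps=100)
print(np.allclose(deep_net, ode_flow))
```

In this picture, network depth plays the role of integration time, which is what makes tools from control theory (controllability, trajectory tracking) applicable.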
The talk is based on joint work with Z. Li, K. Liu, and E. Zuazua, as well as ongoing work with A. Álvarez-López.