Differential calculus and ordinary differential equations
Course on differential calculus and ordinary differential equations for the ModIA program.
Preliminary remark. See the general documentation for how to retrieve the course material (cloning a Git project), etc.
Course
The whole course is contained in a single document: the Polycopié (lecture notes).
Labs (TP)
We will use the Julia language for the labs. We will create an environment specific to this course; to do so, follow these steps:
- Install the latest version of Julia by following the instructions on the downloads page. Do not install Julia via conda!
- The easiest way to run the labs is to use VSCode.
- Run the cells of the notebook `tp/install.ipynb` in VSCode. You may need to select the Julia kernel when opening the notebook before you can run it. The three cells (sketched just after this list) respectively:
  - activate the project in the current directory;
  - install the packages into the current project (this creates two `.toml` files);
  - load the packages to check that everything works.
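
For reference, here is a minimal sketch of what the three cells typically look like, assuming standard `Pkg` usage. The actual package list is the one in `tp/install.ipynb`; the package names below are only placeholders.

```julia
using Pkg

# Cell 1: activate the project in the current directory
Pkg.activate(".")

# Cell 2: add the course packages to the active project;
# this creates the two files Project.toml and Manifest.toml
Pkg.add(["Plots", "ForwardDiff"])  # placeholder names, see tp/install.ipynb

# Cell 3: load the packages to check that everything works
using Plots, ForwardDiff
```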
Going further
If you are wondering what computing derivatives and solving ordinary differential equations have to do with a machine learning curriculum, here are two references:
- The Elements of Differentiable Programming by Mathieu Blondel and Vincent Roulet (Google). For ODEs, see Section 12.6.
Artificial intelligence has recently experienced remarkable advances, fueled by large models, vast datasets, accelerated hardware, and, last but not least, the transformative power of differentiable programming. This new programming paradigm enables end-to-end differentiation of complex computer programs (including those with control flows and data structures), making gradient-based optimization of program parameters possible. As an emerging paradigm, differentiable programming builds upon several areas of computer science and applied mathematics, including automatic differentiation, graphical models, optimization and statistics. This book presents a comprehensive review of the fundamental concepts useful for differentiable programming. We adopt two main perspectives, that of optimization and that of probability, with clear analogies between the two. Differentiable programming is not merely the differentiation of programs, but also the thoughtful design of programs intended for differentiation. By making programs differentiable, we inherently introduce probability distributions over their execution, providing a means to quantify the uncertainty associated with program outputs.
- Generalizing Scientific Machine Learning and Differentiable Simulation Beyond Continuous Models by Christopher Rackauckas (MIT).
The combination of scientific models into deep learning structures, commonly referred to as scientific machine learning (SciML), has made great strides in the last few years in incorporating models such as ODEs and PDEs into deep learning through differentiable simulation. However, the vast space of scientific simulation also includes models like jump diffusions, agent-based models, and more. Is SciML constrained to the simple continuous cases or is there a way to generalize to more advanced model forms? This talk will dive into the mathematical aspects of generalizing differentiable simulation to discuss cases like chaotic simulations, differentiating stochastic simulations like particle filters and agent-based models, and solving inverse problems of Bayesian inverse problems (i.e. differentiation of Markov Chain Monte Carlo methods). We will then discuss the evolving numerical stability issues, implementation issues, and other interesting mathematical tidbits that are coming to light as these differentiable programming capabilities are being adopted.
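
To make the link with the labs concrete, here is a small, self-contained Julia sketch (not taken from the course) of differentiable simulation: we differentiate the output of a hand-written Euler integrator with respect to a model parameter using ForwardDiff. The ODE and the step count are arbitrary choices for illustration.

```julia
using ForwardDiff

# Explicit Euler integration of x'(t) = -θ x(t), x(0) = 1, on [0, 1].
# Returns the approximation of x(1) as a function of the parameter θ.
function solve_ode(θ; N = 100)
    x  = one(θ)   # x(0) = 1, typed like θ so dual numbers flow through
    dt = 1 / N
    for _ in 1:N
        x += dt * (-θ * x)   # one Euler step
    end
    return x
end

θ = 2.0
# Gradient-based learning needs dx(1)/dθ: forward-mode automatic
# differentiation propagates derivatives through the whole loop.
g = ForwardDiff.derivative(solve_ode, θ)
println("x(1) ≈ ", solve_ode(θ), ",  dx(1)/dθ ≈ ", g)
# Exact values: x(1) = exp(-θ) ≈ 0.1353 and dx(1)/dθ = -exp(-θ) ≈ -0.1353
```

This is the basic mechanism that scientific machine learning builds on: once a simulation is differentiable, its parameters can be fitted by gradient descent.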