6 EC
Semester 1, period 1, 2
5334UQDA6Y
Mathematical models in the form of differential equations (PDEs or ODEs) arise in many fields of science and engineering, from physics to climate science and from aerospace engineering to epidemiology. In many cases they must be solved numerically. The development of efficient and accurate algorithms to simulate (i.e., solve) these systems is a classical topic in numerical analysis. A much more recent topic is the investigation of the impact of uncertainties in, for example, model parameters, initial conditions, and boundary conditions. Even with very accurate numerical algorithms, such uncertainties can have a major impact and lead to large uncertainties in numerical results.
In this course we will look at modern mathematical techniques that have been developed to deal with uncertainties in models. In Uncertainty Quantification (UQ), a prototypical problem is to characterize the probability distribution of a model output variable given the distribution of model input parameters, and to do so in an efficient way. Other typical questions are how to infer the probability distribution of an input parameter from observations (“inverse UQ”), and how to assess which model parameters induce the most uncertainty in the model output (sensitivity analysis). UQ is an exciting and very active area of research, involving elements from numerical analysis, differential equations, probability theory, statistics, and more.
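As a minimal illustration of this prototypical forward problem (a hypothetical sketch, not part of the course materials), plain Monte Carlo sampling pushes samples of an uncertain input parameter through a model and inspects the resulting output distribution. The model and the input distribution below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: exponential decay y(t) = y0 * exp(-k * t),
# evaluated at t = 1 with an uncertain decay rate k.
def model(k, y0=1.0, t=1.0):
    return y0 * np.exp(-k * t)

# Input uncertainty: k ~ Uniform(0.5, 1.5).
k_samples = rng.uniform(0.5, 1.5, size=100_000)

# Forward propagation: the output samples characterize the
# distribution of the model output induced by the uncertain input.
y_samples = model(k_samples)

# The exact output mean here is e^{-0.5} - e^{-1.5} ≈ 0.3834.
print(f"mean ≈ {y_samples.mean():.4f}, std ≈ {y_samples.std():.4f}")
```

Monte Carlo is simple and robust but converges slowly; the methods treated in this course (spectral expansions, collocation, surrogates) aim to characterize such output distributions far more efficiently.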
The focus of this course will be on the mathematical theory and techniques developed for UQ. We will briefly review some topics from numerical analysis (orthogonal polynomials, approximation and interpolation, quadrature). These provide the basis for the main UQ methods that we will cover: the polynomial chaos expansion (PCE), the stochastic Galerkin method, and stochastic collocation. Furthermore, we will discuss Bayesian calibration (typically using Markov chain Monte Carlo, MCMC) for "inverse UQ" problems, and surrogate modeling with Gaussian Processes (GPs). As a final topic, we will look at techniques for sensitivity analysis (especially Sobol indices).
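To give a flavour of these methods, the sketch below (a hedged illustration assuming Python with NumPy; the model f is hypothetical) uses non-intrusive spectral projection to compute PCE coefficients for a single uniform input via Gauss-Legendre quadrature; the output mean and variance then follow directly from the coefficients:

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical scalar model of one random input Z ~ Uniform(-1, 1).
def f(z):
    return np.exp(z)

# Gauss-Legendre nodes and weights for the projection integrals.
nodes, weights = legendre.leggauss(10)

order = 4  # truncation order of the PCE
coeffs = np.zeros(order + 1)
for n in range(order + 1):
    Pn = legendre.Legendre.basis(n)(nodes)  # Legendre polynomial P_n at the nodes
    # c_n = E[f(Z) P_n(Z)] / E[P_n(Z)^2]; for Z ~ U(-1, 1), E[P_n^2] = 1/(2n + 1)
    coeffs[n] = (2 * n + 1) / 2 * np.sum(weights * f(nodes) * Pn)

# Mean and variance of f(Z), read off from the PCE coefficients.
mean = coeffs[0]
var = np.sum(coeffs[1:] ** 2 / (2 * np.arange(1, order + 1) + 1))
```

The Legendre basis matches the uniform input distribution; other input distributions pair with other orthogonal polynomial families, which is the core idea of generalized polynomial chaos covered in the course.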
D. Xiu, Numerical Methods for Stochastic Computations. A Spectral Method Approach. Princeton University Press, 2010.
This is the primary book for the course. We cover chapters 3-8.
R.C. Smith, Uncertainty Quantification. Theory, Implementation, and Applications. SIAM, 2014.
We cover a few sections (mostly from chapters 8 and 15) on Bayesian calibration and on sensitivity analysis. They will be made available via Canvas.
C.E. Rasmussen and C.K.I. Williams, Gaussian Processes for Machine Learning. The MIT Press, 2006.
We cover sections 2.1-2.3. The book is available online via http://www.gaussianprocess.org/gpml/
Lecture slides
Students work in small teams on the weekly homework exercises. Some of the exercises count towards the final grade; for these, a brief report must be handed in (one per team). Each team is expected to give a short joint presentation on its results for an exercise twice during the course.
| Activity | Hours |
|---|---|
| Course meetings | 28 |
| Self study and exercises | 140 |
| Total | 168 (6 EC × 28 hours) |
This programme does not have requirements concerning attendance (TER-B).
| Item and weight | Details |
|---|---|
| Final grade | |
| 0.25 (25%) | Average of 4 graded homework exercises |
| 0.75 (75%) | Final exam assignment + discussion (oral) |
| Presentations | Must be ≥ pass |
The final exam consists of a take-home exam assignment for which students hand in a report; the report is assessed and discussed individually in an oral exam. The final grade for the course is determined for 75% by the final exam grade and for 25% by the average of the 4 submitted homework exercises. The final exam grade must be at least 5.0; if it is below 5.0, the final grade for the course is the exam grade (thus, a failing grade).
For the homework exercises, a single grade per team is given. Each team gives a joint presentation twice; each presentation is assessed with a pass/fail grade for the team. Without a pass grade for both presentations, no final grade for the course will be given.
In case a resit of the final exam is needed, the grades for the homework exercises and the presentations will still count towards the final grade, in the same manner as described above (i.e., 75% resit final exam grade, 25% homework, pass grade for presentations, provided the exam grade is at least 5.0). A resit will be a take-home exam assignment with discussion of the submitted report, similar to the regular final exam. No resit is possible for the homework exercises or the presentations.
The 'Regulations governing fraud and plagiarism for UvA students' apply to this course. Compliance will be monitored carefully. Upon suspicion of fraud or plagiarism, the Examinations Board of the programme will be informed. For the 'Regulations governing fraud and plagiarism for UvA students', see: www.student.uva.nl
Tentative course schedule:
| Week | Date | Content | Literature |
|---|---|---|---|
| 1 | 5 Sept | Introduction | |
| 2 | 12 Sept | Orthogonal polynomials. Approximation and interpolation with polynomials. | Xiu, ch. 3 |
| 3 | 19 Sept | Representing input uncertainty. Multivariate Gaussian distributions. Karhunen-Loève expansion. | Xiu, ch. 4 |
| 4 | 26 Sept | ODEs and PDEs with random parameters. Generalized Polynomial Chaos expansion. | Xiu, ch. 5 |
| 5 | 3 Oct | Multivariate gPC. Stochastic Galerkin method. | Xiu, ch. 5-6 |
| 6 | 13 Oct | Stochastic Galerkin method. Multivariate Stochastic Galerkin. | Xiu, ch. 6 |
| 7 | 17 Oct | Stochastic Collocation method. | Xiu, ch. 7 |
| Break | | | |
| 8 | 3 Nov | Multivariate Stochastic Collocation. Non-Intrusive Spectral Projection. | Xiu, ch. 7 |
| 9 | 10 Nov | Inverse UQ: Bayesian calibration and MCMC. | Xiu, 8.2; Smith, 8.1-8.3 |
| 10 | 17 Nov | Inverse UQ: Bayesian calibration and MCMC. | Xiu, 8.2; Smith, 8.1-8.3 |
| 11 | 24 Nov | Surrogate modeling. Gaussian Processes (GPs). | Rasmussen & Williams, 2.1-2.3 |
| 12 | 1 Dec | GPs, continued. Sensitivity Analysis: local and global SA. ANOVA. | Smith, 13.5, 15.1 |
| 13 | 8 Dec | Sensitivity Analysis: Sobol indices. Saltelli algorithm. | Smith, 13.5, 15.1 |
| 14 | 15 Dec | Sensitivity Analysis: SA with surrogates. | Smith, 13.5, 15.1 |
The schedule for this course is published on DataNose.