Course manual 2023/2024

Course content

Mathematical models in the form of differential equations (PDEs or ODEs) arise in many fields of science and engineering, from physics to climate science and from aerospace engineering to epidemiology. In many cases they must be solved numerically. The development of efficient and accurate algorithms to simulate (i.e., solve) these systems is a classical topic in numerical analysis. A much more recent topic is the investigation of the impact of uncertainties in, e.g., model parameters, initial conditions and boundary conditions. Even with very accurate numerical algorithms, such uncertainties can have a major impact and lead to large uncertainties in numerical results.

 

In this course we will look at modern mathematical techniques that have been developed to deal with uncertainties in models. In Uncertainty Quantification (UQ), a prototypical problem is to characterize the probability distribution of a model output variable given the distribution of model input parameters, and to do so in an efficient way. Other typical questions are how to infer the probability distribution of an input parameter from observations (“inverse UQ”), and how to assess which model parameters induce the most uncertainty in the model output (sensitivity analysis). UQ is an exciting and very active area of research, involving elements from numerical analysis, differential equations, probability theory, statistics, and more.
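The prototypical forward problem described above can be sketched in a few lines of Python. In the minimal Monte Carlo sketch below, the toy model f(z) = exp(z) with a uniform input, and all variable names, are illustrative assumptions and not course material:

```python
# Minimal Monte Carlo sketch of forward uncertainty propagation:
# given the distribution of an input parameter, estimate the
# distribution (here: mean and spread) of the model output.
import numpy as np

rng = np.random.default_rng(42)

def model(z):
    """Toy 'model': output as a function of one uncertain input parameter."""
    return np.exp(z)

z_samples = rng.uniform(-1.0, 1.0, size=100_000)  # uncertain input Z ~ U(-1, 1)
y_samples = model(z_samples)                      # push samples through the model

print(y_samples.mean())   # close to the exact mean (e - 1/e)/2 ≈ 1.1752
print(y_samples.std())
```

With enough samples, the histogram of `y_samples` approximates the output distribution; the main UQ methods in the course aim to obtain such information far more efficiently.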

 

The focus of this course will be on the mathematical theory and techniques developed for UQ. We will briefly review some topics from numerical analysis (orthogonal polynomials, approximation and interpolation, quadrature). These provide a basis for several main UQ methods that we will cover: stochastic Galerkin, polynomial chaos expansion (PCE), stochastic collocation. Furthermore, we will discuss Bayesian calibration (typically using Markov chain Monte Carlo, MCMC) for "inverse UQ" problems, and surrogate modeling with Gaussian Processes (GPs). As a final topic, we will look at techniques for sensitivity analysis (especially, Sobol indices).
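To give a flavor of how the numerical-analysis ingredients (orthogonal polynomials, quadrature) feed into non-intrusive spectral methods, here is a minimal sketch: computing the first Legendre/gPC coefficients of a model output with a uniform input by Gauss-Legendre quadrature. The toy model exp(z), the 8-point rule and the truncation at degree 3 are illustrative assumptions, not course code:

```python
# Sketch of a non-intrusive spectral projection for a toy model:
# compute gPC (Legendre) coefficients of f(Z), Z ~ U(-1, 1),
# by Gauss-Legendre quadrature.
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

f = np.exp                    # toy model output as a function of the input
nodes, weights = leggauss(8)  # 8-point Gauss-Legendre rule on [-1, 1]

# gPC coefficient c_k = (2k+1)/2 * integral of f(z) * P_k(z) over [-1, 1]
# (Legendre orthogonality), approximated by the quadrature rule.
coeffs = []
for k in range(4):
    basis = np.zeros(k + 1)
    basis[k] = 1.0            # coefficient vector selecting P_k
    c_k = (2 * k + 1) / 2 * np.sum(weights * f(nodes) * legval(nodes, basis))
    coeffs.append(c_k)

mean = coeffs[0]              # c_0 equals the mean of f(Z) under U(-1, 1)
print(mean)                   # close to the exact value (e - 1/e)/2 ≈ 1.1752
```

Note that only model evaluations at the quadrature nodes are needed, which is the defining feature of non-intrusive methods.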

Study materials

Literature

  • D. Xiu, Numerical Methods for Stochastic Computations. A Spectral Method Approach. Princeton University Press, 2010.

    This is the primary book for the course. We cover chapters 3-8.

  • R.C. Smith, Uncertainty Quantification. Theory, Implementation, and Applications. SIAM, 2014.

    We cover a few sections (mostly from chapters 8 and 15) on Bayesian calibration and on sensitivity analysis. They will be made available via Canvas.

  • C.E. Rasmussen and C.K.I. Williams, Gaussian Processes for Machine Learning. The MIT Press, 2006.

    We cover sections 2.1-2.3. The book is available online via http://www.gaussianprocess.org/gpml/

Other

  • Lecture slides

Objectives

  • Construct approximations based on spectral expansions, in particular the gPC and KL expansions.
  • Explain the following UQ methods and the main steps involved in each: Stochastic Galerkin, Stochastic Collocation, NISP, GP regression.
  • Interpret the numerical results that these UQ methods give for prototype model problems.
  • Generalize these UQ methods from one-dimensional to multi-dimensional problems.
  • Use MCMC to sample from arbitrary target distributions.
  • Assess the parameter uncertainty of a model problem by using Bayesian calibration with MCMC.
  • Explain what 1st order and total order Sobol indices are, and what information they give about model sensitivities.
  • Analyze the global sensitivity of prototype models using Saltelli's algorithm for computing Sobol indices.
  • Examine the suitability of parameter settings of various UQ algorithms through experimentation.
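As an example of the MCMC objective above, a basic random-walk Metropolis sampler can be written in a few lines. In the sketch below, the target density (a standard normal), the step size and the sample count are illustrative choices, not course code:

```python
# Random-walk Metropolis sketch: sample from an arbitrary
# unnormalized target density (here a standard normal).
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    """Unnormalized target density: standard normal."""
    return np.exp(-0.5 * x**2)

def metropolis(n_samples, step=1.0, x0=0.0):
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()   # symmetric random-walk proposal
        # accept with probability min(1, target(proposal) / target(x))
        if rng.uniform() < target(proposal) / target(x):
            x = proposal
        samples[i] = x                        # on rejection, repeat current state
    return samples

chain = metropolis(50_000)
print(chain.mean(), chain.var())  # both close to the target's 0 and 1
```

Because only the ratio of target values is used, the normalizing constant of the target density is never needed, which is what makes MCMC practical for Bayesian posteriors.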

Teaching methods

  • Presentation/symposium
  • Lecture
  • Exercises

Students work in small teams on the weekly homework exercises. Some of the exercises count towards the final grade; for these, a brief report must be handed in (one per team). Each team is expected to give a short joint presentation on its results for an exercise twice during the course.

Learning activities

Activity                    Hours
Course meetings              28
Self study and exercises    140
Total                       168

(6 EC x 28 hours)

Attendance

This programme does not have requirements concerning attendance (TER-B).

Assessment

Final grade:

  • Final exam assignment + discussion (oral): weight 0.75 (75%)
  • Average of 4 graded homework exercises: weight 0.25 (25%)
  • Presentations: must be ≥ pass

The final exam consists of a take-home exam assignment for which students hand in a report that will be assessed and discussed individually in an oral exam. The final grade for the course is determined for 75% by the final exam grade and for 25% by the average of 4 submitted homework exercises. The final exam grade must be at least 5.0. If the exam grade is below 5.0, the final grade for the course is the exam grade (thus, a failure grade).
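For concreteness, the grading rule described above can be written as a small function (the function name and signature are purely illustrative):

```python
# Illustrative sketch of the course grading rule described above.
def final_grade(exam, homework_avg, presentations_passed):
    """75% exam + 25% homework average, with a minimum exam grade of 5.0;
    no final grade at all without a pass for the presentations."""
    if not presentations_passed:
        return None               # no final grade without a presentation pass
    if exam < 5.0:
        return exam               # exam below 5.0: final grade = exam grade
    return 0.75 * exam + 0.25 * homework_avg

print(final_grade(7.0, 8.0, True))  # prints 7.25
print(final_grade(4.5, 9.0, True))  # prints 4.5 (exam below the threshold)
```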

For the homework exercises, a single grade per team is given. Each team gives a (joint) presentation twice; each presentation is assessed with a pass/fail grade for the team. Without a pass grade for the two presentations, no final grade for the course will be given.

In case a resit of the final exam is needed, the grades for the homework exercises and the presentations will still count towards the final grade, in the same manner as described above (i.e., 75% resit final exam grade, 25% homework, pass grade for presentations, provided the exam grade is at least 5.0). A resit will be a take-home exam assignment with discussion of the submitted report, similar to the regular final exam. No resit is possible for the homework exercises or the presentations.

Fraud and plagiarism

The 'Regulations governing fraud and plagiarism for UvA students' apply to this course. Compliance will be monitored carefully. Upon suspicion of fraud or plagiarism, the Examinations Board of the programme will be informed. For the 'Regulations governing fraud and plagiarism for UvA students', see: www.student.uva.nl

Course structure

Tentative course schedule:

Week  Date     Content                                                                                        Literature
 1    6 Sept   Introduction
 2    13 Sept  Orthogonal polynomials. Approximation and interpolation with polynomials.                      Xiu, ch. 3
 3    20 Sept  Representing input uncertainty. Multivariate Gaussian distributions. Karhunen-Loève expansion. Xiu, ch. 4
 4    27 Sept  ODEs and PDEs with random parameters. Generalized Polynomial Chaos expansion.                  Xiu, ch. 5
 5    4 Oct    Multivariate gPC. Stochastic Galerkin method.                                                  Xiu, ch. 5-6
 6    11 Oct   Stochastic Galerkin method. Multivariate Stochastic Galerkin.                                  Xiu, ch. 6
 7    18 Oct   Stochastic Collocation method.                                                                 Xiu, ch. 7
               Break
 8    1 Nov    Multivariate Stochastic Collocation. Non-Intrusive Spectral Projection.                        Xiu, ch. 7
 9    8 Nov    Inverse UQ: Bayesian calibration and MCMC.                                                     Xiu, 8.2; Smith, 8.1-8.3
10    15 Nov   Inverse UQ: Bayesian calibration and MCMC (continued).                                         Xiu, 8.2; Smith, 8.1-8.3
11    22 Nov   Surrogate modeling. Gaussian Processes (GPs).                                                  Rasmussen & Williams, 2.1-2.3
12    29 Nov   GPs, continued. Sensitivity Analysis: local and global SA. ANOVA.                              Smith, 13.5, 15.1
13    6 Dec    Sensitivity Analysis: Sobol indices. Saltelli algorithm.                                       Smith, 13.5, 15.1
14    13 Dec   Sensitivity Analysis: SA with surrogates.                                                      Smith, 13.5, 15.1

Timetable

The schedule for this course is published on DataNose.

Contact information

Coordinator

  • prof. dr. D.T. Crommelin