Course manual 2021/2022

Course content

Mathematical models in the form of PDEs or ODEs arise in many fields of science and engineering, for instance physics, chemistry, biology and climate science. The development of efficient and accurate algorithms to simulate these systems is a classical topic in numerical analysis. A much more recent topic is the impact of errors and uncertainties in, e.g., model parameters, initial conditions and boundary conditions. Even with highly accurate numerical algorithms, such input uncertainties can have a major impact and lead to large uncertainties in the numerical results.

 

In this course we will look at modern mathematical techniques that have been developed to deal with uncertainties in models. In Uncertainty Quantification (UQ), a prototypical problem is to characterize the probability distribution of a model output variable given the distribution of an input parameter, and to do so in an efficient way. 
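To make this concrete, below is a minimal Python sketch of this prototypical forward problem. The model f and the input distribution are illustrative assumptions, not taken from the course materials: the uncertain input is simply sampled and pushed through the model (plain Monte Carlo), and the output distribution is summarized by its first two moments.

```python
# Minimal forward-UQ sketch (illustrative): propagate an uncertain input
# through a toy model with Monte Carlo sampling and summarize the output.
import numpy as np

rng = np.random.default_rng(0)

def f(xi):
    # hypothetical scalar "model"; stands in for an ODE/PDE solver output
    return np.exp(-xi) * np.sin(np.pi * xi)

xi = rng.uniform(-1.0, 1.0, size=100_000)  # uncertain input, xi ~ U(-1, 1)
y = f(xi)                                  # model output samples

print(f"output mean    = {y.mean():.5f}")
print(f"output std dev = {y.std(ddof=1):.5f}")
```

Plain Monte Carlo sampling like this is robust but converges slowly (the error decays like N^(-1/2)); the methods treated in this course aim to be far more efficient for smooth problems.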

 

The focus of this course will be on the mathematical techniques and methodologies developed for UQ. We will briefly review some topics from numerical analysis (orthogonal polynomials, approximation and interpolation, quadrature). These provide a basis for the main UQ methods that we will cover: the stochastic Galerkin method, polynomial chaos (PC) expansions, and stochastic collocation. Furthermore, we will discuss Bayesian calibration (typically using Markov chain Monte Carlo, MCMC) for "inverse UQ" problems, and surrogate modeling with Gaussian processes (GPs). As a final topic, we will look at techniques for sensitivity analysis (in particular, Sobol indices).
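As a preview of the non-intrusive methods, the sketch below builds a gPC surrogate for the same toy model by non-intrusive spectral projection: Legendre polynomials (the family matched to a uniform input) combined with Gauss-Legendre quadrature. The model, truncation degree and quadrature level are illustrative assumptions.

```python
# NISP sketch (illustrative): project a toy model onto Legendre polynomials
# to obtain gPC coefficients, then read off the mean and variance.
import numpy as np
from numpy.polynomial import legendre as leg

def f(xi):
    # hypothetical scalar model with input xi ~ U(-1, 1)
    return np.exp(-xi) * np.sin(np.pi * xi)

degree = 8
nodes, weights = leg.leggauss(degree + 1)       # Gauss-Legendre rule on [-1, 1]

# c_k = E[f(xi) P_k(xi)] / E[P_k(xi)^2], with E[P_k^2] = 1/(2k+1) for U(-1, 1)
coeffs = np.array([
    0.5 * (2 * k + 1) * np.sum(weights * f(nodes) * leg.Legendre.basis(k)(nodes))
    for k in range(degree + 1)
])

mean = coeffs[0]                                # the first mode carries the mean
var = np.sum(coeffs[1:] ** 2 / (2.0 * np.arange(1, degree + 1) + 1.0))
print(f"gPC mean = {mean:.5f}, gPC variance = {var:.5f}")

surrogate = leg.Legendre(coeffs)                # truncated expansion as a cheap surrogate
xs = np.linspace(-1.0, 1.0, 201)
print("max surrogate error:", np.max(np.abs(surrogate(xs) - f(xs))))
```

With only 9 model evaluations this agrees with the Monte Carlo estimates above up to their sampling error, which is the basic appeal of spectral UQ methods for smooth problems.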

Study materials

Literature

  • D. Xiu, Numerical Methods for Stochastic Computations. A Spectral Method Approach. Princeton University Press, 2010.

    This is the primary book for the course. We cover chapters 3-8.

  • R.C. Smith, Uncertainty Quantification. Theory, Implementation, and Applications. SIAM, 2014.

    We cover a few sections (mostly from chapters 8 and 15) on Bayesian calibration and on sensitivity analysis. They will be made available via Canvas.

  • C.E. Rasmussen and C.K.I. Williams, Gaussian Processes for Machine Learning. The MIT Press, 2006.

    We cover sections 2.1-2.3. The book is available online via http://www.gaussianprocess.org/gpml/

  • T.J. Sullivan, Introduction to Uncertainty Quantification. Springer, 2015.

    This book is background literature and provides more in-depth analysis of many of the course topics. It is available online via the UvA library.

Objectives

  • students can identify which orthogonal polynomials to use for standard input distributions
  • students can generalize these orthogonal polynomials to classes of distributions and to multidimensional cases
  • students can construct approximations based on the Karhunen-Loève expansion and on the generalized PC expansion
  • students can derive the system of coupled equations in the stochastic Galerkin method
  • students can apply the stochastic Galerkin method to prototype model problems
  • students can construct approximations using stochastic collocation (SC) and non-intrusive spectral projection (NISP)
  • students can apply SC and NISP methods to prototype model problems and implement them on a computer
  • students can use MCMC to sample from distributions (a brief MCMC sketch follows this list)
  • students can perform Bayesian calibration using MCMC
  • students can explain the concept of GPs and how to use them for surrogate modeling
  • students can apply GP regression to data
  • students can describe the difference between local and global sensitivity analysis
  • students can describe what Sobol indices are and distinguish between first and total order Sobol indices
  • students can compute Sobol indices with Saltelli's algorithm (a brief Sobol sketch follows this list)
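
For the MCMC objectives, a minimal random-walk Metropolis sketch is given below; the target density is an arbitrary bimodal example chosen purely for illustration, and the step size, chain length and burn-in are ad-hoc choices.

```python
# Random-walk Metropolis sketch (illustrative): sample a 1-D unnormalized
# target density; in Bayesian calibration the target would be a posterior.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # unnormalized log-density of an equal-weight two-Gaussian mixture
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

n_steps, step = 50_000, 1.0
chain = np.empty(n_steps)
x, lp, accepted = 0.0, log_target(0.0), 0

for t in range(n_steps):
    prop = x + step * rng.normal()             # symmetric random-walk proposal
    lp_prop = log_target(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject step
        x, lp = prop, lp_prop
        accepted += 1
    chain[t] = x

print(f"acceptance rate ~ {accepted / n_steps:.2f}")
print(f"sample mean (after burn-in) ~ {chain[5_000:].mean():.3f}")  # near 0 by symmetry
```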
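Likewise, for the sensitivity-analysis objectives, here is a sketch of Saltelli's sampling scheme for first-order and total-order Sobol indices. The Ishigami function is a standard SA benchmark; the estimators used are the common Saltelli (first-order) and Jansen (total-order) forms, which may differ in detail from the variants treated in the course.

```python
# Sobol-index sketch (illustrative): Saltelli-style pick-and-freeze sampling
# on the Ishigami benchmark function with three uniform inputs on (-pi, pi).
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(42)
N, d = 2 ** 14, 3

A = rng.uniform(-np.pi, np.pi, size=(N, d))    # two independent input matrices
B = rng.uniform(-np.pi, np.pi, size=(N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]), ddof=1) # total output variance

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                        # "pick-and-freeze": swap column i
    fABi = ishigami(ABi)
    S1 = np.mean(fB * (fABi - fA)) / var       # first-order index estimator
    ST = 0.5 * np.mean((fA - fABi) ** 2) / var # total-order (Jansen) estimator
    print(f"x{i + 1}: S1 = {S1:.3f}, ST = {ST:.3f}")
```

For the Ishigami function with a = 7 and b = 0.1 the exact indices are known (e.g. S3 = 0 but ST3 > 0, since x3 acts only through its interaction with x1), which makes it a convenient self-check.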

Teaching methods

  • Presentation/symposium
  • Lecture
  • Exercises

Students work in small teams on the homework exercises. Some of the exercises count towards the final grade; for these, a brief report must be handed in (one per team). In addition, each team is expected to give two short joint presentations on its results for an exercise.

Learning activities

Activity                 | Hours
Course meetings          | 28
Self study and exercises | 140
Total                    | 168 (6 EC × 28 hours)

Attendance

This programme does not have requirements concerning attendance (TER-B).

Assessment

Item               | Weight     | Details
Final exam         | 0.75 (75%) | Exam grade must be at least 5.0
Homework exercises | 0.25 (25%) | One grade per team
Presentations      | pass/fail  | Must be ≥ pass

The final exam consists of a take-home exam assignment for which students hand in a report that is assessed and discussed individually. The final grade for the course is determined for 75% by the final exam grade and for 25% by the average grade of the 4 submitted homework exercises. The final exam grade must be at least 5.0; if it is below 5.0, the final grade for the course is the exam grade (thus, a failing grade).

For the homework exercises, a single grade per team is given. Each team gives a joint presentation twice; these presentations are assessed with a pass/fail grade for the team. Without a pass grade for both presentations, no final grade for the course will be given.

If a resit of the final exam is needed, the grades for the homework exercises and the presentations still count towards the final grade in the same manner as described above (i.e., 75% resit exam grade, 25% homework, pass grade for presentations, provided the resit exam grade is at least 5.0). The resit will be a take-home exam assignment with discussion of the submitted report, similar to the regular final exam.

Fraud and plagiarism

The 'Regulations governing fraud and plagiarism for UvA students' apply to this course. Compliance will be monitored carefully. Upon suspicion of fraud or plagiarism, the Examinations Board of the programme will be informed. For the 'Regulations governing fraud and plagiarism for UvA students', see: www.student.uva.nl

Course structure

Tentative course schedule:

Week | Date    | Content                                                                                        | Literature
1    | 6 Sept  | Introduction                                                                                   | -
2    | 13 Sept | Orthogonal polynomials. Approximation and interpolation with polynomials.                      | Xiu, ch. 3
3    | 20 Sept | Representing input uncertainty. Multivariate Gaussian distributions. Karhunen-Loève expansion. | Xiu, ch. 4
4    | 27 Sept | ODEs and PDEs with random parameters. Generalized Polynomial Chaos expansion.                  | Xiu, ch. 5
5    | 4 Oct   | Multivariate gPC. Stochastic Galerkin method.                                                  | Xiu, ch. 5-6
6    | 11 Oct  | Stochastic Galerkin method. Multivariate Stochastic Galerkin.                                  | Xiu, ch. 6
7    | 18 Oct  | Stochastic Collocation method.                                                                 | Xiu, ch. 7
Break
8    | 2 Nov   | Multivariate Stochastic Collocation. Non-Intrusive Spectral Projection.                        | Xiu, ch. 7
9    | 9 Nov   | Inverse UQ: Bayesian calibration and MCMC.                                                     | Xiu, 8.2; Smith, 8.1-8.3
10   | 16 Nov  | Inverse UQ: Bayesian calibration and MCMC.                                                     | Xiu, 8.2; Smith, 8.1-8.3
11   | 23 Nov  | Surrogate modeling. Gaussian Processes (GPs).                                                  | Rasmussen & Williams, 2.1-2.3
12   | 30 Nov  | GPs, continued. Sensitivity Analysis: local and global SA. ANOVA.                              | Smith, 13.5, 15.1
13   | 7 Dec   | Sensitivity Analysis: Sobol indices. Saltelli algorithm.                                       | Smith, 13.5, 15.1
14   | 14 Dec  | Sensitivity Analysis: SA with surrogates.                                                      | Smith, 13.5, 15.1

Timetable

The schedule for this course is published on DataNose.

Contact information

Coordinator

  • prof. dr. D.T. Crommelin