Course manual 2025/2026

Course content

The Neural Networks course introduces second-year BSc Artificial Intelligence students to the principles, mechanisms, and architectures underlying modern neural computation. The course progresses from early biologically inspired models and classical perceptrons to contemporary deep learning systems that form the backbone of current AI practice.

Students first develop a solid conceptual and mathematical understanding of feedforward neural networks, including activation functions, multilayer perceptrons, and the backpropagation algorithm. The course then addresses the practical realities of training neural networks, covering optimization, regularization, and best practices for building reliable and reproducible learning systems.

Building on these foundations, the course introduces major neural architectures used in modern AI, including convolutional neural networks, recurrent neural networks, and transformer models. The final part of the course situates these developments in a broader historical and biological context, examining Hebbian learning, Hopfield networks, Boltzmann machines, and other influential models that shaped the evolution of neural network research.

Throughout the course, theoretical insight is balanced with practical experience through hands-on exercises and supervised training sessions. Students completing the course will be prepared to critically understand, implement, and evaluate neural network models, and to engage with more advanced coursework and research in artificial intelligence.

Study materials

Literature

  • https://uvadlc.github.io/lectures-nov2020.html#

  • https://uvadlc.github.io/

  • https://www.deeplearningbook.org/

Syllabus

  • Lecture 1: Introduction to neural networks — history, motivation, and the biological metaphor
  • Lecture 2: The perceptron and multilayer perceptrons (MLPs)
  • Lecture 3: Activation functions, layers, and parameters
  • Lecture 4: Backpropagation — intuition, derivation, and computational graph perspective
  • Lecture 5: Fundamentals of optimization — gradient descent, momentum, adaptive methods (Adam, RMSProp), loss surfaces. Challenges and techniques — vanishing/exploding gradients, initialization, learning-rate scheduling, convergence diagnostics
  • Lecture 6: Regularization methods — L1/L2 penalties, dropout, batch normalization, early stopping, data augmentation, capacity control
  • Lecture 7: Practical considerations and good practices — training pipelines, debugging, reproducibility, validation strategies, and tuning heuristics
  • Lecture 8: Convolutional Neural Networks (CNNs) and Residual Networks
  • Lecture 9: Recurrent Neural Networks (RNNs) and sequence modeling
  • Lecture 10: Transformers and attention mechanisms — from sequence models to large-scale architectures
  • Lecture 11: Generative models — autoencoders, VAEs, GANs, and diffusion models (conceptual overview)
  • Lecture 12: Hebbian learning, biological principles of learning, and the evolution from biological to artificial networks
  • Lecture 13: Historical neural networks — Hopfield, Boltzmann, and Echo State Networks; the perceptron controversy, connectionism revival, and universal approximation theorem
  • Lecture 14: Q&A / exam prep
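As a small taste of the historical material in Lectures 12 and 13, the sketch below (illustrative only, not course-provided code) uses the Hebbian rule to store one pattern in a Hopfield network and then recalls it from a corrupted starting state:

```python
import numpy as np

# Store a single bipolar pattern in a Hopfield network with the
# Hebbian rule, then recover it from a corrupted starting state.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
N = len(pattern)

# Hebbian weights: W = p p^T / N, with no self-connections.
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0.0)

# Flip two bits, then run synchronous sign updates until convergence.
state = pattern.copy()
state[0] *= -1
state[3] *= -1
for _ in range(10):
    new_state = np.where(W @ state >= 0, 1, -1)
    if np.array_equal(new_state, state):
        break
    state = new_state

print(np.array_equal(state, pattern))  # True: the stored pattern is recalled
```

The pattern size and the two flipped bits are arbitrary choices; with a single stored pattern, one synchronous update already restores it.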

Practical training material

Objectives

  • The student knows what a perceptron and a multilayer perceptron (MLP) are
  • The student knows what nonlinear activation functions, neural network layers and modules, and neural network parameters are
  • The student knows the backpropagation algorithm
  • The student knows optimization and regularization algorithms for training neural networks
  • The student knows convolutional neural network (CNN), deep neural network (DNN), and transformer architectures
  • The student knows recurrent neural networks (RNNs)
  • The student knows the Hebbian learning rule, Hopfield networks, and Boltzmann machines
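As an informal illustration of the first few objectives, the sketch below trains a tiny multilayer perceptron on XOR with manually derived backpropagation. It is illustrative only, not official course material; the network size, learning rate, and number of steps are arbitrary choices.

```python
import numpy as np

# A tiny 2-4-1 MLP with sigmoid activations, trained on XOR by
# plain gradient descent with manually derived backpropagation.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 4)
    out = sigmoid(h @ W2 + b2)    # predictions, shape (4, 1)

    # Backward pass: gradients of the mean squared error.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Gradient-descent parameter updates.
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

# After successful training, outputs should approach the XOR targets 0, 1, 1, 0.
print(np.round(out.ravel(), 2))
```

The practical assignments use a proper framework rather than raw numpy, but the forward/backward structure is the same.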

Teaching methods

  • Lecture
  • Computer lab session/practical training
  • Presentation/symposium
  • Working independently on e.g. a project or thesis
  • Thirteen lectures that teach the theoretical components of the ILOs.
  • Practical sessions that discuss two practical assignments. Instead of grading the practical assignments, there are three in-class quizzes:
    • February 16, 2026
    • March 9, 2026
    • March 16, 2026
    • The quizzes contain questions about the practical assignments and are answered on paper.
    • Each quiz lasts one hour. To answer the quizzes comfortably, following the lectures and completing the practical assignments is advised. Their goal is that students learn to implement neural networks in practice through the coding assignments.
    • The highest 2 out of 3 scores count towards the final grade, to account for sickness, absence, and so forth. There are no other opportunities if you miss a quiz.
  • A project to develop a neural network that solves an image or a text task, so as to apply the acquired knowledge in practice. The project will include a leaderboard to track the best scores. The project will be evaluated by a final poster presentation during a class-wide poster session on March 24.
  • A final exam that assesses overall knowledge.

Learning activities

Activity             Hours
Lecture              24
Exam                 4
Practical session    22
Self study           118
Total                168

(6 EC × 28 hours)

Attendance

Programme's requirements concerning attendance (TER-B Article B-4.10):

  • For some course component attendance is obligatory. If attendance is required, this is stated in the course catalogue. The reasons for, and the implementation of, this attendance requirement may vary by course and are included in the course manual. Students who do not meet this attendance requirement cannot complete the course with a passing grade.

Additional requirements for this course:

Attendance at the lectures and the practicals is not mandatory, but it is recommended. The practical quizzes and the final exam will include questions on topics discussed in the lectures and the practicals.

Assessment

Item and weight      Details
Final grade, 0%      Tentamen 1 (cancelled)

Grading

  • 50% for the final exam
  • 25% for the practical quizzes
    • 12.5% per quiz
    • The quizzes are filled in individually on paper during the practical sessions. They are handed out in class only.
  • 20% for the poster presentation of the project. Each project will be worked on by a group of at most 3 students.
  • 5% for professional communication, including participation in lectures and practicals, according to the following rules:
    Lab attendance    Quiz attendance    Professionalism grade*
    50% and above     3 out of 3         10
    30%–50%           at least 2 of 3    7
    10%–30%           at least 1 of 3    5
    10% or lower      0                  1

  * The grade is awarded if both criteria are met; if either criterion falls in a lower band, the grades for the two bands are averaged.

Inspection of assessed work

Answers will be uploaded to Ans. Students can discuss their results in the practical sessions.

Assignments

The practical quizzes will be carried out individually. They will contain questions that relate to the assignment, including technical questions on how parts of the assignment are implemented, as well as analytical questions on the results of the assignment. The highest 2 out of 3 scores count towards the final grade, to account for sickness, absence, and so forth. There are no other opportunities if you miss a quiz.

The poster presentation will be organized during a poster day. The posters will be graded group-wise with respect to clarity, communication, and understanding of the subject matter and the work done.

The final exam will contain theoretical and practical questions on the subject material from all lectures.

Fraud and plagiarism

The 'Regulations governing fraud and plagiarism for UvA students' applies to this course. This will be monitored carefully. Upon suspicion of fraud or plagiarism the Examinations Board of the programme will be informed. For the 'Regulations governing fraud and plagiarism for UvA students' see: www.student.uva.nl

Course structure

The study material will be simplified versions of the sources listed under Literature.

Week  Topic  Who
1 Lecture 1: Introduction to neural networks — history, motivation, and the biological metaphor Stratis
2 Lecture 2: The perceptron and multilayer perceptrons (MLPs) Stratis
3 Lecture 3: Activation functions, layers, and parameters Pascal
4 Lecture 4: Backpropagation — intuition, derivation, and computational graph perspective Stratis
5 Lecture 5: Fundamentals of optimization — gradient descent, momentum, adaptive methods (Adam, RMSProp), loss surfaces. Challenges and techniques — vanishing/exploding gradients, initialization, learning-rate scheduling, convergence diagnostics Pascal
6 Lecture 6: Regularization methods — L1/L2 penalties, dropout, batch normalization, early stopping, data augmentation, capacity control Stratis
7 Lecture 7: Practical considerations and good practices — training pipelines, debugging, reproducibility, validation strategies, and tuning heuristics Pascal
8 Lecture 8: Convolutional Neural Networks (CNNs) and Residual Networks Pascal
9 Lecture 9: Recurrent Neural Networks (RNNs) and sequence modeling Stratis
10 Lecture 10: Transformers and attention mechanisms — from sequence models to large-scale architectures Pascal
11 Lecture 11: Generative models — autoencoders, VAEs, GANs, and diffusion models (conceptual overview) Stratis
12 Lecture 12: Hebbian learning, biological principles of learning, and the evolution from biological to artificial networks Iris Groen
13 Lecture 13: Historical neural networks — Hopfield, Boltzmann, and Echo State Networks; the perceptron controversy, connectionism revival, and universal approximation theorem Stratis
14 Q&A / exam prep Stratis/Pascal

The plan for the practicals and the poster presentation is:

1. Quiz on perceptrons: Feb 16, 2026
2. Quiz on optimizing neural nets: Mar 9, 2026
3. Overall quiz: Mar 16, 2026
4. Poster presentation of the project (groups choose between an image and a text task): Mar 24, 2026

Contact information

Coordinator

  • dr. E. Gavves

Co-coordinator

  • Pascal Mettes (p.s.m.mettes@uva.nl)