Neurale Netwerken
6 EC
Semester 2, period 4
5082NENE6Y
The Neural Networks course introduces second-year BSc Artificial Intelligence students to the principles, mechanisms, and architectures underlying modern neural computation. The course progresses from early biologically inspired models and classical perceptrons to contemporary deep learning systems that form the backbone of current AI practice.
Students first develop a solid conceptual and mathematical understanding of feedforward neural networks, including activation functions, multilayer perceptrons, and the backpropagation algorithm. The course then addresses the practical realities of training neural networks, covering optimization, regularization, and best practices for building reliable and reproducible learning systems.
Building on these foundations, the course introduces major neural architectures used in modern AI, including convolutional neural networks, recurrent neural networks, and transformer models. The final part of the course situates these developments in a broader historical and biological context, examining Hebbian learning, Hopfield networks, Boltzmann machines, and other influential models that shaped the evolution of neural network research.
Throughout the course, theoretical insight is balanced with practical experience through hands-on exercises and supervised training sessions. Students completing the course will be prepared to critically understand, implement, and evaluate neural network models, and to engage with more advanced coursework and research in artificial intelligence.
https://uvadlc.github.io/lectures-nov2020.html (UvA Deep Learning Course lectures, November 2020)
https://uvadlc.github.io/ (UvA Deep Learning Course)
https://www.deeplearningbook.org/ (Goodfellow, Bengio, and Courville, Deep Learning)
Lecture 1: Introduction to neural networks — history, motivation, and the biological metaphor
Lecture 2: The perceptron and multilayer perceptrons (MLPs) (see the perceptron sketch after this list)
Lecture 3: Activation functions, layers, and parameters
Lecture 4: Backpropagation — intuition, derivation, and computational graph perspective (see the backpropagation sketch after this list)
Lecture 5: Fundamentals of optimization — gradient descent, momentum, adaptive methods (Adam, RMSProp), loss surfaces. Challenges and techniques — vanishing/exploding gradients, initialization, learning-rate scheduling, convergence diagnostics (see the momentum sketch after this list)
Lecture 6: Regularization methods — L1/L2 penalties, dropout, batch normalization, early stopping, data augmentation, capacity control
Lecture 7: Practical considerations and good practices — training pipelines, debugging, reproducibility, validation strategies, and tuning heuristics
Lecture 8: Convolutional Neural Networks (CNNs) and Residual Networks
Lecture 9: Recurrent Neural Networks (RNNs) and sequence modeling
Lecture 10: Transformers and attention mechanisms — from sequence models to large-scale architectures (see the attention sketch after this list)
Lecture 11: Generative models — autoencoders, VAEs, GANs, and diffusion models (conceptual overview)
Lecture 12: Hebbian learning, biological principles of learning, and the evolution from biological to artificial networks
Lecture 13: Historical neural networks — Hopfield, Boltzmann, and Echo State Networks; the perceptron controversy, connectionism revival, and universal approximation theorem
Lecture 14: Q&A / exam prep
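As a taste of Lecture 2, the snippet below sketches the classic perceptron learning rule on the AND function. This is an illustrative sketch, not course code; the data, learning rate, and epoch count are assumptions made for the example.

```python
import numpy as np

# Perceptron learning rule on the linearly separable AND function.
# All values here (data, learning rate, epochs) are illustrative assumptions.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])  # AND labels

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(10):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0   # step activation
        # Update only on mistakes: shift the decision boundary towards the error
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

print(w, b)  # a separating hyperplane for AND
```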
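For Lecture 4, the following sketch trains a tiny two-layer network on XOR with manually derived gradients, i.e. backpropagation written out by hand. It assumes NumPy; the architecture and hyperparameters are illustrative choices, not the course's reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: not linearly separable, so a hidden layer is required.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (chain rule) for the MSE loss L = mean((out - y)^2)
    d_out = (out - y) * out * (1 - out)    # gradient at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)     # propagated to the hidden pre-activation
    # Gradient descent step
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```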
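For Lecture 5, this sketch shows the SGD-with-momentum update on a toy quadratic loss; the loss function and hyperparameters are illustrative assumptions, chosen so the effect of momentum on an ill-conditioned surface is visible.

```python
import numpy as np

# Gradient descent with momentum on f(w) = 0.5 * w^T A w, a toy quadratic loss.
A = np.diag([1.0, 10.0])        # ill-conditioned curvature, where momentum helps
w = np.array([5.0, 5.0])        # initial parameters
v = np.zeros_like(w)            # velocity
lr, beta = 0.05, 0.9            # learning rate and momentum coefficient

for step in range(100):
    grad = A @ w                # gradient of the quadratic
    v = beta * v - lr * grad    # exponentially decayed average of past gradients
    w = w + v                   # parameter update

print(w)  # should be close to the minimum at [0, 0]
```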
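For Lecture 10, here is a minimal sketch of scaled dot-product attention, the core operation behind transformers, assuming NumPy; the shapes and inputs are made up for illustration.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarities
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V               # attention-weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))  # 3 queries of dimension 4
K = rng.normal(size=(5, 4))  # 5 keys
V = rng.normal(size=(5, 4))  # 5 values
print(attention(Q, K, V).shape)  # (3, 4): one output vector per query
```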
| Activity | Hours | Notes |
|---|---|---|
| Lectures | 24 | |
| Exam | 4 | |
| Practicals | 22 | |
| Self study | 118 | |
| Total | 168 | 6 EC × 28 hours |
Programme's requirements concerning attendance (TER-B Article B-4.10):
Additional requirements for this course:
Attendance at the lectures and the practicals is not mandatory, but it is recommended: the practical quizzes and the final exam will include questions and topics discussed in the lectures and the practicals.
| Item and weight | Details |
|---|---|
| Final grade | |
| 0% Exam 1 (cancelled) | |
Grading
| Lab attendance | Quiz attendance | Professionalism grade* |
|---|---|---|
| 50% and above | 3 out of 3 | 10 |
| 30% - 50% | at least 2 out of 3 | 7 |
| 10% - 30% | at least 1 out of 3 | 5 |
| 10% or lower | 0 out of 3 | 1 |

* The grade is obtained if both criteria are met; if one criterion falls in a lower row, the two corresponding grades are averaged.
Answers will be uploaded to Ans. Students can discuss the results in the practical sessions.
The practical quizzes will be carried out individually. They will contain questions that relate to the assignment, including technical questions on how parts of the assignment are implemented, as well as analytical questions on the results of the assignment. The highest 2 out of 3 scores count towards the final grade, to account for sickness, absence, and so forth. There are no additional opportunities if you miss a quiz.
The poster presentation will be organized during a poster day. The posters will be graded group-wise with respect to clarity, communication, and understanding of the subject matter and the work done.
The final exam will contain theoretical and practical questions on the subject material from all lectures.
The 'Regulations governing fraud and plagiarism for UvA students' apply to this course. Compliance will be monitored carefully. Upon suspicion of fraud or plagiarism, the Examinations Board of the programme will be informed. For the 'Regulations governing fraud and plagiarism for UvA students', see: www.student.uva.nl
The study material will consist of simplified versions of the UvA Deep Learning Course materials: https://uvadlc.github.io
| Week | Topics | Who |
|---|---|---|
| 1 | Lecture 1: Introduction to neural networks — history, motivation, and the biological metaphor | Stratis |
| 2 | Lecture 2: The perceptron and multilayer perceptrons (MLPs) | Stratis |
| 3 | Lecture 3: Activation functions, layers, and parameters | Pascal |
| 4 | Lecture 4: Backpropagation — intuition, derivation, and computational graph perspective | Stratis |
| 5 | Lecture 5: Fundamentals of optimization — gradient descent, momentum, adaptive methods (Adam, RMSProp), loss surfaces. Challenges and techniques — vanishing/exploding gradients, initialization, learning-rate scheduling, convergence diagnostics | Pascal |
| 6 | Lecture 6: Regularization methods — L1/L2 penalties, dropout, batch normalization, early stopping, data augmentation, capacity control | Stratis |
| 7 | Lecture 7: Practical considerations and good practices — training pipelines, debugging, reproducibility, validation strategies, and tuning heuristics | Pascal |
| 8 | Lecture 8: Convolutional Neural Networks (CNNs) and Residual Networks | Pascal |
| 9 | Lecture 9: Recurrent Neural Networks (RNNs) and sequence modeling | Stratis |
| 10 | Lecture 10: Transformers and attention mechanisms — from sequence models to large-scale architectures | Pascal |
| 11 | Lecture 11: Generative models — autoencoders, VAEs, GANs, and diffusion models (conceptual overview) | Stratis |
| 12 | Lecture 12: Hebbian learning, biological principles of learning, and the evolution from biological to artificial networks | Iris Groen |
| 13 | Lecture 13: Historical neural networks — Hopfield, Boltzmann, and Echo State Networks; the perceptron controversy, connectionism revival, and universal approximation theorem | Stratis |
| 14 | Q&A / exam prep | Stratis/Pascal |
The plan for the practicals and the poster presentation is:
| # | Activity | Date |
|---|---|---|
| 1 | Quiz on perceptrons | Feb 16, 2026 |
| 2 | Quiz on optimizing neural networks | Mar 9, 2026 |
| 3 | Overall quiz | Mar 16, 2026 |
| 4 | Poster presentation of the project (groups choose between an image task and a text task) | Mar 24, 2026 |
Co-coordinator: Pascal Mettes (p.s.m.mettes@uva.nl)