Course syllabus
Introduktion till artificiella neuronnätverk och deep learning
Introduction to Artificial Neural Networks and Deep Learning
EXTQ40, 7.5 credits, A (Second Cycle)
Valid for: 2023/24
Faculty: Faculty of Engineering, LTH
Decided by: PLED F/Pi
Date of Decision: 2023-04-18
General Information
Main field: Machine Learning, Systems and Control.
Compulsory for: MMSR1
Elective for: BME4-sbh, C4, D4-bg, D4-mai, E4-ss, F4, F4-tf, F4-mai, I4, MFOT1, N4, Pi4-ssr, Pi4-bam
Language of instruction: The course will be given in English on demand
Aim
The overall aim of the course is to give students a basic
knowledge of artificial neural networks and deep learning, both
theoretical knowledge and how to practically use them for typical
problems in machine learning and data mining.
Learning outcomes
Knowledge and understanding
For a passing grade the student must
- be able to describe the construction of the multi-layer
perceptron
- be able to describe different error functions used for training
and techniques to numerically minimize these error functions
- be able to explain the concept of overtraining and describe
those properties of a neural network that can cause
overtraining
- be able to describe the construction of different types of deep
neural networks
- be able to describe neural networks used for time series analysis as well as for self-organization.
Competences and skills
For a passing grade the student must
- be able to produce update equations for a multi-layer perceptron with given specific error and activation functions (a worked example is sketched after this list)
- be able to prove basic properties of the multi-layer
perceptron, such as non-linearity, probability interpretation of
the output and the advantage of using an ensemble of neural
networks
- be able to implement a multi-layer perceptron to solve a
typical classification or regression problem, including systematic
choice of suitable model parameters to optimize the generalization
performance
- be able to show how to use a convolutional neural network to
classify images, including suitable choices of layers and kernel
sizes
- be able to use a recurrent network, both deep and shallow, for
time series problems.
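To indicate the expected level of the first skill above: for a multi-layer perceptron with one hidden layer, a sum-of-squares error and logistic activations, the gradient-descent update equations take the following form (the notation is illustrative, not prescribed by the course):

\[
\begin{aligned}
E &= \tfrac{1}{2}\sum_k (y_k - t_k)^2, \qquad
y_k = \sigma\Big(\sum_j w_{kj} h_j\Big), \qquad
h_j = \sigma\Big(\sum_i w_{ji} x_i\Big),\\
\delta_k &= (y_k - t_k)\, y_k (1 - y_k), \qquad
\delta_j = h_j (1 - h_j) \sum_k \delta_k w_{kj},\\
\Delta w_{kj} &= -\eta\, \delta_k h_j, \qquad
\Delta w_{ji} = -\eta\, \delta_j x_i,
\end{aligned}
\]

where t_k are the targets, x_i the inputs, \sigma the logistic function and \eta the learning rate (bias terms omitted for brevity).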
Judgement and approach
For a passing grade the student must
- be able to analyse a typical problem within the subject area and deduce which method or methods are most suitable to solve it
- be able to identify possible pitfalls in an analysis that can affect its reproducibility.
Contents
The course covers the most common models in artificial neural networks, with a focus on the multi-layer perceptron. The course also provides an introduction to deep learning; a minimal code sketch of the multi-layer perceptron follows the topic list. Selected topics:
- Feed-forward neural networks: the simple perceptron and the
multi-layer perceptron, choice of suitable error functions and
techniques to minimize them, how to detect and avoid overtraining,
ensembles of neural networks and techniques to create them,
Bayesian training of multi-layer perceptrons
- Recurrent neural networks: simple recurrent networks and their use in time series analysis, fully recurrent networks for both time series analysis and associative memories (the Hopfield model), and the simulated annealing optimization technique
- Self-organizing neural networks: networks that can extract
principal components, networks for data clustering, learning vector
quantization (LVQ), self-organizing feature maps (SOFM)
- Deep learning: Overview of deep learning, convolutional neural
networks for classification of images, different techniques to
avoid overtraining in deep networks, techniques to pre-train deep
networks.
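As an illustration of the course's central model, a minimal multi-layer perceptron for a binary classification problem, trained with gradient descent, might be sketched as follows. This is a sketch in Python/NumPy; the toy data, layer sizes, learning rate and number of epochs are illustrative choices only.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian clusters in the plane, class labels 0 and 1.
X = np.vstack([rng.normal(-1.0, 0.7, size=(100, 2)),
               rng.normal(1.0, 0.7, size=(100, 2))])
t = np.concatenate([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 8 tanh units and a single logistic output unit.
W1 = rng.normal(0.0, 0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1))
b2 = np.zeros(1)
eta = 0.1  # learning rate

for epoch in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)               # hidden activations, shape (200, 8)
    y = sigmoid(h @ W2 + b2).ravel()       # output probabilities, shape (200,)

    # Backward pass: gradients of the mean cross-entropy error.
    delta_out = (y - t)[:, None] / len(t)            # output-layer deltas
    delta_hid = (delta_out @ W2.T) * (1.0 - h**2)    # hidden-layer deltas

    # Gradient-descent weight updates.
    W2 -= eta * (h.T @ delta_out)
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * (X.T @ delta_hid)
    b1 -= eta * delta_hid.sum(axis=0)

print("training accuracy:", np.mean((y > 0.5) == t))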
Examination details
Grading scale: TH - (U,3,4,5) - (Fail, Three, Four, Five)
Assessment: The examination consists of written reports on the mandatory computer exercises and an oral or written test at the end of the course.
The examiner, in consultation with Disability Support Services, may deviate from the regular form of examination in order to provide a permanently disabled student with a form of examination equivalent to that of a student without a disability.
Parts
Code: 0122. Name: Test.
Credits: 6. Grading scale: TH. Assessment: Written or oral examination.
Code: 0222. Name: Computer Exercises.
Credits: 1.5. Grading scale: UG. Assessment: Approved computer exercises.
Admission
Admission requirements:
- FMAB20 Linear Algebra
- FMAB30 Calculus in Several Variables or FMAB35 Calculus in Several Variables
- FMAA01 Calculus in One Variable or FMAA05 Calculus in One Variable or FMAB65 Calculus in One Variable B1
- FMAA01 Calculus in One Variable or FMAA05 Calculus in One Variable or FMAB70 Calculus in One Variable B2
The number of participants is limited to: 250
Selection: Completed university credits within the program. Priority is given to students enrolled on programmes that include the course in their curriculum. Among these students priority is given to those in the master's programme in Machine Learning, Systems and Control, for whom the course is compulsory.
The course overlaps with the following course(s): FYTN14
Reading list
- As posted on the course webpage and notice board.
Contact and other information
Teacher: Mattias Ohlsson, mattias@thep.lu.se
Course coordinator: Patrik Edén, patrik.eden@thep.lu.se
Course homepage: https://canvas.education.lu.se/courses/8091
Further information: The course is given by the Faculty of Science and does not follow the study period structure. The course will partly be held online.