Course syllabus
Informationsteori
Information Theory
EITN45, 7,5 credits, A (Second Cycle)
Valid for: 2012/13
Decided by: Education Board 1
Date of Decision: 2012-03-19
General Information
Main field: Communication Systems.
Elective for: C4, C4-ks, D4, D4-ks, E4, E4-ks, E4-f, F4, MFOT1, MWIR1, Pi4, Pi4-ssr
Language of instruction: The course will be given in English on demand
Aim
The aim of this course is to give the students knowledge of
the principles of information storage and transmission, and of
the use of binary representations of information. The course
also gives knowledge of the performance and fundamental limits of
today's and tomorrow's communication systems.
Learning outcomes
Knowledge and understanding
For a passing grade the student must
- be able to identify and formulate problems within the area of
Information Theory
- be able to classify the level of difficulty of problems relative
to his/her own level of knowledge
Competences and skills
For a passing grade the student must
- be able to demonstrate the ability to apply new methods and
results
- be able to set up requirements for the implementation of the
algorithms in the course
- be able to realise systems for the algorithms presented in the
course
Judgement and approach
For a passing grade the student must
- be able to classify the level of difficulty of problems relative
to his/her own level of knowledge
- be aware of which parameters set the limits for reliable
communication as well as for the compression ratio of a source.
Contents
The definition of information goes back to Shannon's landmark
paper of 1948. His theory of how information can be processed is
the basis of all efficient digital communication systems, both today
and tomorrow. This course provides an up-to-date introduction to
the topic of information theory. The course emphasises both the formal
development of the theory and the engineering implications for the
design of communication systems and other information-handling
systems. The course includes:
- Shannon's information measure and its relatives, both for the
discrete and continuous case.
- Three fundamental information theorems: Typical sequences,
Source coding theorem and Channel coding theorem.
- Source coding: Optimal coding and construction of Huffman
codes, as well as universal source coding such as Ziv-Lempel coding
(zip, etc.).
- Channel coding: Principles of error detection and correction on
a noisy channel, mainly illustrated by Hamming codes.
- Gaussian channel: Continuous sources and additive white noise
over both band-limited and frequency-selective channels. Derivation
of the fundamental Shannon limit.
- Rate distortion theory: Source coding theorem and the
fundamental limit revisited when a certain level of distortion is
allowed.
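To give a flavour of the first two topics above, the sketch below computes Shannon's entropy of a discrete source and builds a Huffman code for it, then checks the source coding theorem's bound H(X) <= average codeword length < H(X) + 1. This is an illustrative sketch only, not course material; the example distribution is chosen arbitrarily.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Build a binary Huffman code; returns {symbol index: codeword}."""
    # Heap entries: (probability, tiebreaker, partial code table).
    heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees,
        # prepending 0/1 to the codewords in each.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = [0.4, 0.3, 0.2, 0.1]        # example source distribution
H = entropy(probs)                   # about 1.846 bits
code = huffman_code(probs)
avg_len = sum(p * len(code[i]) for i, p in enumerate(probs))
# Source coding theorem bound for an optimal code:
assert H <= avg_len < H + 1
```

For this distribution the Huffman code achieves an average length of 1.9 bits per symbol, close to the entropy of about 1.85 bits, illustrating how close optimal source coding gets to the fundamental limit.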
Examination details
Grading scale: TH
Assessment: Hand in problems and home exam.
Admission
Required prior knowledge: FMS012 Mathematical Statistics, Basic Course.
ETT051 Digital Communications
The number of participants is limited to: No
The course overlaps following course/s: EIT080
Reading list
- Thomas Cover and Joy Thomas: Elements of Information Theory (2nd Ed). Wiley, 2006, ISBN-13: 978-0-471-24195-9.
- Additional material.
Contact and other information
Course coordinator: Senior lecturer Stefan Höst, stefan.host@eit.lth.se