Course Description
Machine learning algorithms are data analysis methods that search data sets for patterns and characteristic structures. Typical tasks are the classification of data, automatic regression, and unsupervised model fitting. Machine learning has emerged mainly from computer science and artificial intelligence, and it draws on methods from a variety of related subjects, including statistics, applied mathematics, and more specialized fields such as pattern recognition and neural computation. Applications include, for example, image and speech analysis, medical imaging, bioinformatics, and exploratory data analysis in natural science and engineering:
[Figures, left to right]
- Non-linear decision boundary of a trained support vector machine (SVM) using a radial-basis function kernel.
- Fisher's linear discriminant analysis (LDA) of four different auditory scenes: speech, speech in noise, noise, and music.
- Gene expression levels obtained from a micro-array experiment, used in gene function prediction.
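To make the first caption concrete, here is a minimal sketch of training such a classifier, assuming scikit-learn and matplotlib are available; the toy data set (make_moons) and the hyperparameters C and gamma are illustrative choices, not the setup behind the original figure.

```python
# Hedged illustration: fit an SVM with a radial-basis function (RBF) kernel on
# toy 2-D data and plot its non-linear decision boundary.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Toy two-class data set (purely illustrative).
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# RBF-kernel SVM; C and gamma are example values, not tuned ones.
clf = SVC(kernel="rbf", C=1.0, gamma=2.0).fit(X, y)

# Evaluate the decision function on a grid and draw its zero level set,
# which is the (non-linear) decision boundary.
xx, yy = np.meshgrid(np.linspace(-2.0, 3.0, 300), np.linspace(-1.5, 2.0, 300))
zz = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contour(xx, yy, zz, levels=[0], colors="k")
plt.scatter(X[:, 0], X[:, 1], c=y, cmap="coolwarm", s=15)
plt.title("RBF-kernel SVM decision boundary (toy data)")
plt.show()
```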
We assume that students are familiar with the course Introduction to Machine Learning.
Announcements
- There will be no tutorials in the first week of the semester.
- The framework for the practical projects will be demonstrated in the tutorials of the third week of the semester. See the Projects section below for more information.
- All materials and links can be accessed using your nethz credentials.
- The lectures are offered in person, via Zoom, and via live stream. A recording will also be made available within 24 hours after each lecture. To attend the lecture in person, you must present a Covid certificate with a QR code verifiable by the Swiss Covid Cert app. Certificates from EU countries and proof of vaccination with other vaccines on the WHO list (AstraZeneca, Sinovac, Sinopharm) are also accepted in addition to Swiss certificates.
- Each student is assigned to one of the four tutorials, depending on the first letter of their surname. All tutorials are offered in person; only the last tutorial of the week is also offered online via Zoom, and a recording will be made available. We do not accommodate requests to change tutorials. If you cannot attend your assigned tutorial, please attend the last tutorial of the week via Zoom or watch its recording later.
- During lectures, we offer an ETH-based Matrix chat for asking questions. The chat is moderated by a teaching assistant (TA). The link to the chat is here. Please do not ask questions via Zoom.
Zoom links
Syllabus
Some of the material can only be accessed with a valid nethz account. This list of topics is intended as a guide and may change during the semester.
General Information
Times and Places
Lectures
Time | Room | Remarks |
---|---|---|
Thu 15-16 | ETA F 5 | |
Fri 08-10 | HG F 1 |
Tutorials
Please attend only the tutorial assigned to you by the first letter of your surname. In case of collisions, please attend the last tutorial of the week via Zoom or watch its recording later. We do not accommodate requests to change tutorials.
Time | Room | Surname first letter |
---|---|---|
Wed 14-16 | CAB G 61 | A-G |
Wed 16-18 | CAB G 61 | H-M |
Thu 16-18 | ML F 34 | N-R |
Fri 14-16 | CAB G 61 | S-Z (offered also via Zoom to anyone) |
All tutorial sessions are identical. Attendance at the tutorials is not mandatory.
Exercises
The exercise sheets contain theoretical pen-and-paper assignments. Your solutions are neither handed in nor graded. Solutions to the exercise problems are published on this website one week after the corresponding exercise.
Projects
The goal of the practical projects is to get hands-on experience in machine learning tasks. For further information and to access the projects, login at the projects website using your nethz credentials. You need to be within the ETH network or connected via VPN to get access.
There will be one "dummy" project (Task 0) whose purpose is to help students familiarize themselves with the framework we use; it will be discussed in the tutorials of the third week of the semester. Following that, there will be three "real" projects (Task 1 -- Task 3) that will be graded.
In order to complete the course, students have to pass at least two out of the three graded projects (it is recommended to participate in all three). Students who do not fulfil this requirement will not be admitted to take the final examination of the course.
The final project grade, which will constitute 30% of the total grade for the course, will be the average of the best two project grades obtained.
Release dates and submission deadlines are given in UTC:
Task | Release date | Submission deadline |
---|---|---|
Task 0 (dummy task) | Mon, Oct 4, 15:00 | Mon, Oct 25, 14:00
Task 1 | Mon, Oct 25, 15:00 | Mon, Nov 15, 14:00 |
Task 2 | Mon, Nov 15, 15:00 | Mon, Dec 6, 14:00 |
Task 3 | Mon, Dec 6, 15:00 | Mon, Jan 3, 14:00 |
Exam
There will be a written exam of 180 minutes. The language of the examination is English. As written aids, you may bring two A4 pages (i.e., one A4 sheet of paper), either handwritten or typed with a font size of at least 11 points. The grade obtained in the written exam will constitute 70% of the total grade.
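For concreteness, the sketch below shows one way to combine the two grade components described above; the 30%/70% weighting and the best-two-of-three project rule come from the text, while the function name, the example grades, and the absence of any rounding are illustrative assumptions.

```python
def final_grade(project_grades, exam_grade):
    """Combine grades as described above (illustrative sketch, no rounding).

    project_grades: grades of the three graded projects (Tasks 1-3).
    exam_grade: grade of the written exam.
    """
    best_two = sorted(project_grades, reverse=True)[:2]
    project_grade = sum(best_two) / 2          # average of the best two projects
    return 0.3 * project_grade + 0.7 * exam_grade

# Example with hypothetical grades: projects 5.0, 5.5, 4.5 and exam 5.0
# -> project grade (5.5 + 5.0) / 2 = 5.25, total 0.3 * 5.25 + 0.7 * 5.0 = 5.075
print(final_grade([5.0, 5.5, 4.5], exam_grade=5.0))
```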
Under certain circumstances, exchange students may ask the exams office for a distance examination. You must organize this yourself via the exams office well in advance. Prof. Buhmann does not organize distance exams.
Moodle
To account for the scale of this course, we will answer questions regarding lectures, exercises, and projects on Moodle. To allow for an optimal flow of information, please ask content-related questions on this platform rather than via email; this way, your question and our answer are visible to everyone. Please read the existing question-and-answer pairs before asking new questions.
Text Books
C. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
This is an excellent introduction to machine learning that covers most topics which will be treated in the lecture. Contains lots of exercises, some with exemplary solutions. Available from ETH-HDB and ETH-INFK libraries.
R. Duda, P. Hart, and D. Stork. Pattern Classification. John Wiley & Sons, second edition, 2001.
The classic introduction to the field. An early edition is available online for students attending this class; the second edition is available from ETH-BIB and ETH-INFK libraries.
I. Goodfellow, Y. Bengio, and A. Courville. Deep Learning. MIT Press, 2016.
T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Springer, 2001.
Another comprehensive text, written by three Stanford statisticians. Covers additive models and boosting in great detail. Available from ETH-BIB and ETH-INFK libraries. A free PDF version is available.
M. Mohri, A. Rostamizadeh, and A. Talwalkar. Foundations of Machine Learning. MIT Press, 2018.
L. Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer, 2004.
This book is a compact treatment of statistics that facilitates a deeper understanding of machine learning methods. Available from ETH-BIB and ETH-INFK libraries.
D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.
This book is a compact and extensive treatment of most topics. Available for personal use online: Link.
K. Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
Unified probabilistic introduction to machine learning. Available from ETH-BIB and ETH-INFK libraries.
S. Shalev-Shwartz, and S. Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
This recent book covers the mathematical foundations of machine learning. Available for personal use online: Link.
Exams from previous years
Contact
Please ask questions related to the course using Moodle, not via email.
Instructors:
Prof. Joachim M. Buhmann,
Dr. Carlos Cotrini