Syllabus for CMPUT 466 / 566
Time and Location
Tuesday and Thursday, 12:30 - 1:50 p.m.
HC L 1
Office: ATH 3-05
Teaching Assistants (in alphabetical order)
- Amir Akbarnejad: firstname.lastname@example.org
- Paritosh Goyal: email@example.com
- Fernando Juan Hernandez: firstname.lastname@example.org
- Raksha Kumaraswamy: email@example.com
- Yangchen Pan: firstname.lastname@example.org
- Andrew Patterson: email@example.com
- Matthew Schlegel: firstname.lastname@example.org
- Michael Strobl: email@example.com
The TAs are fantastic and knowledgeable in machine learning; you should definitely ask them questions if you are stuck or want to further your knowledge. At the same time, please respect the TAs' time. This is a large class, so restrict meetings with TAs to about 15 minutes at a time (no more than 30 minutes).
Office hours
- Martha: Thursday, 2:00 p.m. - 4:00 p.m., in ATH 3-05
Lab times and locations
Labs are not mandatory and will essentially be run as office hours with the TAs. Each week the TAs may present some background material and offer clarifications on the material or assignments. The TAs will also hold office hours outside this time, if needed.
- Monday, 5:00 p.m. - 7:50 p.m., T B 95
- Wednesday, 5:00 p.m. - 7:50 p.m., T B 95
The course objective is to study the theory and practice of constructing algorithms that learn (functions) from data. Machine learning is a field whose goals overlap with those of other disciplines, in particular statistics, algorithms, engineering, and optimization theory. It also has wide applications in a number of scientific areas, such as finance, the life sciences, the social sciences, and medicine.
You are expected to be comfortable with programming, and to have background in probability and linear algebra. The programming assignments will be in Python.
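As a rough illustration of the expected background (a hedged sketch, not part of the official course materials), the assignments assume comfort with NumPy-style linear algebra and basic probability, along the lines of:

```python
import numpy as np

# Linear algebra: solve the small system Ax = b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)
print(x)  # [2. 3.]

# Probability: estimate the mean of samples from a normal distribution.
rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=10_000)
print(samples.mean())  # close to 5.0
```

If code like this looks unfamiliar, it is worth reviewing Python and NumPy before the first assignment.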
Textbooks and references
Main notes will be provided in class.
- More in-depth reference: Pattern Recognition and Machine Learning - by C. M. Bishop, Springer 2006.
- Less technical reference: An Introduction to Statistical Learning: with Applications in R - by James et al.
- In-depth reference, covering a broader range of topics and with good exercises (free online): Bayesian Reasoning and Machine Learning - by Barber
- Theory-oriented reference: The Elements of Statistical Learning - by T. Hastie, R. Tibshirani, and J. Friedman, 2009
Grading
- Thought questions: 10%
- Midterm exam: 20%
- Final exam: 35%
- Homework assignments (3): 25%
- Initial draft for mini-project: 5%
- Final mini-project write-up: 5%
Graduate students (in 566) will have additional questions on the midterm, final and assignments. These questions will be bonus questions for undergraduate students (in 466).
Course topics
- mathematical foundations of machine learning
- random variables and probabilities
- optimization basics
- overview of machine learning
- supervised, semi-supervised, unsupervised learning
- basics of parameter estimation
- maximum likelihood and maximum a posteriori
- linear regression
- generalized linear models
- linear classification
- logistic regression
- naive Bayes
- support vector machines
- representations and representation learning
- neural networks
- sparse coding
- dictionary learning
- kernel methods
- bias-variance trade-off
- theoretical evaluation
- Rademacher complexity
- empirical evaluation
- cross validation and resampling
- statistical significance tests
- Bayesian linear regression
- decision trees
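As a small taste of the topics above, the snippet below fits a linear regression by least squares (equivalently, maximum likelihood under Gaussian noise). This is an illustrative sketch only, not an assignment solution; the synthetic data and variable names are made up:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus Gaussian noise.
rng = np.random.default_rng(42)
x = rng.uniform(0.0, 10.0, size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.shape)

# Design matrix with a bias column; least-squares weights minimize
# ||Xw - y||^2, which is the maximum-likelihood fit under Gaussian noise.
X = np.column_stack([x, np.ones_like(x)])
w, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = w
print(slope, intercept)  # close to 2.0 and 1.0
```

Topics such as regularization, generalized linear models, and the bias-variance trade-off build directly on this kind of model.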
Late Policy and Academic Honesty
All assignments and exams are individual, except when collaboration is explicitly allowed. All sources used in problem solutions must be acknowledged, e.g., websites, books, research papers, and personal communications. Academic honesty is taken seriously; for detailed information see https://www.deanofstudents.ualberta.ca/en/AcademicIntegrity.aspx.