AI 534 (400/401), Machine Learning (e-campus), Fall 2024

“Equations are just the boring part of mathematics. I attempt to see things in terms of geometry.”
-- Stephen Hawking (1942--2018)

Coordinates [Canvas] [Registrar] [Ed Discussion Forum]
Instructor Liang Huang (liang.huang@oregonstate.edu).
TAs Zetian Wu (wuzet@oregonstate.edu)
Milan Gautam (gautammi@oregonstate.edu)
Office Hours M, W, Th, F; exact slots TBD
Zoom link (no passcode needed)
Prerequisites
  • CS: algorithms and data structures, and Python proficiency.
  • Math: very basic linear algebra.
Textbooks
  • Our notes below (default reference; this course is self-contained).
  • Daume. A Course in Machine Learning (CIML).
  • Bishop (2006). Pattern Recognition and Machine Learning (PRML). I do not actually recommend it for beginners, but the figures are pretty and I use them in my slides.
Grading
  • Background survey (on Canvas): each student gets 2% by submitting on time.
  • Quizzes (on Canvas, autograded): 10% + 8% = 18%. Everybody has two attempts on each quiz.
  • HWs 1-4 (programming, Kaggle competitions): 20% + 15% + 15% + 15% = 65%, in Python + numpy + sklearn.
  • HW5 (paper review): 15%. A review of cutting-edge machine learning research.

  • HWs are due on Mondays; Quizzes are due on Fridays.
  • Late Penalty: each student may submit one assignment up to 24 hours late without penalty. No other late submissions will be accepted.

Machine learning revolves around a central question: how can we make computers learn from experience without being explicitly programmed? In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, accurate spam filters, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that everybody uses it dozens of times a day without knowing it.

This course will survey the most important algorithms and techniques in the field of machine learning. The treatment of mathematics will be rigorous, but unlike most other machine learning courses, which feature tons of equations, my version will focus on geometric intuitions and the algorithmic perspective. I will try my best to visualize every concept.

Even though machine learning appears to be "mathy" on the surface, it is not abstract in any sense, unlike mainstream CS (algorithms, theory of computation, programming languages, etc.). In fact, machine learning is so applied and empirical that it is more like alchemy. So we will also discuss practical issues and implementation details.
Some preparatory materials:
Weekly Materials
Unit 1 (weeks 1-3): ML intro, \(k\)-NN, and math/numpy review
1.0 Introduction
1.1 Machine Learning Settings
1.2 Basic Machine Learning Concepts
1.3 Nearest Neighbor Classifier
1.4 Linear Algebra and Numpy Tutorials
HW1: \(k\)-NN for income classification [pdf] [data] [kaggle]
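To make the \(k\)-NN idea concrete before HW1, here is a minimal numpy sketch (toy data and function names are my own, not the HW1 interface): classify a point by majority vote among its \(k\) nearest training points.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # indices of the k closest points
    nearest = np.argsort(dists)[:k]
    # majority vote over their labels
    values, counts = np.unique(y_train[nearest], return_counts=True)
    return values[np.argmax(counts)]

# two well-separated clusters
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([0.5, 0.5])))  # → 0
print(knn_predict(X, y, np.array([5.5, 5.5])))  # → 1
```

Note that \(k\)-NN has no training step at all: the "model" is the training set itself, which is why the geometry of the feature space (and the distance function) matters so much.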
Unit 2 (weeks 4-5): linear classification and perceptron
2.1 History of Perceptron
2.2 Linear Classification
2.3 The Perceptron Algorithm
2.4 Convergence Theorem and Proof
2.5 Inseparable Cases and Feature Engineering
2.6 Voted and Averaged Perceptrons
HW2: perceptron for sentiment [pdf] [data] [kaggle]
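The core perceptron update is only one line. A minimal numpy sketch of the vanilla algorithm (labels in \(\{-1,+1\}\), bias folded in as an extra constant feature; the toy data is my own):

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Vanilla perceptron; returns augmented weight vector [w; b]."""
    X = np.hstack([X, np.ones((len(X), 1))])  # augment with bias feature
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # mistake (or on the boundary)
                w += y_i * x_i             # the perceptron update
                mistakes += 1
        if mistakes == 0:                  # no mistakes in a full pass: done
            break
    return w

X = np.array([[2., 1.], [1., 3.], [-1., -2.], [-2., -1.]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
# check that every training point is now classified correctly
print(all(np.sign(w @ np.append(x, 1.)) == t for x, t in zip(X, y)))  # → True
```

On linearly separable data like this, the convergence theorem (Unit 2.4) guarantees the loop above terminates after a bounded number of mistakes, regardless of the order of the examples.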
Unit 3 (weeks 6-7): linear and polynomial regression
3.1 Linear Regression
3.2 Regularization
3.3 Gradient Descent
3.4 Normal Equation
3.5 Nonlinear Regression
HW3: regression for housing price prediction [pdf] [data] [kaggle]
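Units 3.3 and 3.4 present two routes to the same least-squares solution, and it is instructive to see them agree. A small numpy sketch (toy data and function names are my own): the normal equation solves \((X^\top X)\,w = X^\top y\) in closed form, while batch gradient descent reaches the same \(w\) iteratively.

```python
import numpy as np

def fit_normal_equation(X, y):
    """Least squares in closed form: solve (X^T X) w = X^T y."""
    X = np.hstack([X, np.ones((len(X), 1))])  # bias column
    return np.linalg.solve(X.T @ X, X.T @ y)

def fit_gradient_descent(X, y, lr=0.1, steps=1000):
    """Minimize the same mean squared error by batch gradient descent."""
    X = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        grad = (2 / n) * X.T @ (X @ w - y)  # gradient of the MSE
        w -= lr * grad
    return w

X = np.array([[0.], [1.], [2.], [3.]])
y = np.array([1., 3., 5., 7.])        # exactly y = 2x + 1
print(fit_normal_equation(X, y))       # ≈ [2., 1.]
print(fit_gradient_descent(X, y))      # ≈ [2., 1.]
```

The closed form is exact but costs a linear solve in the number of features; gradient descent scales to large feature sets (and, unlike the normal equation, generalizes directly to the nonlinear models of Unit 4).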
Unit 4 (weeks 8-9): a taste of deep learning
4.1 Multilayer Neural Networks
4.2 Word Embeddings
HW4: redo HW2 with word embeddings [pdf] (HW2 data + embeddings) [kaggle]
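The simplest way word embeddings plug into a sentence classifier like HW2's is to average the vectors of a sentence's words and feed that fixed-length vector to a linear model. A toy numpy sketch of that representation step (the embedding table below is made up; HW4 supplies real pretrained embeddings):

```python
import numpy as np

# toy 2-dimensional embedding table (made up for illustration)
emb = {
    "good":  np.array([ 1.0,  0.2]),
    "great": np.array([ 0.9,  0.3]),
    "bad":   np.array([-1.0, -0.1]),
    "movie": np.array([ 0.0,  0.5]),
}

def sentence_vector(sentence, emb, dim=2):
    """Represent a sentence as the average of its word embeddings;
    words missing from the table are simply skipped."""
    vecs = [emb[w] for w in sentence.split() if w in emb]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

print(sentence_vector("good movie", emb))  # → [0.5, 0.35]
```

Compared with the sparse one-hot features of HW2, this dense representation lets the classifier generalize across related words (e.g. "good" and "great" have nearby vectors), which is the point of redoing HW2 with embeddings.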
Unit 5 (week 10): paper review (cutting-edge ML)
HW5: paper review (see the list of papers on Canvas)

Classical Papers:
Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent (New York Times)