CS534: Machine Learning--Syllabus
Personnel
Instructor:
Tom Dietterich, Dearborn 221C, 737-5559, tgd@cs.orst.edu
Office Hours: Thursday 9:00-10:00am
Grader: Charles Parker
Meeting Times
MWF 9:00-10:00am Batchellor 250
Text
Pattern Classification by Duda, Hart, and Stork.
Goals
When you have completed this course, you should be able to apply
machine learning algorithms to solve problems of moderate complexity
involving both iid and sequential data. You should also be able to
read current research papers in machine learning and understand the
issues they raise for supervised learning.
Prerequisites
Knowledge of machine learning algorithms from CS531: Bayesian
networks (including the EM algorithm for learning with hidden
variables) and decision trees. Basic knowledge of data structures,
search algorithms (gradient descent, depth-first search, greedy
algorithms), calculus, and probability.
Grading
Homework 50%
Midterm 20%
Final 30%
Written homework and programs are due at the beginning of class.
Each student is responsible for his/her own work. The standard
departmental rules for academic dishonesty apply to all assignments
in this course. Collaboration on homework and programming
assignments should be limited to answering questions that can be
asked and answered without using any written medium (e.g., no
pencils, instant messages, or email).
Turning In Programming Assignments
You will turn in your solutions to programming problems both
electronically and as hard copy in class. We are using the ENGR
homework system for electronic submission of assignments.
PREDICTED COURSE SCHEDULE
INTRODUCTION: Linear Threshold Classifiers (part1)
Mar 28 Introduction
30 Space of Algorithms
Apr 1 Perceptrons
4 Logistic Regression
THE TOP 5 ALGORITHMS
6 Linear Discriminant Analysis
8 Off-The-Shelf Learning Algorithms
11 Decision Trees (part2)
13 Decision Trees (continued); Nearest Neighbor (part3)
15 Nearest Neighbor; Neural networks (part4)
18 Neural networks (continued)
20 Support Vector Machines (part5)
22 Naive Bayes (part6)
LEARNING THEORY
25 PAC Learning Theory (part7)
27 PAC Learning Theory (continued)
29 Bayesian Learning Theory (part8), Bias/Variance Theory (part9)
May 2 Bias/Variance Theory and Ensemble Methods (part9)
4 Bias/Variance Theory and Ensemble Methods (continued)
OVERFITTING AVOIDANCE
6 Penalty Methods for Preventing Overfitting (part10)
9 MIDTERM EXAM
11 Hold-Out and Cross-Validation Methods (part10): Hold-Out; Pessimistic Pruning
13 Penalty methods for Neural Nets and SVMs
16 Hold-Out methods for trees, networks, nearest neighbor, SVMs (part11)
SEQUENTIAL SUPERVISED LEARNING
18 Introduction; Hidden Markov Models (part12)
20 Conditional Random Fields (part12)
23 Discriminative Methods (part12)
25 Research Issues
METHODOLOGY
27 Evaluating and Comparing Classifiers (part13)
30 Memorial Day: No Class
June 1 Evaluation continued (part13)
3 Course Summary
June 6, 6:00pm Final Exam