CSCI 567
Machine Learning
A crash course on machine learning

Spring 2016

Basic Information

Introduction

The main objective of this course is to teach fundamental techniques in machine learning. Key components include statistical learning approaches, including but not limited to parametric and nonparametric methods for supervised and unsupervised learning problems. Particular focus is placed on the theoretical understanding of these methods, as well as their computational implications.

Recommended Preparation

Undergraduate-level training or coursework in linear algebra, single- and multivariate calculus, and basic probability and statistics; an undergraduate-level course in artificial intelligence may be helpful but is not required.

Notes on Entrance Exam

As you may already know, there will be an entrance exam on the first day of class (Jan 11, 2016). The exam surveys students' knowledge of the basic concepts required for machine learning techniques so that we can better prepare the teaching materials.

Due to seating limitations and the large number of students interested in this course, we may not have enough seats for those on the waiting list. We will therefore administer the exam on a first-come, first-served basis. To help you prepare for the exam, you can review the following topics: linear algebra, calculus, and basic probability and statistics.

The exam is closed-book. We will distribute the exam and answer sheets; you only need to bring writing tools (pens or pencils). You will need to take the exam whether you are registered or still on the waiting list. There are no exceptions to this rule.

A sample quiz is available here.

Recommended Textbooks

There will be no required textbooks. However, we suggest one of the following to help you study:

We will mark suggested readings from these two books.

News

Syllabus

Date | Topics | Assignments (tentative)
1/11/2016 | Entrance Exam, Overview of ML, Review of Basic Math Concepts |
1/18/2016 | University Holiday - Martin Luther King Day |
1/25/2016 | Density Estimation, Nearest Neighbors, Linear Regression | HW#1 out
2/1/2016 | Decision Trees, Naïve Bayes, Logistic Regression |
2/8/2016 | Linear/Gaussian Discriminant Analysis, Overfitting and Regularization, Bias-Variance Tradeoff | HW#1 due, HW#2 out
2/15/2016 | University Holiday - Presidents’ Day |
2/22/2016 | Kernel Methods, SVM | HW#2 due, HW#3 out
2/29/2016 | Geometric Understanding of SVM, Boosting |
3/7/2016 | Quiz 1, Pragmatics: Comparing and Evaluating Classifiers |
3/14/2016 | Spring Break - Holiday | HW#3 due
3/21/2016 | Neural Networks and Deep Learning | Mini Project details out, HW#4 out
3/28/2016 | Clustering, Mixture Models, EM Algorithm |
4/4/2016 | Large-scale Learning for Big Data, Dimensionality Reduction | HW#4 due, HW#5 out
4/11/2016 | Kernel PCA, HMM | Mid-term report due
4/18/2016 | Graphical Models, Recommender Systems, Course Review | HW#5 due
4/25/2016 | Quiz 2 (Last week of classes) |
5/2/2016 | Mini Project | Mini Project - Final Report due

Readings