STAT 890 / 442, CM 462,
Department of Statistics and Actuarial Science
University of Waterloo
Room: B2 350
Time: MWF 1:30-2:20
Office Hours: Monday 3:00-4:00 pm, or by appointment (MC 6081G)
Geometric methods for feature extraction and dimensional reduction. C. J. C. Burges.
Kernel PCA (pattern reconstruction):
Nonlinear Component Analysis as a Kernel Eigenvalue Problem. B. Schölkopf, A. Smola, and K.-R. Müller.
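As a companion to the kernel PCA reading above, here is a minimal NumPy sketch of the method: build a kernel matrix, center it in feature space, and eigendecompose. The RBF kernel, the `gamma` value, and the toy data are illustrative assumptions, not taken from the readings.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel (illustrative sketch)."""
    # Pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * D2)                       # RBF kernel matrix
    # Center the kernel matrix in feature space
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecompose; np.linalg.eigh returns eigenvalues in ascending order
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]   # largest first
    # Projections of the training points onto the top components
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

X = np.random.RandomState(0).randn(20, 3)
Y = kernel_pca(X, n_components=2)
print(Y.shape)   # (20, 2)
```

Note that, unlike linear PCA, there is no direct inverse map from the embedding back to input space, which is why pattern reconstruction for kernel PCA is a topic of its own.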
Locally Linear Embedding (LLE):
Think Globally, Fit Locally: Unsupervised Learning of Nonlinear Manifolds. L. K. Saul and S. T. Roweis.
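To accompany the LLE readings, a compact sketch of the algorithm's three steps: find nearest neighbors, solve for locally linear reconstruction weights, then take bottom eigenvectors of (I - W)^T (I - W). The regularization constant and the toy data are my own illustrative choices.

```python
import numpy as np

def lle(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Locally Linear Embedding (illustrative sketch)."""
    n = X.shape[0]
    # Step 1: nearest neighbors by squared Euclidean distance
    D2 = np.sum((X[:, None] - X[None, :])**2, axis=-1)
    np.fill_diagonal(D2, np.inf)                  # exclude self
    nbrs = np.argsort(D2, axis=1)[:, :n_neighbors]
    # Step 2: reconstruction weights that sum to one per point
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                     # neighbors centered at x_i
        G = Z @ Z.T                               # local Gram matrix
        G += reg * np.trace(G) * np.eye(n_neighbors)  # regularize (singular otherwise)
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()
    # Step 3: bottom eigenvectors of M = (I - W)^T (I - W),
    # skipping the constant eigenvector with eigenvalue ~0
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    vals, vecs = np.linalg.eigh(M)                # ascending order
    return vecs[:, 1:n_components + 1]

X = np.random.RandomState(0).randn(30, 3)
Y = lle(X, n_neighbors=6, n_components=2)
print(Y.shape)   # (30, 2)
```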
Isomap:
A Global Geometric Framework for Nonlinear Dimensionality Reduction. Joshua B. Tenenbaum, Vin de Silva, and John C. Langford. Science, 22 December 2000, 290: 2319-2323.
MDS, Landmark MDS, and Nyström Approximation:
FastMap, MetricMap, and Landmark MDS are all Nyström Algorithms. J. C. Platt.
Sparse Multidimensional Scaling Using Landmark Points. V. de Silva and J. B. Tenenbaum.
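The MDS readings above all build on classical MDS, which double-centers a squared-distance matrix and eigendecomposes it; Landmark MDS and the Nyström approximation are ways to avoid doing this on the full matrix. A minimal sketch of the classical (full-matrix) method, on toy data of my own choosing:

```python
import numpy as np

def classical_mds(D, n_components=2):
    """Classical MDS from a matrix of pairwise Euclidean distances (sketch)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D**2) @ J                 # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)            # ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    L = np.sqrt(np.maximum(vals[idx], 0.0))   # clip tiny negative eigenvalues
    return vecs[:, idx] * L                   # embedding coordinates

# Recover a 2-D configuration from its own distance matrix
rng = np.random.RandomState(0)
X = rng.randn(10, 2)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, 2)
DY = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
print(np.allclose(D, DY))   # True: distances reproduced up to a rigid motion
```

Landmark MDS applies this eigendecomposition only to a small submatrix of distances among landmark points and extends the embedding to the remaining points, which is exactly the Nyström idea discussed in the Platt paper.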
Semidefinite Embedding (SDE):
Learning a Kernel Matrix for Nonlinear Dimensionality Reduction. K. Q. Weinberger, F. Sha, and L. K. Saul. In Proceedings of the Twenty-First International Conference on Machine Learning (ICML-04). (pdf)
Action Respecting Embedding (ARE):
Clustering (Impossibility Theorem):
An Impossibility Theorem for Clustering. J. Kleinberg.
Distance Metric Learning, with Application to Clustering with Side-Information. E. Xing, A. Ng, M. Jordan, and S. Russell. In Advances in Neural Information Processing Systems 15 (NIPS 2003). (pdf)
Improving Embeddings by Flexible Exploitation of Side Information. A. Ghodsi, D. Wilkinson and F. Southey. In Proceedings of the 20th International Joint Conference on Artificial Intelligence (IJCAI 2006). (pdf)
A Tutorial on Spectral Clustering. Ulrike von Luxburg. (pdf)
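Since the von Luxburg tutorial is on the list, here is a minimal two-cluster spectral partitioning sketch using the unnormalized graph Laplacian and its second eigenvector (the Fiedler vector); the tiny affinity matrix of two weakly bridged cliques is an illustrative assumption.

```python
import numpy as np

def spectral_bipartition(W):
    """Split a weighted graph in two using the Fiedler vector (sketch)."""
    d = W.sum(axis=1)
    L = np.diag(d) - W                     # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)         # ascending eigenvalue order
    fiedler = vecs[:, 1]                   # eigenvector of 2nd-smallest eigenvalue
    return (fiedler > np.median(fiedler)).astype(int)

# Two 3-node cliques joined by a single weak bridge edge
W = np.zeros((6, 6))
W[:3, :3] = 1.0
W[3:, 3:] = 1.0
np.fill_diagonal(W, 0.0)
W[2, 3] = W[3, 2] = 0.1                    # weak bridge
labels = spectral_bipartition(W)
print(labels)                              # nodes 0-2 share one label, 3-5 the other
```

The tutorial also covers normalized variants (random-walk and symmetric Laplacians) and k-way clustering via k-means on several eigenvectors; this sketch shows only the simplest two-way case.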
|Sept 11 and 13||Lectures 1 and 2||Motivation|
|Sept 18 and 20||Lectures 3 and 4||Principal Components Analysis (PCA)|
|Sept 22||Lecture 5||PCA, Kernel Functions|
|Sept 25||Lecture 6||Dual PCA, Kernel PCA|
|Sept 27 and 29||Lectures 7 and 8||Centering, Locally Linear Embedding (LLE) Slides (Examples are taken from this paper.)|
|Oct 4||Lecture 9||Locally Linear Embedding|
|Oct 6||Project Discussion|
|Oct 11 and 13||Lectures 10 and 11||Multidimensional Scaling (MDS), Isomap Slides|
|Oct 16||Lecture 12||Nyström Approximation, Landmark MDS|
|Oct 18||Lecture 13||Landmark MDS|
|Oct 20, 23 and 25||Lectures 14, 15 and 16||Unified Framework, Semidefinite Embedding (SDE)|
|Oct 27||Lecture 17||Landmark SDE|
|Oct 30||Lecture 18||Action Respecting Embedding (ARE)|
|Nov 1||Lecture 19||Clustering|
|Nov 3 and 6||Lectures 20 and 21||Combinatorial Algorithms, K-means Clustering|
|Nov 8 and 10||Lectures 22 and 23||Mixture Models|
|Nov 13 and 15||Lectures 24 and 25||Learning a Metric (Class-Equivalence Side Information)|
|Nov 17||Lecture 26||Learning a Metric (Partial Distance Side Information)|
Assignment 1 (Data for Assignment 1)
Assignment 2 (Data for Assignment 2; code)
Assignment 3 (Clarification)
|October 23||Proposal due|
|November 3||Take-home exam|
|November 20||Presentations begin|
|December 20||Final project reports due|
Final project reports (up to 8 pages of PDF) are worth 25% of your final grade. You are encouraged to choose a topic related to your research area. However, you cannot borrow part of an existing thesis, nor can you re-use a project from another course.
Due Date: Final project reports are due December 20. Hand in your report to Joan Hatton at MC 6028 by 4:00 pm.
If you use ideas, plots, text, or other intellectual property developed by someone else, you must cite the original source.
If you copy a sentence or a paragraph from someone else's work, then in addition to citing the original source you must use quotation marks to mark the scope of the copied material.
Plagiarism is an act of “using ideas, plots, text and other intellectual property developed by someone else while claiming it is your original work.”1
1. Tech Encyclopedia. http://www.answers.com/topic/plagiarism
Evidence of copying or plagiarism will result in a failing mark in the course.
Please attach this cover page to your report.
I use this marking scheme to mark the projects.