Principal Component Analysis
Revision as of 00:05, 17 September 2011
Class notes
- Download the projector slides for the 16 September 2011 class: lvm-class-2.pdf [1.65 Mb]
- Also download these three CSV files and bring them to class on your computer:
  - Peas dataset: http://datasets.connectmv.com/info/peas
  - Food texture dataset: http://datasets.connectmv.com/info/food-texture
  - Food consumption dataset: http://datasets.connectmv.com/info/food-consumption
Class preparation
Class 2 (16 September)
- Reading for class 2: http://literature.connectmv.com/item/13/principal-component-analysis
- Linear algebra topics you should be familiar with before class 2 (see the short NumPy sketch after this list):
  - matrix multiplication
  - that multiplying a vector by a matrix is a transformation from one coordinate system to another (we will review this in class)
  - linear combinations (read the first section of http://en.wikipedia.org/wiki/Linear_combination; we will review this in class)
  - the dot product of two vectors, and that it is related to the cosine of the angle between them (see the geometric interpretation section of http://en.wikipedia.org/wiki/Dot_product)
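
To tie these topics together, here is a minimal NumPy sketch (illustrative only: the vectors and the 45-degree rotation are made up, not part of the course material):

```python
import numpy as np

# 1. Matrix multiplication as a coordinate transformation:
#    multiplying by a rotation matrix expresses the vector in a
#    coordinate system rotated by 45 degrees.
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([1.0, 0.0])
print("Rotated vector:", R @ v)

# 2. A linear combination: a weighted sum of vectors.
a = np.array([1.0, 2.0])
b = np.array([3.0, -1.0])
print("2a + 3b =", 2 * a + 3 * b)

# 3. The dot product and the angle between two vectors:
#    a . b = |a| |b| cos(angle)
cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print("Angle between a and b:", np.degrees(np.arccos(cos_angle)), "degrees")
```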
Class 3 (23 September)
- Least squares (http://stats4eng.connectmv.com/wiki/Least_squares_modelling); see the NumPy sketch below:
  - what the objective function of least squares is
  - how to calculate the two regression coefficients \(b_0\) and \(b_1\) for \(y = b_0 + b_1 x + e\)
  - that the residuals in least squares are orthogonal to \(x\)
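
A minimal NumPy sketch of these three points (the data values are made up for illustration, not a course dataset):

```python
import numpy as np

# Made-up data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Objective: choose b0, b1 to minimize the sum of squared residuals,
# sum(e_i^2), where e_i = y_i - (b0 + b1 * x_i).
# The closed-form solution is:
#   b1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   b0 = ybar - b1 * xbar
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

# With an intercept in the model, the residuals are orthogonal to x:
# their dot product is zero, up to floating-point round-off.
e = y - (b0 + b1 * x)
print("b0 =", b0, " b1 =", b1, " e.x =", np.dot(e, x))
```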
- Some optimization theory (a small worked example follows this list):
  - how an optimization problem with equality constraints is written
  - the Lagrange multiplier principle (http://en.wikipedia.org/wiki/Lagrange_multiplier) for solving simple, equality-constrained optimization problems
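
As a refresher on the Lagrange multiplier principle (a standard textbook example, not taken from the course notes; \(S\) here is any symmetric matrix, such as a covariance matrix), consider maximizing a quadratic form subject to a unit-length constraint:

\[ \max_{\mathbf{a}} \; \mathbf{a}^{\mathsf{T}} S \mathbf{a} \quad \text{subject to} \quad \mathbf{a}^{\mathsf{T}}\mathbf{a} = 1 \]

Form the Lagrangian with multiplier \(\lambda\) and set its gradient with respect to \(\mathbf{a}\) to zero:

\[ L(\mathbf{a}, \lambda) = \mathbf{a}^{\mathsf{T}} S \mathbf{a} - \lambda \left(\mathbf{a}^{\mathsf{T}}\mathbf{a} - 1\right), \qquad \frac{\partial L}{\partial \mathbf{a}} = 2 S \mathbf{a} - 2 \lambda \mathbf{a} = \mathbf{0} \;\Rightarrow\; S \mathbf{a} = \lambda \mathbf{a} \]

so the constrained maximizer is an eigenvector of \(S\), with \(\lambda\) the corresponding eigenvalue. This is the same calculation that produces the first loading vector in PCA.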