Class date(s): 04 February 2015

Download video links:
- Link [433 M]
- Link [797 M]
- Link [820 M]
- Link [640 M]
- Link [923 M]
- Link [943 M]
- Link [667 M]
- Link [948 M]
- Link [935 M]
Resources
| Date | Class number | Topic | Slides/handouts for class | Video file | References and Notes |
|---|---|---|---|---|---|
| 04 February | 05A | Why consider unconstrained, single-variable problems; review of Newton's method for solving them | Handout from class | Video | |
| 09 February | 06A | Newton's method reviewed again for unconstrained, single-variable problems; using finite differences in place of analytical derivatives in Newton's method; multivariate unconstrained optimization | Handout from class | Video | See the finite-difference Newton sketch at the end of this page |
| 11 February | 06B | Unconstrained single-variable optimization using gradient search; unconstrained multivariate optimization using gradient search; understanding the line search problem | Handout from class | Video | See the gradient-search sketch at the end of this page |
| 16 to 27 February | 07 | Reading week break and midterm | | | |
| 02 March | 08A | Review of unconstrained optimization in two variables; contrast with the single-variable case; extension to the multidimensional Newton's method | Handout from class | Video | |
| 04 March | 08B | Examples of the multidimensional Newton's method; quasi-Newton methods in multiple dimensions; positive and negative definiteness of the Hessian | Handout from class | Video | Code used in class (see below); see also the quasi-Newton sketch at the end of this page |
| 09 March | 09A | Introduction to constrained nonlinear optimization; model formulation (converting a problem to mathematics) | Handout from class | Video | |
| 11 March | 09B | Guest lecture | Handout from class | Video | |
| 16 March | 10A | Convexity and concavity; guarantees on when a solution is globally optimal | Handout from class (continued with handout 09A) | Video | |
| 18 March | 10B | Lagrange multiplier method for constrained optimization; interpretation of the Lagrange multipliers | Handout from class | Video | See the worked example at the end of this page |
| 23 March | 11A | The Nelder-Mead method (several of you are using it in your projects); practice using the Nelder-Mead method: optimize this system: http://yint.org/nm | Handout from class | Video | See the fminsearch (Nelder-Mead) sketch at the end of this page |
Taking full Newton steps to solve the class example
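For reference, the objective being optimized is f(x1, x2) = 4*x1*x2 - 5*(x1-2)^4 - 3*(x2-5)^4 (see func.m below). Each pass through the loop solves H(x)*step = -grad f(x) for the Newton step, then moves the full step length, with no line search.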
% Plot contours of the objective, then take full Newton steps from x = (1, 3).
clear all;    % clear the workspace
close all;    % close any open figures
clc;          % clear the command window

% Contour plot of the objective over the region of interest
[X1, X2] = meshgrid(-0.5:0.1:6, 0:0.01:9);
Z = func(X1, X2);
contour(X1, X2, Z)
hold on
grid on

% Starting point, marked 'o' and labelled '0'
x = [1, 3]';
plot(x(1), x(2), 'o')
text(x(1)+0.2, x(2), '0')

% Ten full Newton steps; each iterate is marked '*' and labelled k
for k = 1:10
    slope = -first_deriv(x);   % negative of the gradient at the current point
    step = hessian(x)\slope;   % solves the Ax=b problem, as x = A\b
    x = x + step;              % full step: no line search
    plot(x(1), x(2), '*')
    text(x(1)+0.1, x(2), num2str(k))
end
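The script calls three helper functions. In MATLAB, each one must be saved in its own file, named after the function it defines: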
func.m
function y = func(x1, x2)
% The objective function, written elementwise (.* and .^) so that it
% also accepts the meshgrid arrays used for the contour plot
y = 4.*x1.*x2 - 5.*(x1-2).^4 - 3.*(x2-5).^4;
first_deriv.m
function y = first_deriv(x)
% Gradient (vector of first partial derivatives) of the objective
y = [4*x(2) - 20*(x(1)-2)^3;
     4*x(1) - 12*(x(2)-5)^3];
hessian.m
function y = hessian(x)
% Hessian (matrix of second partial derivatives) of the objective
y = [-60*(x(1)-2)^2,  4;
      4,             -36*(x(2)-5)^2];
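Class 06A looked at replacing the analytical derivatives in Newton's method with finite differences. Here is a minimal single-variable sketch of that idea; the objective g, the starting guess, and the step size h are illustrative assumptions, not taken from the class notes.

g = @(t) (t - 3).^2 + exp(-t);   % a hypothetical smooth function to minimize
t = 0;                           % starting guess (assumed)
h = 1e-5;                        % finite-difference step size (assumed)
for k = 1:20
    g1 = (g(t+h) - g(t-h)) / (2*h);          % central difference for g'(t)
    g2 = (g(t+h) - 2*g(t) + g(t-h)) / h^2;   % central difference for g''(t)
    t = t - g1/g2;                           % Newton update for g'(t) = 0
end
t   % approximate stationary point (a minimum here, since g is convex)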
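Class 06B covered gradient search. Below is a sketch of it on the same class example, with a small fixed step length standing in for a proper line search; the step length and iteration count are assumptions. The step is taken along the gradient, since this example is a maximization (the Hessian above is negative definite near the solution).

x = [1, 3]';     % same starting point as the Newton script
alpha = 0.01;    % fixed step length (assumed; a line search would choose this)
for k = 1:200
    x = x + alpha*first_deriv(x);   % steepest-ascent step along the gradient
end
x   % approaches the same stationary point found by Newton's method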
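Class 08B discussed quasi-Newton methods and the definiteness of the Hessian. One simple quasi-Newton variant, sketched here under the assumption that only the gradient routine is available: build each column of an approximate Hessian by finite differences of first_deriv, then take the usual full Newton step. The eig call at the end checks the definiteness of the true Hessian at the solution.

x = [1, 3]';
h = 1e-6;   % difference step for the Hessian approximation (assumed)
for k = 1:10
    g = first_deriv(x);
    H = zeros(2, 2);
    for j = 1:2
        e = zeros(2, 1);
        e(j) = h;
        H(:, j) = (first_deriv(x + e) - g)/h;   % j-th column of the approximate Hessian
    end
    x = x - H\g;   % same step as the full Newton code above
end
x
eig(hessian(x))   % all eigenvalues negative: negative definite, so a maximum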
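For class 10B's Lagrange multiplier method, a small worked example of my own choosing (not from the class notes): maximize f(x1, x2) = x1 + x2 subject to g(x1, x2) = x1^2 + x2^2 - 1 = 0. The stationarity condition grad f = lambda * grad g gives 1 = 2*lambda*x1 and 1 = 2*lambda*x2, so x1 = x2; substituting into the constraint and taking the positive root (the maximum) gives x1 = x2 = 1/sqrt(2), with lambda = 1/sqrt(2). The multiplier measures sensitivity: if the constraint's right-hand side is relaxed from 1 to r, the optimal objective is sqrt(2r), whose derivative at r = 1 is exactly lambda.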
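Finally, for class 11A: MATLAB's built-in fminsearch implements the Nelder-Mead simplex method. Since fminsearch minimizes, maximizing the class example means minimizing its negative; the starting point reuses the one from the Newton script.

% Nelder-Mead on the class example via fminsearch (negate func to maximize)
xbest = fminsearch(@(x) -func(x(1), x(2)), [1, 3]')
% xbest should be close to the stationary point found by Newton's method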