Matlab Exercise for Polynomial Approximation and Genetic Search



[For You Of Little Faith]…



1.

Generate a random variable in MATLAB (or GAUSS), such as x = randn(100,1);

Then generate a crazy nonlinear function of it, such as y = sin(x).^2

Try to approximate this function with a linear approximation (include a constant).

Then try a polynomial approximation of the third or fourth order, i.e., do a regression of y on a constant, x, and the second, third, and fourth powers of x.
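For instance, a minimal sketch of these first two fits (the variable names here are just illustrative):

=============================================

x = randn(100,1);                        % random regressor
y = sin(x).^2;                           % "crazy" nonlinear function of x

Xlin  = [ones(100,1) x];                 % constant and x
Xpoly = [ones(100,1) x x.^2 x.^3 x.^4];  % constant through the fourth power of x

blin  = Xlin \ y;                        % OLS coefficients, linear approximation
bpoly = Xpoly \ y;                       % OLS coefficients, polynomial approximation

=============================================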


Now try using Tchebeycheff approximations and Hermite approximations. You can create the expansions with the following MATLAB programs for these two methods:

============================================

function txx = chebjudd(x, degree);

% Output: matrix of Chebyshev polynomial expansion
% Input: x, degree of expansion

[rr, cx] = size(x);

txx(:,1) = ones(rr,1);
txx(:,2) = x;

for i = 3:degree,
    % Chebyshev recurrence: T_n(x) = 2x*T_{n-1}(x) - T_{n-2}(x)
    txx(:,i) = 2 .* x .* txx(:,i-1) - txx(:,i-2);
end

txx = txx(:,2:end);   % drop the constant column (add it back in the regression)



=============================================================

function hhxx = hermitejudd(x, degree);

% Output: matrix of Hermite polynomial expansion
% Input: x, degree of expansion

[rx, cx] = size(x);

hhxx(:,1) = ones(rx,1);
hhxx(:,2) = 2*x;

for i = 3:degree,
    % Hermite recurrence: H_n(x) = 2x*H_{n-1}(x) - 2(n-1)*H_{n-2}(x);
    % column i holds H_{i-1}, so the second coefficient is 2*(i-2)
    hhxx(:,i) = 2 .* x .* hhxx(:,i-1) - 2 .* (i-2) .* hhxx(:,i-2);
end

hhxx = hhxx(:,2:end);   % drop the constant column (add it back in the regression)


===========================================================

You can paste these into MATLAB m-files as subfunctions. Generate two new sets of regressors, one as Tchebeycheff expansions of x and another as Hermite expansions.

Then do a regression of y on each set of regressors. Compare the fits (the R-squared statistics).

Oh you of little faith!
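Continuing from the sketch above (with x, y, Xlin, and Xpoly as defined there, and assuming chebjudd.m and hermitejudd.m are saved on the MATLAB path), the comparison might look roughly like this; the r2 helper below is just one way to compute the R-squared statistic:

=============================================

Xcheb = [ones(100,1) chebjudd(x, 5)];        % Chebyshev terms of orders 1 through 4
Xherm = [ones(100,1) hermitejudd(x, 5)];     % Hermite terms of orders 1 through 4

% R-squared of an OLS regression of y on X (X must include a constant)
r2 = @(X, y) 1 - sum((y - X*(X\y)).^2) / sum((y - mean(y)).^2);

[r2(Xlin, y)  r2(Xpoly, y)  r2(Xcheb, y)  r2(Xherm, y)]      % compare the fits

=============================================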


2.

Simple Genetic Algorithm Optimization (and the Quasi-Newton BFGS Optimization Routine).


Create a nonlinear function of two variables with an easy analytical solution (so we can compare accuracy). Use this function:

===========================================

function y = ff(x);

% Simple quadratic test function of two variables
y = .5 * x(1)^2 - 3 * x(1) + .5 * x(2)^2 - x(2);


We know, of course, that the solution for minimizing y is x(1) = 3 and x(2) = 1 (set the partial derivatives x(1) - 3 and x(2) - 1 to zero). But we want to see how these two routines get there, and how fast and how accurately.



1. First create an initial guess, say a random one: x0 = randn(1,2);


Then invoke the genetic algorithm:



Use these commands: maxgen = 30; popsize = 100;

Then use this command: x = gen8f('ff', popsize, maxgen, x0);

See what happens. The genetic algorithm gen8f.m is on the web page. So are chebjudd.m, hermitejudd.m, and ff.m. You should be able to program the OLS regressions.


Finally, do a quasi-Newton BFGS optimization:

Program this: xx = fminunc('ff', x0);


Compare the quasi-Newton BFGS solution and the GA solution. Pretty good, heyna?
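Putting it all together, a minimal sketch of the comparison (assuming ff.m and gen8f.m from the web page are on the MATLAB path, and that gen8f returns the best point it finds in the same 1-by-2 shape as x0):

=============================================

x0 = randn(1,2);                          % random initial guess
maxgen = 30;  popsize = 100;

xga = gen8f('ff', popsize, maxgen, x0);   % genetic algorithm solution
xqn = fminunc('ff', x0);                  % quasi-Newton BFGS solution

xtrue = [3 1];                            % analytical minimizer
[norm(xga - xtrue)  norm(xqn - xtrue)]    % distance of each solution from the truth

=============================================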