Chapter 9: Support Vector Machine

Dr. Essam Al Daoud


The SVM classification function F(x) takes the form
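$$F(x) = w^{T}x + b$$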


where w is the weight vector and b is the bias, both of which are computed by the SVM during training; the predicted class of a point x is given by the sign of F(x). A linear classifier (hyperplane) in a two-dimensional space can be represented as in the following figure:

[Figure: a linear classifier (hyperplane) separating two classes in two dimensions]





Hard-margin SVM Classification


[Figure: the hyperplane maximizing the margin in a two-dimensional space]


Thus, the SVM problem can be written as follows:
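$$\min_{w,b}\ \frac{1}{2}\lVert w\rVert^{2} \quad \text{subject to} \quad y_i\,(w^{T}x_i + b) \ge 1, \qquad i = 1,\dots,n,$$

where $y_i \in \{-1, +1\}$ is the class label of training point $x_i$.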





The primal problem deals with a convex cost function and linear constraints. Given such a constrained optimization problem, it is possible to construct another problem, called the dual problem:
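In terms of the Lagrange multipliers $\alpha_i$, the dual of the hard-margin problem is

$$\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\,y_i y_j\,x_i^{T}x_j \quad \text{subject to} \quad \sum_{i=1}^{n}\alpha_i y_i = 0,\quad \alpha_i \ge 0.$$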





Having determined the optimum Lagrange multipliers, we may compute the optimum weight vector:
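$$w = \sum_{i=1}^{n}\alpha_i\,y_i\,x_i$$

Only the support vectors, the points with $\alpha_i > 0$, contribute to this sum.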







Soft-margin SVM Classification


Soft-margin SVM allows mislabeled data points while still maximizing the margin. The method introduces slack variables $\xi_i$, which measure the degree of misclassification. The following is the optimization problem for soft-margin SVM:
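$$\min_{w,b,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_i \quad \text{subject to} \quad y_i\,(w^{T}x_i + b) \ge 1 - \xi_i,\quad \xi_i \ge 0,$$

where the parameter $C > 0$ controls the trade-off between a wide margin and the penalty paid for misclassified points.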





Kernel Trick for Nonlinear Classification


If the training data is not linearly separable, there is no straight hyperplane that can separate the classes. In order to learn a nonlinear function in that case, linear SVMs must be extended to nonlinear SVMs for the classification of nonlinearly separable data. The dual problem is now defined using the kernel function as follows:
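$$\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\,y_i y_j\,K(x_i, x_j) \quad \text{subject to} \quad \sum_{i=1}^{n}\alpha_i y_i = 0,\quad 0 \le \alpha_i \le C,$$

where the kernel $K(x_i, x_j)$ replaces the inner product $x_i^{T}x_j$ of the linear case.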









Kernel functions must be continuous, symmetric, and most preferably should have a positive (semi-)definite Gram matrix, meaning their kernel matrices have no negative eigenvalues. The use of a positive definite kernel ensures that the optimization problem will be convex and the solution will be unique.

The following are popularly used kernel functions.


1. Linear Kernel
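$$K(x, z) = x^{T}z$$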


2. Polynomial Kernel
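$$K(x, z) = (x^{T}z + c)^{d},$$

where $d$ is the degree of the polynomial and $c \ge 0$ is a constant.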





3. Gaussian Kernel
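$$K(x, z) = \exp\!\left(-\frac{\lVert x - z\rVert^{2}}{2\sigma^{2}}\right),$$

where $\sigma$ is a width parameter chosen by the user.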


4. Exponential Kernel
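$$K(x, z) = \exp\!\left(-\frac{\lVert x - z\rVert}{2\sigma^{2}}\right),$$

which is closely related to the Gaussian kernel, differing only in using the norm instead of its square.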




SVM steps

1- Construct the Gram matrix by using a kernel function.
2- Solve the maximization problem (a famous algorithm is Sequential Minimal Optimization, SMO).
3- Find the support vectors.
4- Find b as follows:
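$$b = y_s - \sum_{i \in SV}\alpha_i\,y_i\,K(x_i, x_s)$$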



where $(x_s, y_s)$ is any one of the support vectors.


5- Classify the test vector z according to the sign of the decision function:
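$$f(z) = \operatorname{sign}\!\left(\sum_{i \in SV}\alpha_i\,y_i\,K(x_i, z) + b\right)$$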





Example

Classify (−1, 0) and (1, 0) by SVM and a polynomial kernel.

Solution
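Assume the labels are $y_1 = -1$ for $x_1 = (-1, 0)$ and $y_2 = +1$ for $x_2 = (1, 0)$, and take the polynomial kernel $K(x, z) = (x^{T}z + 1)^{2}$. The Gram matrix is then

$$K = \begin{pmatrix} 4 & 0 \\ 0 & 4 \end{pmatrix},$$

since $K(x_1, x_1) = K(x_2, x_2) = (1 + 1)^{2} = 4$ and $K(x_1, x_2) = (-1 + 1)^{2} = 0$.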











The maximization problem with this polynomial kernel is
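$$\max_{\alpha}\ \alpha_1 + \alpha_2 - 2\alpha_1^{2} - 2\alpha_2^{2} \quad \text{subject to} \quad \alpha_2 - \alpha_1 = 0,\quad \alpha_1, \alpha_2 \ge 0.$$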



Let the solution of the above problem be:
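Substituting $\alpha_1 = \alpha_2 = \alpha$ (forced by the equality constraint) turns the objective into $2\alpha - 4\alpha^{2}$, which is maximized at

$$\alpha_1 = \alpha_2 = \frac{1}{4}.$$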




Then the support vectors are $x_1$ and $x_2$, and b is computed as follows:
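$$b = y_2 - \sum_{i \in SV}\alpha_i\,y_i\,K(x_i, x_2) = 1 - \left(\frac{1}{4}(-1)(0) + \frac{1}{4}(1)(4)\right) = 0.$$

The resulting decision function is

$$f(z) = \operatorname{sign}\!\left(\frac{1}{4}\left[(z^{T}x_2 + 1)^{2} - (z^{T}x_1 + 1)^{2}\right]\right) = \operatorname{sign}(z_1),$$

so a test point is classified by the sign of its first coordinate: $(-1, 0)$ falls in class $-1$ and $(1, 0)$ in class $+1$.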

























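As a cross-check on the example, the following is a minimal Matlab sketch of the five SVM steps, solving the dual with quadprog (Optimization Toolbox) instead of SMO; the data, labels, and kernel are the ones assumed in the example above.

X = [-1 0; 1 0];                  % training points, one per row
y = [-1; 1];                      % assumed class labels
C = 1e6;                          % a large C approximates the hard margin
n = size(X,1);

K = (X*X' + 1).^2;                % Step 1: polynomial kernel (x'*z + 1)^2

% Step 2: quadprog minimizes 0.5*a'*H*a + f'*a, so the dual maximization
% is rewritten with H = (y*y').*K and f = -1, subject to y'*alpha = 0
H = (y*y').*K;
f = -ones(n,1);
alpha = quadprog(H, f, [], [], y', 0, zeros(n,1), C*ones(n,1));

sv = find(alpha > 1e-6);          % Step 3: support vectors have alpha > 0

s = sv(1);                        % Step 4: b from any one support vector
b = y(s) - sum(alpha(sv).*y(sv).*K(sv,s));

z = [0.5 0];                      % Step 5: classify a test vector z
Kz = (X(sv,:)*z' + 1).^2;         % kernel values K(x_i, z)
label = sign(sum(alpha(sv).*y(sv).*Kz) + b)

This should recover alpha = [1/4; 1/4] and b = 0, matching the hand computation above.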


Matlab

The following script trains a linear SVM on the first two features of Fisher's iris data (setosa vs. the rest), estimates the correct classification rate on a hold-out set, and then retrains with a large box constraint:








load fisheriris
data = [meas(:,1), meas(:,2)];       % first two measurements as features
groups = ismember(species,'setosa'); % binary labels: setosa vs. the rest

[train, test] = crossvalind('holdOut',groups);
cp = classperf(groups);              % object that tracks performance

% Train a linear-kernel SVM and plot the decision boundary
svmStruct = svmtrain(data(train,:),groups(train),'Kernel_Function','linear','ShowPlot',true);

classes = svmclassify(svmStruct,data(test,:),'ShowPlot',true);

classperf(cp,classes,test);
cp.CorrectRate

% Retrain with a large box constraint and compare
figure
svmStruct = svmtrain(data(train,:),groups(train),'ShowPlot',true,'BoxConstraint',1e6);

classes = svmclassify(svmStruct,data(test,:),'ShowPlot',true);

classperf(cp,classes,test);
cp.CorrectRate
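The 'BoxConstraint' parameter is the soft-margin C from the formulation above, so a value as large as 1e6 effectively enforces a hard margin. Note that svmtrain and svmclassify belong to older Matlab releases and have since been removed; in current releases the corresponding calls are fitcsvm and predict.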