SVM classifier for Multi Class Image Classification


Team Name: Tagme203
Team Members: Thumar Rushik, Sharma Chandresh

Method Summary:
1) Software Used: MATLAB 2013a
2) Feature Extraction Process: the given feature vectors are used directly for classification.
3) Similarity/Distance Measure: distance from the hyperplane (the hyperplane separates two classes; in this case the hyperplane is linear).
4) Classifier: Support Vector Machine (SVM)
5) Reference: O. Chapelle, P. Haffner, V. N. Vapnik, "Support Vector Machines for Histogram-Based Image Classification", IEEE Transactions on Neural Networks, Vol. 10, No. 5, Sept. 1999.

Algorithm:
1) Feature Extraction Step: the given data are used directly as features.
2) Training Algorithm:
   I. Input format: feature vectors and labels
   II. Tunable parameters: boxconstraint and kernel function
   III. Output format: a structure that includes all the parameters of the trained SVM classifier (see the call sketch below)
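The training call can be sketched in MATLAB as follows. This is a minimal illustration rather than the authors' exact code: the variable names (X, y, svmStruct) and the toy data are hypothetical, while svmtrain and its options are the documented MATLAB 2013a (Statistics Toolbox) interface.

    % Hypothetical toy data: 100 feature vectors with 4 dimensions each
    X = randn(100, 4);                      % input: feature vectors
    y = [ones(50, 1); 2 * ones(50, 1)];     % input: class labels
    % Tunable parameters: kernel function and box constraint
    svmStruct = svmtrain(X, y, ...
        'kernel_function', 'linear', ...    % linear separating hyperplane
        'boxconstraint', 1);                % soft-margin constant C (default 1)
    % Output: a structure with all parameters of the trained SVM classifier
    disp(svmStruct)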

In this work, we use the indirect multiclass SVM method one-against-one for classification: for N classes, N*(N-1)/2 classifiers are built, one for each pair of classes. Here N = 5, so 10 classifiers are built in total. Each classifier is trained with MATLAB's svmtrain function (http://www.mathworks.in/help/stats/svmtrain.html). The optimal boundary between two classes is chosen through the boxconstraint and kernel function parameters; a training sketch is given after this step.

3) Validation and Parameter Tuning:

boxconstraint parameter: the value of the box constraint C for the soft margin. C can be a scalar, or a vector of the same length as the training data. If C is a scalar, it is automatically rescaled by N/(2*N1) for the data points of group one and by N/(2*N2) for the data points of group two, where N1 is the number of elements in group one, N2 is the number of elements in group two, and N = N1 + N2 (here N denotes the number of training points, not the number of classes). This rescaling accounts for unbalanced groups, that is, cases where N1 and N2 have very different values; for example, with N1 = 30 and N2 = 70, C is scaled by 100/60 for group one and by 100/140 for group two. If C is an array, each array element is taken as the box constraint for the data point with the same index. Default: 1
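A minimal MATLAB sketch of the one-against-one training stage follows, assuming features is an M-by-D matrix and labels is an M-by-1 vector with values 1..5; these variable names are hypothetical, not the authors' code.

    % One-against-one training: N*(N-1)/2 = 10 pairwise classifiers for N = 5
    N = 5;
    models = {};                            % trained SVM structures
    pairs = zeros(0, 2);                    % which pair of classes each model separates
    for i = 1:N-1
        for j = i+1:N
            idx = (labels == i) | (labels == j);      % keep only classes i and j
            models{end+1} = svmtrain(features(idx, :), labels(idx), ...
                'kernel_function', 'linear', ...      % linear hyperplane
                'boxconstraint', 1);                  % scalar C, rescaled per group
            pairs(end+1, :) = [i, j];
        end
    end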

Kernel function: the kernel function that svmtrain uses to map the training data into kernel space. We used the linear kernel function, because most of the literature reports that a linear kernel gives better accuracy. Default: the kernel function is linear.
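For tuning, the kernel is swapped through the same svmtrain option. A hedged two-line sketch, reusing the hypothetical features, labels, and idx from the training sketch above:

    % Compare the default linear kernel against an RBF kernel on one class pair
    mLinear = svmtrain(features(idx, :), labels(idx), 'kernel_function', 'linear');
    mRbf    = svmtrain(features(idx, :), labels(idx), 'kernel_function', 'rbf');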

4) Prediction Algorithm: In the one-against-one scheme, each class is compared against every other class, and the final decision is based on a voting scheme. Labels are predicted with MATLAB's svmclassify function (http://www.mathworks.in/help/stats/svmclassify.html). After all pairwise comparisons, the class that receives the maximum number of votes is assigned as the label. Figure 1 shows the architecture of the prediction scheme for the five classes considered earlier.
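A minimal sketch of the voting-based prediction, reusing the hypothetical models, pairs, and N variables from the training sketch; testFeature (a 1-by-D vector) is likewise assumed.

    % One-against-one prediction: each pairwise classifier casts one vote
    votes = zeros(1, N);
    for k = 1:numel(models)
        pred = svmclassify(models{k}, testFeature);   % label from the k-th classifier
        votes(pred) = votes(pred) + 1;                % pred is one of pairs(k, :)
    end
    [~, predictedLabel] = max(votes);                 % class with the most votes wins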

Figure 1: Prediction algorithm of the one-against-one classifier.