IJPAM: Volume 87, No. 6 (2013)
AND MINIMIZE THE VARIABLES USED FOR REGRESSION



Sathyabama University, Chennai, INDIA

VIT University, Chennai, INDIA
Abstract. Machine Learning is a subfield of Artificial Intelligence
concerned with the development of techniques and methods that enable
computers to learn. In classification problems, generalization control
is obtained by maximizing the margin, which corresponds to minimizing
the weight vector. The minimization of the weight vector can also be
applied to regression problems, together with a suitable loss function.
This paper considers the problem of classification for linearly
separable data and introduces the concept of the margin and the essence
of SVM: margin maximization. It then presents the soft-margin SVM,
which introduces slack variables and the trade-off between maximizing
the margin and minimizing the number of misclassified points. A
presentation of the linear SVM, followed by its extension to the
nonlinear SVM and to SVM regression, is then provided to give the basic
mathematical details. SRM minimizes an upper bound on the expected
risk, whereas ERM minimizes the error on the training data. The paper
also develops how the SVM technique can be used for regression: SVR
attempts to minimize a bound on the generalization error so as to
achieve generalized performance, instead of minimizing the observed
training error.
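For reference, the two optimization problems the abstract alludes to can be written in their standard textbook form (a sketch of the usual formulations, not reproduced from the paper itself): the soft-margin SVM primal, where slack variables trade margin maximization against misclassification, and the ε-insensitive SVR primal, where training errors smaller than ε incur no loss.

```latex
% Soft-margin SVM (classification): the slack variables \xi_i relax
% the margin constraints; C controls the trade-off between margin
% maximization (minimizing \|w\|) and misclassification.
\min_{w,\,b,\,\xi}\;\frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad
y_i\bigl(w^\top x_i + b\bigr) \ge 1 - \xi_i,\qquad \xi_i \ge 0.

% \varepsilon-insensitive SVR (regression): deviations of at most
% \varepsilon from the target y_i are not penalized; \xi_i, \xi_i^*
% absorb deviations above and below the \varepsilon-tube.
\min_{w,\,b,\,\xi,\,\xi^*}\;\frac{1}{2}\|w\|^2
  + C\sum_{i=1}^{n}\bigl(\xi_i + \xi_i^*\bigr)
\quad\text{s.t.}\quad
\begin{cases}
y_i - w^\top x_i - b \le \varepsilon + \xi_i,\\[2pt]
w^\top x_i + b - y_i \le \varepsilon + \xi_i^*,\\[2pt]
\xi_i,\ \xi_i^* \ge 0.
\end{cases}
```

In both problems, minimizing \(\|w\|^2\) is what the abstract refers to as minimization of the weight vector; in the classification problem this maximizes the margin \(2/\|w\|\), while in the regression problem it flattens the estimated function.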
Received: September 6, 2013
AMS Subject Classification: 62H30, 26A24, 90C51, 90C20
Key Words and Phrases: linear and non-linear classification, machine learning, SVM mathematical, SVM trade-off, SVM regression
DOI: 10.12732/ijpam.v87i6.2
Source: International Journal of Pure and Applied Mathematics
ISSN printed version: 1311-8080
ISSN on-line version: 1314-3395
Year: 2013
Volume: 87
Issue: 6