Making indefinite kernel learning practical
Date
2006-11-10T07:45:45Z
Abstract
In this paper we embed evolutionary computation into statistical learning theory. First, we outline the connection between large margin optimization and statistical learning and explain why this paradigm is successful for many pattern recognition problems. We then embed evolutionary computation into the most prominent representative of this class of learning methods, namely Support Vector Machines (SVMs). In contrast to former applications of evolutionary algorithms to SVMs, we do not merely optimize the method or kernel parameters. Instead, we use evolution strategies to directly solve the posed constrained optimization problem. Transforming the problem into the Wolfe dual reduces the total runtime and allows the use of kernel functions just as for traditional SVMs. We show that evolutionary SVMs are at least as accurate as their quadratic programming counterparts on eight real-world benchmark data sets in terms of generalization performance, and they always outperform traditional approaches in terms of the original optimization problem. Additionally, the proposed algorithm is more generic than existing traditional solutions, since it also works for non-positive semidefinite or indefinite kernel functions. The evolutionary SVM variants frequently outperform their quadratic programming competitors in cases where such an indefinite kernel function is used.
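
The abstract's central idea, maximizing the Wolfe dual of the SVM directly with an evolution strategy so that the kernel matrix need not be positive semidefinite, can be illustrated with a short sketch. The following is a minimal illustration, not the paper's actual implementation: the (mu + lambda) selection scheme, the penalty weight for the equality constraint, the toy data, and the indefinite sigmoid kernel are all assumptions made for this example.

import numpy as np

def wolfe_dual(alpha, y, K):
    # Dual objective: sum_i alpha_i - 1/2 * alpha^T (y y^T * K) alpha
    return alpha.sum() - 0.5 * alpha @ ((np.outer(y, y) * K) @ alpha)

def fitness(alpha, y, K, penalty=1e3):
    # The equality constraint sum_i alpha_i y_i = 0 is handled as a penalty;
    # the box constraints 0 <= alpha_i <= C are enforced by clipping mutants.
    return wolfe_dual(alpha, y, K) - penalty * abs(alpha @ y)

def es_svm(K, y, C=1.0, mu=10, lam=40, sigma=0.1, generations=300, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    pop = rng.uniform(0.0, C, size=(mu, n))
    for _ in range(generations):
        parents = pop[rng.integers(0, mu, size=lam)]
        mutants = np.clip(parents + rng.normal(0.0, sigma, (lam, n)), 0.0, C)
        union = np.vstack([pop, mutants])
        scores = np.array([fitness(a, y, K) for a in union])
        pop = union[np.argsort(scores)[-mu:]]  # plus-selection: keep the mu best
    return pop[-1]  # individual with the highest fitness

# Toy usage with a sigmoid kernel tanh(a <x, z> + b), which is in general
# not positive semidefinite, so standard QP solvers are not guaranteed to work:
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(2.0, 1.0, (20, 2)), rng.normal(-2.0, 1.0, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)
K = np.tanh(0.5 * (X @ X.T) - 1.0)
alpha = es_svm(K, y)
print("dual objective:", wolfe_dual(alpha, y, K))
print("constraint violation |alpha . y|:", abs(alpha @ y))

Because the evolution strategy only ever evaluates the dual objective, it makes no convexity assumption; with an indefinite kernel the dual is no longer concave, which is exactly the regime where the QP-based solvers lose their guarantees.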
Keywords
Evolutionary computation, Kernel parameter, Pattern recognition, Statistical learning theory, Support vector machines