Author(s): Mierswa, Ingo
Title: Making indefinite kernel learning practical
Language (ISO): en
Abstract: In this paper we embed evolutionary computation into statistical learning theory. First, we outline the connection between large-margin optimization and statistical learning and discuss why this paradigm is successful for many pattern recognition problems. We then embed evolutionary computation into the most prominent representative of this class of learning methods, namely Support Vector Machines (SVMs). In contrast to former applications of evolutionary algorithms to SVMs, we do not merely optimize the method or kernel parameters; instead, we use evolution strategies to directly solve the posed constrained optimization problem. Transforming the problem into the Wolfe dual reduces the total runtime and allows the use of kernel functions just as for traditional SVMs. We show that evolutionary SVMs are at least as accurate as their quadratic-programming counterparts on eight real-world benchmark data sets in terms of generalization performance, and they always outperform traditional approaches in terms of the original optimization problem. Additionally, the proposed algorithm is more generic than existing traditional solutions, since it also works for non-positive semidefinite or indefinite kernel functions. The evolutionary SVM variants frequently outperform their quadratic-programming competitors in cases where such an indefinite kernel function is used.
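
Note: The following Python sketch is not the report's implementation; it only illustrates the general idea described in the abstract, namely maximizing the SVM dual objective W(alpha) = sum_i alpha_i - 1/2 sum_{i,j} alpha_i alpha_j y_i y_j K(x_i, x_j) with a simple (mu + lambda) evolution strategy under the box constraints 0 <= alpha_i <= C. The equality constraint sum_i alpha_i y_i = 0 is omitted for brevity, and all names and parameter values (es_svm_dual, sigma, the tanh kernel in the toy usage) are illustrative assumptions. Because no quadratic-programming solver is involved, the kernel matrix K does not have to be positive semidefinite.

# Minimal sketch (assumption, not the author's code): evolution strategy
# directly maximizing the SVM dual objective, usable with indefinite kernels.
import numpy as np

def dual_objective(alpha, y, K):
    """SVM dual objective W(alpha) for labels y and kernel matrix K."""
    return alpha.sum() - 0.5 * (alpha * y) @ K @ (alpha * y)

def es_svm_dual(K, y, C=1.0, mu=10, lam=40, sigma=0.1, generations=500, seed=0):
    """(mu + lambda) evolution strategy over the dual variables alpha."""
    rng = np.random.default_rng(seed)
    n = len(y)
    # Initial parent population: random feasible alphas in the box [0, C].
    parents = rng.uniform(0.0, C, size=(mu, n))
    for _ in range(generations):
        # Offspring: Gaussian mutation of randomly chosen parents,
        # clipped back into the feasible box [0, C].
        idx = rng.integers(0, mu, size=lam)
        offspring = parents[idx] + rng.normal(0.0, sigma, size=(lam, n))
        offspring = np.clip(offspring, 0.0, C)
        # Plus selection: keep the mu best individuals of parents and offspring.
        pool = np.vstack([parents, offspring])
        fitness = np.array([dual_objective(a, y, K) for a in pool])
        parents = pool[np.argsort(fitness)[-mu:]]
    return parents[-1]  # best individual found

if __name__ == "__main__":
    # Toy usage with an indefinite sigmoid (tanh) kernel on random data.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    K = np.tanh(0.5 * X @ X.T - 1.0)  # not positive semidefinite in general
    alpha = es_svm_dual(K, y)
    print("dual objective:", dual_objective(alpha, y, K))

The toy usage applies the sketch to a sigmoid (tanh) kernel, an example of the indefinite kernel functions for which the abstract reports the evolutionary variants to be most useful.
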
Keywords: Evolutionary computation
Kernel parameter
Pattern recognition
Statistical learning theory
Support vector machines
URI: http://hdl.handle.net/2003/23076
http://dx.doi.org/10.17877/DE290R-1946
Issue date: 2006-11-10T07:45:45Z
Appears in collections: Sonderforschungsbereich (SFB) 475

Files in this item:
File: tr41-06.pdf  Description: DNB  Size: 228.03 kB  Format: Adobe PDF


This item is protected by copyright (rightsstatements.org).


