Consistency and robustness of kernel based regression

dc.contributor.author: Christmann, Andreas
dc.contributor.author: Steinwart, Ingo
dc.date.accessioned: 2005-01-31T08:15:41Z
dc.date.available: 2005-01-31T08:15:41Z
dc.date.issued: 2005
dc.description.abstract: We investigate properties of kernel-based regression (KBR) methods inspired by the convex risk minimization method of support vector machines. We first describe the relation between the loss function used by the KBR method and the tail behaviour of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In particular, our results allow one to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods with bounded influence functions. Furthermore, bounds for the sensitivity curve, a finite-sample version of the influence function, are developed, and some numerical experiments are discussed.
dc.format.extent: 381952 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/2003/20093
dc.identifier.uri: http://dx.doi.org/10.17877/DE290R-15685
dc.language.iso: en
dc.subject.ddc: 310
dc.title: Consistency and robustness of kernel based regression
dc.type: Text
dc.type.publicationtype: report
dcterms.accessRights: open access
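
The abstract describes KBR as regularized convex risk minimization, with the loss function chosen so that the resulting estimator is both computationally tractable and has a bounded influence function. The following is a minimal, hypothetical sketch (plain numpy, not the authors' code or experiments) of such an estimator: Gaussian-kernel regression fitted by gradient descent on the regularized empirical risk with a Huber-type loss, whose clipped derivative is what limits the pull of any single observation. All parameter values (gamma, delta, lam, lr) are illustrative assumptions.

import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def huber_grad(residual, delta=1.0):
    # Derivative of the Huber loss w.r.t. the prediction: -clip(residual, -delta, delta).
    # Its boundedness is the Lipschitz-type property behind bounded influence.
    return -np.clip(residual, -delta, delta)

def fit_kbr(X, y, lam=0.1, gamma=1.0, delta=1.0, lr=0.05, n_iter=1000):
    # Minimize (1/n) * sum_i L(y_i, f(x_i)) + lam * ||f||_H^2 over the RKHS,
    # with f parametrized via the representer theorem as f = sum_i alpha_i * k(x_i, .).
    K = gaussian_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = y - K @ alpha
        grad = K @ (huber_grad(residual, delta) / n + 2 * lam * alpha)
        alpha -= lr * grad
    return alpha, K

# Toy usage: noisy sine data with one gross outlier; the clipped loss derivative
# keeps the outlier from dominating the fit.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
y[0] += 10.0  # outlier
alpha, K = fit_kbr(X, y)
print("training RMSE (excluding outlier):",
      np.sqrt(np.mean((y[1:] - (K @ alpha)[1:]) ** 2)))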

Files

Original bundle
Name: tr01-05.pdf
Size: 373 KB
Format: Adobe Portable Document Format
Description: DNB