Consistency and robustness of kernel based regression
dc.contributor.author | Christmann, Andreas | de |
dc.contributor.author | Steinwart, Ingo | de |
dc.date.accessioned | 2005-01-31T08:15:41Z | |
dc.date.available | 2005-01-31T08:15:41Z | |
dc.date.issued | 2005 | de |
dc.description.abstract | We investigate properties of kernel-based regression (KBR) methods inspired by the convex risk minimization method of support vector machines. We first describe the relation between the loss function used in the KBR method and the tail behavior of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In particular, our results allow one to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods with bounded influence functions. Furthermore, bounds for the sensitivity curve, a finite-sample version of the influence function, are developed, and some numerical experiments are discussed. | de |
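The abstract's central point, that choosing a loss function with a bounded derivative yields a KBR method with a bounded influence function, can be illustrated with a minimal sketch. The code below is not the authors' method; it is an assumed toy implementation that fits a kernel expansion by gradient descent on a regularized empirical risk with a Huber-type loss, whose clipped derivative bounds the pull of any single (possibly outlying) observation. All function names, kernel/loss parameters, and step sizes are illustrative choices.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix between two 1-D sample arrays."""
    return np.exp(-gamma * (X1[:, None] - X2[None, :]) ** 2)

def huber_psi(r, delta=1.0):
    """Derivative of the Huber loss: clipped, hence bounded influence."""
    return np.clip(r, -delta, delta)

def fit_kbr(X, y, gamma=1.0, lam=0.01, delta=1.0, lr=0.1, n_iter=3000):
    """Minimize (1/n) sum Huber(y_i - f(x_i)) + lam * ||f||^2 over
    f(x) = sum_j alpha_j k(x_j, x), via plain gradient descent."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        resid = y - K @ alpha
        # Bounded psi caps each observation's contribution to the gradient.
        grad = -(K @ huber_psi(resid, delta)) / n + 2.0 * lam * (K @ alpha)
        alpha -= lr * grad
    return alpha

def predict(alpha, X_train, X_new, gamma=1.0):
    """Evaluate the fitted kernel expansion at new inputs."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

As a usage sketch, fitting noiseless sine data with one gross outlier shows that the clipped loss derivative keeps the fit from chasing the contaminated point, which is the finite-sample face of a bounded influence function.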
dc.format.extent | 381952 bytes | |
dc.format.mimetype | application/pdf | |
dc.identifier.uri | http://hdl.handle.net/2003/20093 | |
dc.identifier.uri | http://dx.doi.org/10.17877/DE290R-15685 | |
dc.language.iso | en | de |
dc.subject.ddc | 310 | de |
dc.title | Consistency and robustness of kernel based regression | en |
dc.type | Text | de |
dc.type.publicationtype | report | en |
dcterms.accessRights | open access |