Full metadata record
DC Field | Value | Language
dc.contributor.author | Röhl, Michael C. | de
dc.contributor.author | Weihs, Claus | de
dc.date.accessioned | 2004-12-06T18:38:33Z | -
dc.date.available | 2004-12-06T18:38:33Z | -
dc.date.issued | 1998 | de
dc.identifier.uri | http://hdl.handle.net/2003/4854 | -
dc.identifier.uri | http://dx.doi.org/10.17877/DE290R-5423 | -
dc.description.abstract | We describe a computer-intensive method for linear dimension reduction that minimizes the classification error directly. Simulated annealing (Bohachevsky et al. (1986)) is used to solve this problem. The classification error is determined by exact integration. We avoid distance or scatter measures, which are only surrogates used to circumvent the classification error. Simulations (in two dimensions) and analytical approximations demonstrate the superiority of optimal classification over the classical procedures. We compare our procedure to the well-known canonical discriminant analysis (homoscedastic case) as described in McLachlan (1992) and to a method by Young et al. (1987) for the heteroscedastic case. Special emphasis is put on the case where the distance-based methods collapse. The computer-intensive algorithm always achieves the minimal classification error. | en
dc.format.extent | 248607 bytes | -
dc.format.extent | 308298 bytes | -
dc.format.mimetype | application/pdf | -
dc.format.mimetype | application/postscript | -
dc.language.iso | en | de
dc.publisher | Universitätsbibliothek Dortmund | de
dc.subject.ddc | 310 | de
dc.title | Optimal vs. Classical Linear Dimension Reduction | en
dc.type | Text | de
dc.type.publicationtype | report | en
dcterms.accessRights | open access | -
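The abstract above describes finding a linear projection by minimizing the classification error directly with simulated annealing. A minimal sketch of that idea, with hypothetical example data (two Gaussian classes in two dimensions, projected to one dimension); note the paper computes the classification error by exact integration, whereas this sketch approximates it by the empirical misclassification rate of a plug-in Gaussian classifier:

```python
# Sketch: optimal 2-D -> 1-D linear dimension reduction via simulated annealing.
# Illustrative only; all data and parameters here are hypothetical, and the
# error is estimated on a sample rather than by exact integration.
import numpy as np

rng = np.random.default_rng(0)

# Two heteroscedastic Gaussian classes in 2-D.
n = 500
x0 = rng.normal([0.0, 0.0], [1.0, 0.2], size=(n, 2))
x1 = rng.normal([0.0, 1.0], [1.0, 0.2], size=(n, 2))

def error(theta):
    """Misclassification rate after projecting onto direction angle theta."""
    d = np.array([np.cos(theta), np.sin(theta)])
    z0, z1 = x0 @ d, x1 @ d
    # Plug-in 1-D Gaussian classifier on the projected data.
    m0, s0 = z0.mean(), z0.std()
    m1, s1 = z1.mean(), z1.std()
    def loglik(z, m, s):
        return -np.log(s) - 0.5 * ((z - m) / s) ** 2
    e0 = np.mean(loglik(z0, m0, s0) < loglik(z0, m1, s1))
    e1 = np.mean(loglik(z1, m1, s1) < loglik(z1, m0, s0))
    return 0.5 * (e0 + e1)

# Simulated annealing over the projection angle: accept worse candidates
# with probability exp(-delta / temp), cooling the temperature geometrically.
theta = rng.uniform(0.0, np.pi)
best_theta, best_err = theta, error(theta)
temp = 1.0
for _ in range(2000):
    cand = theta + rng.normal(0.0, 0.3)
    delta = error(cand) - error(theta)
    if delta < 0 or rng.uniform() < np.exp(-delta / temp):
        theta = cand
    if error(theta) < best_err:
        best_theta, best_err = theta, error(theta)
    temp *= 0.995

print(f"best angle: {best_theta % np.pi:.3f} rad, error: {best_err:.3f}")
```

In this toy configuration the class means differ only along the second axis while the first axis carries most of the variance, so a scatter-based criterion can be misled, while minimizing the error directly drives the projection toward the discriminating direction (about pi/2 here).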
Appears in Collections:Sonderforschungsbereich (SFB) 475

Files in This Item:
File | Description | Size | Format
98_12.pdf | DNB | 242.78 kB | Adobe PDF
tr12-98.ps | | 301.07 kB | Postscript


This item is protected by original copyright