Optimal vs. Classical Linear Dimension Reduction

Date

1998

Publisher

Universitätsbibliothek Dortmund

Abstract

We describe a computer-intensive method for linear dimension reduction that minimizes the classification error directly. Simulated annealing (Bohachevsky et al. (1986)) is used to solve this optimization problem, and the classification error is determined by exact integration. We thereby avoid distance or scatter measures, which are only surrogates used to sidestep the classification error. Simulations (in two dimensions) and analytical approximations demonstrate the superiority of optimal classification over the classical procedures. We compare our procedure to the well-known canonical discriminant analysis (homoscedastic case) as described in McLachlan (1992) and to a method by Young et al. (1987) for the heteroscedastic case. Special emphasis is put on cases where the distance-based methods collapse. The computer-intensive algorithm always achieves the minimal classification error.
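
The following is a minimal sketch of the idea outlined in the abstract: a one-dimensional projection of two-dimensional data is chosen by simulated annealing so that the classification error after projection is as small as possible. It is not the authors' implementation; in particular, the error here is estimated empirically on simulated data with a simple 1-D Gaussian classifier rather than by the exact integration the paper uses, the data, the annealing schedule, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two heteroscedastic Gaussian classes in two dimensions (illustrative data,
# not taken from the paper).
n = 500
X0 = rng.multivariate_normal([0, 0], [[1.0, 0.0], [0.0, 0.2]], n)
X1 = rng.multivariate_normal([0, 2], [[1.0, 0.0], [0.0, 0.2]], n)
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n)

def projected_error(theta):
    """Empirical misclassification rate after projecting onto angle theta."""
    w = np.array([np.cos(theta), np.sin(theta)])
    z = X @ w
    # 1-D Gaussian classifier per class (allows unequal variances); this
    # stands in for the exact error integration described in the abstract.
    m0, s0 = z[y == 0].mean(), z[y == 0].std() + 1e-12
    m1, s1 = z[y == 1].mean(), z[y == 1].std() + 1e-12
    ll0 = -np.log(s0) - 0.5 * ((z - m0) / s0) ** 2
    ll1 = -np.log(s1) - 0.5 * ((z - m1) / s1) ** 2
    return np.mean((ll1 > ll0) != y)

def anneal(n_iter=2000, temp0=0.1, step=0.3):
    """Generic simulated annealing over the projection angle."""
    theta = rng.uniform(0, np.pi)
    err = projected_error(theta)
    best_theta, best_err = theta, err
    for i in range(n_iter):
        temp = temp0 * (1 - i / n_iter) + 1e-6
        cand = theta + rng.normal(scale=step)
        cand_err = projected_error(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cand_err < err or rng.random() < np.exp((err - cand_err) / temp):
            theta, err = cand, cand_err
        if err < best_err:
            best_theta, best_err = theta, err
    return best_theta, best_err

theta_opt, err_opt = anneal()
print(f"annealed direction: {theta_opt:.3f} rad, error: {err_opt:.3%}")
```

In the same spirit as the paper's comparison, the error of this annealed direction could be set against the canonical discriminant (LDA) direction computed on the same data; the point of the abstract is that optimizing the classification error directly can only do at least as well as such distance-based surrogates.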
