Statements in which this resource appears as the subject.
Predicate                      Object
rdf:type
lifeskim:mentions
pubmed:issue                   1
pubmed:dateCreated             2006-3-10
pubmed:abstractText            In pattern recognition, feature extraction techniques are widely employed to reduce the dimensionality of data and to enhance the discriminatory information. Principal component analysis (PCA) and linear discriminant analysis (LDA) are the two most popular linear dimensionality reduction methods. However, PCA is not very effective for the extraction of the most discriminant features, and LDA is not stable due to the small sample size problem. In this paper, we propose some new (linear and nonlinear) feature extractors based on maximum margin criterion (MMC). Geometrically, feature extractors based on MMC maximize the (average) margin between classes after dimensionality reduction. It is shown that MMC can represent class separability better than PCA. As a connection to LDA, we may also derive LDA from MMC by incorporating some constraints. By using some other constraints, we establish a new linear feature extractor that does not suffer from the small sample size problem, which is known to cause serious stability problems for LDA. The kernelized (nonlinear) counterpart of this linear feature extractor is also established in the paper. Our extensive experiments demonstrate that the new feature extractors are effective, stable, and efficient.
pubmed:commentsCorrections
pubmed:language                eng
pubmed:journal
pubmed:citationSubset          IM
pubmed:status                  MEDLINE
pubmed:month                   Jan
pubmed:issn                    1045-9227
pubmed:author
pubmed:issnType                Print
pubmed:volume                  17
pubmed:owner                   NLM
pubmed:authorsComplete         Y
pubmed:pagination              157-65
pubmed:dateRevised             2008-1-4
pubmed:meshHeading
pubmed:year                    2006
pubmed:articleTitle            Efficient and robust feature extraction by maximum margin criterion.
pubmed:affiliation             Department of Computer Science and Engineering, University of California, Riverside, CA 92521, USA. hli@cs.ucr.edu
pubmed:publicationType         Journal Article, Research Support, U.S. Gov't, Non-P.H.S.
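The abstract describes a linear feature extractor based on the maximum margin criterion: find a projection that maximizes the margin between classes without inverting the within-class scatter matrix, so the small-sample-size problem that destabilizes LDA does not arise. Below is a minimal sketch assuming the standard MMC formulation, maximizing tr(Wᵀ(S_b − S_w)W) by taking the top eigenvectors of S_b − S_w; the function name `mmc_fit` and the toy data are illustrative, not from the paper.

```python
import numpy as np

def mmc_fit(X, y, n_components):
    """Linear MMC feature extraction (sketch).

    Computes the between-class scatter S_b and within-class scatter S_w,
    then returns the n_components eigenvectors of S_b - S_w with the
    largest eigenvalues. No matrix inversion is required, unlike LDA.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T          # between-class scatter
        Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
    # S_b - S_w is symmetric, so eigh applies; eigenvalues come back ascending.
    vals, vecs = np.linalg.eigh(Sb - Sw)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order]                      # projection matrix W, shape (d, n_components)

# Toy usage: two well-separated Gaussian classes in 5-D, projected to 1-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
W = mmc_fit(X, y, 1)
Z = X @ W  # projected features; class means should be far apart along W
```

The kernelized variant mentioned in the abstract would replace X with kernel evaluations, but the linear solver above captures the core idea: class separability is measured by margin, so stability does not depend on S_w being invertible.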