Kernel discriminant analysis for positive definite and indefinite kernels

Detailed Description

Kernel methods are a class of well-established and successful algorithms for pattern analysis thanks to their mathematical elegance and good performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed so far based on the so-called kernel trick. The objective of this paper is twofold. First, we derive an additional kernel tool that is still missing, namely the kernel quadratic discriminant (KQD). We discuss different formulations of KQD based on the regularized kernel Mahalanobis distance in both complete and class-related subspaces. Second, we propose suitable extensions of kernel linear and quadratic discriminants to indefinite kernels. We provide classifiers that are applicable to kernels defined by any symmetric similarity measure. This is important in practice because problem-suited proximity measures often violate the requirement of positive definiteness. As in the traditional case, KQD can be advantageous for data with unequal class spreads in the kernel-induced spaces, which cannot be well separated by a linear discriminant. We illustrate this on artificial and real data for both positive definite and indefinite kernels.
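
The following is a minimal sketch of the regularized kernel Mahalanobis distance idea described above, assuming a positive definite Gaussian RBF kernel. It shows one plausible realization of a kernel quadratic discriminant (a per-class kernel-PCA embedding followed by a regularized Mahalanobis distance and a QDA-style score); it is not the authors' exact formulation, and the names rbf_kernel, KernelQDASketch, gamma, and reg are illustrative.

import numpy as np


def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B.
    sq = (A ** 2).sum(axis=1)[:, None] + (B ** 2).sum(axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)


class KernelQDASketch:
    # Illustrative kernel QDA: per-class kernel PCA plus a regularized
    # Mahalanobis distance to each class mean in the kernel-induced space.

    def __init__(self, gamma=1.0, reg=1e-3):
        self.gamma = gamma
        self.reg = reg  # regularizer added to the per-component variances

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_, self.priors_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            n = len(Xc)
            K = rbf_kernel(Xc, Xc, self.gamma)
            H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
            lam, U = np.linalg.eigh(H @ K @ H)       # class-centered kernel matrix
            keep = lam > 1e-10                       # retain the class-related subspace
            lam, U = lam[keep], U[:, keep]
            # variance of the class along kernel-PCA component j is lam[j] / n
            self.models_[c] = (Xc, H, K, U, lam, n)
            self.priors_[c] = n / len(X)
        return self

    def _score(self, x, c):
        Xc, H, K, U, lam, n = self.models_[c]
        kx = rbf_kernel(x[None, :], Xc, self.gamma).ravel()
        kx_c = H @ (kx - K.mean(axis=1))             # center consistently with H K H
        z = (U.T @ kx_c) / np.sqrt(lam)              # kernel-PCA coordinates of x
        var = lam / n + self.reg                     # regularized per-component variances
        maha2 = np.sum(z ** 2 / var)                 # regularized kernel Mahalanobis distance
        return -0.5 * maha2 - 0.5 * np.sum(np.log(var)) + np.log(self.priors_[c])

    def predict(self, X):
        scores = np.array([[self._score(x, c) for c in self.classes_] for x in X])
        return self.classes_[np.argmax(scores, axis=1)]

Usage would look like KernelQDASketch(gamma=0.5, reg=1e-3).fit(X_train, y_train).predict(X_test) for NumPy arrays. The indefinite-kernel extensions discussed in the description are not covered here, since this sketch requires a positive definite kernel.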

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 31(2009), 6, 12 June, pages 1017-32
First Author: Pekalska, Elzbieta (Author)
Other Authors: Haasdonk, Bernard
Format: Online article
Language: English
Published: 2009
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article, Research Support, Non-U.S. Gov't
LEADER 01000naa a22002652 4500
001 NLM18787378X
003 DE-627
005 20231223180917.0
007 cr uuu---uuuuu
008 231223s2009 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2008.290  |2 doi 
028 5 2 |a pubmed24n0626.xml 
035 |a (DE-627)NLM18787378X 
035 |a (NLM)19372607 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Pekalska, Elzbieta  |e verfasserin  |4 aut 
245 1 0 |a Kernel discriminant analysis for positive definite and indefinite kernels 
264 1 |c 2009 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 07.07.2009 
500 |a Date Revised 17.04.2009 
500 |a published: Print 
500 |a Citation Status MEDLINE 
520 |a Kernel methods are a class of well-established and successful algorithms for pattern analysis thanks to their mathematical elegance and good performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed so far based on the so-called kernel trick. The objective of this paper is twofold. First, we derive an additional kernel tool that is still missing, namely the kernel quadratic discriminant (KQD). We discuss different formulations of KQD based on the regularized kernel Mahalanobis distance in both complete and class-related subspaces. Second, we propose suitable extensions of kernel linear and quadratic discriminants to indefinite kernels. We provide classifiers that are applicable to kernels defined by any symmetric similarity measure. This is important in practice because problem-suited proximity measures often violate the requirement of positive definiteness. As in the traditional case, KQD can be advantageous for data with unequal class spreads in the kernel-induced spaces, which cannot be well separated by a linear discriminant. We illustrate this on artificial and real data for both positive definite and indefinite kernels.
650 4 |a Journal Article 
650 4 |a Research Support, Non-U.S. Gov't 
700 1 |a Haasdonk, Bernard  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 31(2009), 6 vom: 12. Juni, Seite 1017-32  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:31  |g year:2009  |g number:6  |g day:12  |g month:06  |g pages:1017-32 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2008.290  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 31  |j 2009  |e 6  |b 12  |c 06  |h 1017-32