LEADER |
01000caa a22002652c 4500 |
001 |
NLM334114810 |
003 |
DE-627 |
005 |
20250302183227.0 |
007 |
cr uuu---uuuuu |
008 |
231225s2022 xx |||||o 00| ||eng c |
024 |
7 |
|
|a 10.1109/TPAMI.2021.3133351
|2 doi
|
028 |
5 |
2 |
|a pubmed25n1113.xml
|
035 |
|
|
|a (DE-627)NLM334114810
|
035 |
|
|
|a (NLM)34874851
|
040 |
|
|
|a DE-627
|b ger
|c DE-627
|e rakwb
|
041 |
|
|
|a eng
|
100 |
1 |
|
|a Wang, Jingyu
|e verfasserin
|4 aut
|
245 |
1 |
0 |
|a Ratio Sum Versus Sum Ratio for Linear Discriminant Analysis
|
264 |
|
1 |
|c 2022
|
336 |
|
|
|a Text
|b txt
|2 rdacontent
|
337 |
|
|
|a Computermedien
|b c
|2 rdamedia
|
338 |
|
|
|a Online-Ressource
|b cr
|2 rdacarrier
|
500 |
|
|
|a Date Revised 08.11.2022
|
500 |
|
|
|a published: Print-Electronic
|
500 |
|
|
|a Citation Status PubMed-not-MEDLINE
|
520 |
|
|
|a Dimension reduction is a critical technology for high-dimensional data processing, where Linear Discriminant Analysis (LDA) and its variants are effective supervised methods. However, LDA prefers features with smaller variance, which causes features with weak discriminative ability to be retained. In this paper, we propose a novel Ratio Sum formulation for Linear Discriminant Analysis (RSLDA), which aims to maximize the discriminative ability of each feature in the subspace. Specifically, it maximizes the sum of the ratios of the between-class distance to the within-class distance in each dimension of the subspace. Since a closed-form solution of the original RSLDA problem is difficult to obtain, an equivalent problem is developed that can be solved by an alternating optimization algorithm. To solve the equivalent problem, it is split into two sub-problems: one can be solved directly, while the other is transformed into a convex optimization problem in which singular value decomposition is employed instead of matrix inversion. Consequently, the performance of the algorithm is not affected by the singularity of the covariance matrix. Furthermore, Kernel RSLDA (KRSLDA) is presented to improve the robustness of RSLDA. Additionally, the time complexities of RSLDA and KRSLDA are analyzed. Extensive experiments show that RSLDA and KRSLDA outperform other comparison methods on toy datasets and multiple public datasets
|
650 |
|
4 |
|a Journal Article
|
700 |
1 |
|
|a Wang, Hongmei
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Nie, Feiping
|e verfasserin
|4 aut
|
700 |
1 |
|
|a Li, Xuelong
|e verfasserin
|4 aut
|
773 |
0 |
8 |
|i Enthalten in
|t IEEE transactions on pattern analysis and machine intelligence
|d 1979
|g 44(2022), 12 vom: 15. Dez., Seite 10171-10185
|w (DE-627)NLM098212257
|x 1939-3539
|7 nnas
|
773 |
1 |
8 |
|g volume:44
|g year:2022
|g number:12
|g day:15
|g month:12
|g pages:10171-10185
|
856 |
4 |
0 |
|u http://dx.doi.org/10.1109/TPAMI.2021.3133351
|3 Volltext
|
912 |
|
|
|a GBV_USEFLAG_A
|
912 |
|
|
|a SYSFLAG_A
|
912 |
|
|
|a GBV_NLM
|
912 |
|
|
|a GBV_ILN_350
|
951 |
|
|
|a AR
|
952 |
|
|
|d 44
|j 2022
|e 12
|b 15
|c 12
|h 10171-10185
|