Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy

Detailed Description

Feature selection is an important problem for pattern classification systems. We study how to select good features according to the maximal statistical dependency criterion based on mutual information. Because of the difficulty in directly implementing the maximal dependency condition, we first derive an equivalent form, called minimal-redundancy-maximal-relevance criterion (mRMR), for first-order incremental feature selection. Then, we present a two-stage feature selection algorithm by combining mRMR and other more sophisticated feature selectors (e.g., wrappers). This allows us to select a compact set of superior features at very low cost. We perform extensive experimental comparison of our algorithm and other methods using three different classifiers (naive Bayes, support vector machine, and linear discriminant analysis) and four different data sets (handwritten digits, arrhythmia, NCI cancer cell lines, and lymphoma tissues). The results confirm that mRMR leads to promising improvement on feature selection and classification accuracy.
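The first-order incremental mRMR search described in the abstract can be illustrated with a short sketch. The following Python snippet is only an illustrative approximation, not the authors' implementation: it uses the difference ("relevance minus redundancy") form of the criterion, assumes discrete-valued features, and estimates mutual information with scikit-learn's mutual_info_score; the function name mrmr_select and the synthetic data are hypothetical.

import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, k):
    """Greedily pick k feature indices from a discrete matrix X (n_samples x n_features)."""
    n_features = X.shape[1]
    # Relevance: mutual information between each feature and the class label y.
    relevance = np.array([mutual_info_score(X[:, j], y) for j in range(n_features)])

    selected = [int(np.argmax(relevance))]            # start with the most relevant feature
    candidates = set(range(n_features)) - set(selected)

    while len(selected) < k and candidates:
        best_j, best_score = None, -np.inf
        for j in candidates:
            # Redundancy: average MI between candidate j and the features chosen so far.
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, s]) for s in selected])
            score = relevance[j] - redundancy         # incremental mRMR score (difference form)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        candidates.remove(best_j)
    return selected

# Tiny synthetic example (illustrative only): the label depends on features 0 and 3.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 10))
y = (X[:, 0] + X[:, 3] > 3).astype(int)
print(mrmr_select(X, y, k=3))

In the paper's two-stage scheme, a compact candidate set produced by such a first-stage search is then refined by a more expensive selector, e.g. a wrapper around the target classifier.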

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1998. - 27(2005), 8, dated 07 Aug., pages 1226-38
Main author: Peng, Hanchuan (Author)
Other authors: Long, Fuhui; Ding, Chris
Format: Article
Language: English
Published: 2005
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Comparative Study; Evaluation Study; Journal Article
LEADER 01000caa a22002652 4500
001 NLM157373975
003 DE-627
005 20250206154103.0
007 tu
008 231223s2005 xx ||||| 00| ||eng c
028 5 2 |a pubmed25n0525.xml 
035 |a (DE-627)NLM157373975 
035 |a (NLM)16119262 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Peng, Hanchuan  |e verfasserin  |4 aut 
245 1 0 |a Feature selection based on mutual information  |b criteria of max-dependency, max-relevance, and min-redundancy 
264 1 |c 2005 
336 |a Text  |b txt  |2 rdacontent 
337 |a ohne Hilfsmittel zu benutzen  |b n  |2 rdamedia 
338 |a Band  |b nc  |2 rdacarrier 
500 |a Date Completed 16.09.2005 
500 |a Date Revised 08.04.2022 
500 |a published: Print 
500 |a Citation Status MEDLINE 
520 |a Feature selection is an important problem for pattern classification systems. We study how to select good features according to the maximal statistical dependency criterion based on mutual information. Because of the difficulty in directly implementing the maximal dependency condition, we first derive an equivalent form, called minimal-redundancy-maximal-relevance criterion (mRMR), for first-order incremental feature selection. Then, we present a two-stage feature selection algorithm by combining mRMR and other more sophisticated feature selectors (e.g., wrappers). This allows us to select a compact set of superior features at very low cost. We perform extensive experimental comparison of our algorithm and other methods using three different classifiers (naive Bayes, support vector machine, and linear discriminant analysis) and four different data sets (handwritten digits, arrhythmia, NCI cancer cell lines, and lymphoma tissues). The results confirm that mRMR leads to promising improvement on feature selection and classification accuracy.
650 4 |a Comparative Study 
650 4 |a Evaluation Study 
650 4 |a Journal Article 
700 1 |a Long, Fuhui  |e verfasserin  |4 aut 
700 1 |a Ding, Chris  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1998  |g 27(2005), 8 vom: 07. Aug., Seite 1226-38  |w (DE-627)NLM098212257  |x 0162-8828  |7 nnns 
773 1 8 |g volume:27  |g year:2005  |g number:8  |g day:07  |g month:08  |g pages:1226-38 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 27  |j 2005  |e 8  |b 07  |c 08  |h 1226-38