Improved Random Forest for Classification
Published in: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society (1992-). Vol. 27 (2018), No. 8, 9 Aug. 2018, pp. 4012-4024
Format: Online article
Language: English
Published: 2018
Parent work: IEEE Transactions on Image Processing : a publication of the IEEE Signal Processing Society
Keywords: Journal Article
Abstract: We propose an improved random forest classifier that performs classification with a minimum number of trees. The proposed method iteratively removes unimportant features. Based on the numbers of important and unimportant features, we formulate a novel theoretical upper limit on the number of trees to be added to the forest to ensure improvement in classification accuracy. Our algorithm converges with a reduced but important set of features. We prove that further addition of trees or further reduction of features does not improve classification performance. The efficacy of the proposed approach is demonstrated through experiments on benchmark datasets. We further use the proposed classifier to detect mitotic nuclei in histopathological datasets of breast tissue. We also apply our method to an industrial dataset of dual-phase steel microstructures to classify different phases. Results on different datasets show a significant reduction in average classification error compared to a number of competing methods.
Description: Date Revised 20.11.2019; published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2018.2834830
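
The abstract describes an iterative scheme that alternates between growing the forest and discarding low-importance features until neither adding trees nor further pruning improves accuracy. The sketch below illustrates that general idea only; it is not the authors' algorithm. The importance threshold (above-mean importance), the fixed tree increment, the round cap, and the stopping rules are illustrative assumptions, and the paper's derived upper bound on the number of trees to add is not reproduced here.

```python
# Minimal sketch of an iterative "prune features, grow forest" loop in the spirit of
# the abstract. NOT the authors' method: the importance threshold, tree increment,
# and stopping rules below are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

features = np.arange(X.shape[1])   # indices of currently retained features
n_trees = 20                       # initial forest size (assumed)
best_acc, best_features = 0.0, features

for _ in range(10):                # cap on refinement rounds (assumed)
    forest = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    forest.fit(X_train[:, features], y_train)
    acc = forest.score(X_test[:, features], y_test)

    if acc <= best_acc:
        break                      # adding trees / pruning stopped helping
    best_acc, best_features = acc, features

    # Illustrative pruning rule: keep features whose importance is above the mean.
    # (The paper derives its own criterion and a bound on how many trees to add.)
    importances = forest.feature_importances_
    keep = importances >= importances.mean()
    if keep.all() or keep.sum() < 2:
        break                      # nothing left to prune
    features = features[keep]
    n_trees += 10                  # grow the forest by a fixed increment (assumed)

print(f"retained {len(best_features)} features, accuracy {best_acc:.3f}")
```

Here scikit-learn's `feature_importances_` (mean decrease in impurity) stands in as a convenient importance measure; the paper's own feature-importance criterion and tree-count bound should be consulted for the actual method.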