Fast Learning Rates for Plug-In Classifiers


Bibliographic Details
Published in: The Annals of Statistics. - Institute of Mathematical Statistics. - 35 (2007), 2, pages 608-633
Main Author: Audibert, Jean-Yves (author)
Other Authors: Tsybakov, Alexandre B.
Format: Online Article
Language: English
Published: 2007
Access to parent work: The Annals of Statistics
Subjects: Classification, Statistical learning, Fast rates of convergence, Excess risk, Plug-in classifiers, Minimax lower bounds, Mathematics, Behavioral sciences, Physical sciences, Applied sciences
Description
Summary: It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
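For readers unfamiliar with the term, a plug-in classifier estimates the regression function $\eta(x) = \mathrm{P}(Y = 1 \mid X = x)$ nonparametrically and then "plugs" the estimate into the Bayes rule, predicting 1 whenever the estimate is at least $1/2$. The following minimal sketch uses a Gaussian Nadaraya-Watson smoother for the estimation step; the paper's construction is based on local polynomial estimators, so this is only an illustrative stand-in, and all names here are the author's own.

```python
import numpy as np

def plug_in_classifier(X_train, y_train, X_test, bandwidth=0.2):
    """Estimate eta(x) = P(Y = 1 | X = x) with a kernel smoother,
    then plug the estimate into the Bayes rule 1{eta(x) >= 1/2}."""
    # Pairwise squared distances between test and training points
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))      # Gaussian kernel weights
    eta_hat = (w * y_train).sum(1) / w.sum(1)   # Nadaraya-Watson estimate
    return (eta_hat >= 0.5).astype(int)         # plug-in Bayes rule

# Toy usage: two well-separated classes on the real line
rng = np.random.default_rng(0)
X0 = rng.normal(-1.0, 0.3, size=(50, 1))
X1 = rng.normal(+1.0, 0.3, size=(50, 1))
X_train = np.vstack([X0, X1])
y_train = np.r_[np.zeros(50), np.ones(50)]
preds = plug_in_classifier(X_train, y_train, np.array([[-1.0], [1.0]]))
```

With the classes this well separated, the estimated regression function is close to 0 near $-1$ and close to 1 near $+1$, so the plug-in rule reproduces the Bayes labels at those points.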
ISSN: 0090-5364