Gentle Nearest Neighbors Boosting over Proper Scoring Rules

Detailed Description

Tailoring nearest neighbors algorithms to boosting is an important problem. Recent papers study an approach, UNN, which provably minimizes particular convex surrogates under weak assumptions. However, numerical issues make it necessary to experimentally tweak parts of the UNN algorithm, at the possible expense of the algorithm's convergence and performance. In this paper, we propose a lightweight Newton-Raphson alternative optimizing proper scoring rules from a very broad set, and establish formal convergence rates under the boosting framework that compete with those known for UNN. To the best of our knowledge, no such boosting-compliant convergence rates were previously known in the popular Gentle AdaBoost's lineage. We provide experiments on a dozen domains, including Caltech and SUN computer vision databases, comparing our approach to major families including support vector machines, (Ada)boosting and stochastic gradient descent. They support three major conclusions: (i) GNNB significantly outperforms UNN, in terms of convergence rate and quality of the outputs, (ii) GNNB performs on par with or better than computationally intensive large margin approaches, (iii) on large domains that rule out those latter approaches for computational reasons, GNNB provides a simple and competitive contender to stochastic gradient descent. Experiments include a divide-and-conquer improvement of GNNB exploiting the link with proper scoring rules optimization.
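The abstract's core idea is a leveraging step computed in closed form by one Newton-Raphson update per nearest-neighbor prototype, driven by a proper-scoring-rule surrogate. As a rough illustration only, the sketch below shows what one such gentle step can look like; the function names (nn_votes, gentle_nn_step), the choice of the logistic surrogate, and the simplified neighbor-voting rule are assumptions for exposition, not the paper's actual algorithm.

    import numpy as np

    def nn_votes(X, labels, j, k=5):
        """Votes h of prototype j: its label on the k examples nearest
        to it, 0 elsewhere (a simplification of k-NN voting)."""
        h = np.zeros(len(X))
        nearest = np.argsort(np.linalg.norm(X - X[j], axis=1))[:k]
        h[nearest] = labels[j]
        return h

    def gentle_nn_step(y, F, h):
        """One Newton-Raphson leveraging step on the logistic surrogate.
        y: labels in {-1, +1}; F: current ensemble scores; h: weak-rule votes.
        Returns the leveraging coefficient alpha and the updated scores."""
        w = 1.0 / (1.0 + np.exp(y * F))       # boosting weights at alpha = 0
        grad = -np.sum(y * h * w)             # dL/d(alpha) of the logistic loss
        hess = np.sum(h * h * w * (1.0 - w))  # d2L/d(alpha)^2, nonnegative
        alpha = -grad / max(hess, 1e-12)      # closed-form step, no line search
        return alpha, F + alpha * h

    # Toy usage: leverage 50 random prototypes on a 2-D problem.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] > 0, 1.0, -1.0)
    F = np.zeros(200)
    for j in rng.choice(200, size=50, replace=False):
        alpha, F = gentle_nn_step(y, F, nn_votes(X, y, j))
    print("training accuracy:", np.mean(np.sign(F) == y))

In this reading, the single bounded Newton step is what makes the update "gentle": alpha stays controlled even when the weights degenerate, which is the kind of numerical fragility the abstract attributes to UNN's tuned updates.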

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 37(2015), 1, 01 Jan., pages 80-93
Main author: Nock, Richard (author)
Other authors: Ali, Wafa Bel Haj, D'Ambrosio, Roberto, Nielsen, Frank, Barlaud, Michel
Format: Online article
Language: English
Published: 2015
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Journal Article
LEADER 01000naa a22002652 4500
001 NLM252590864
003 DE-627
005 20231224164432.0
007 cr uuu---uuuuu
008 231224s2015 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2014.2307877  |2 doi 
028 5 2 |a pubmed24n0842.xml 
035 |a (DE-627)NLM252590864 
035 |a (NLM)26353210 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Nock, Richard  |e verfasserin  |4 aut 
245 1 0 |a Gentle Nearest Neighbors Boosting over Proper Scoring Rules 
264 1 |c 2015 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 24.11.2015 
500 |a Date Revised 10.09.2015 
500 |a published: Print 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a Tailoring nearest neighbors algorithms to boosting is an important problem. Recent papers study an approach, UNN, which provably minimizes particular convex surrogates under weak assumptions. However, numerical issues make it necessary to experimentally tweak parts of the UNN algorithm, at the possible expense of the algorithm's convergence and performance. In this paper, we propose a lightweight Newton-Raphson alternative optimizing proper scoring rules from a very broad set, and establish formal convergence rates under the boosting framework that compete with those known for UNN. To the best of our knowledge, no such boosting-compliant convergence rates were previously known in the popular Gentle AdaBoost's lineage. We provide experiments on a dozen domains, including Caltech and SUN computer vision databases, comparing our approach to major families including support vector machines, (Ada)boosting and stochastic gradient descent. They support three major conclusions: (i) GNNB significantly outperforms UNN, in terms of convergence rate and quality of the outputs, (ii) GNNB performs on par with or better than computationally intensive large margin approaches, (iii) on large domains that rule out those latter approaches for computational reasons, GNNB provides a simple and competitive contender to stochastic gradient descent. Experiments include a divide-and-conquer improvement of GNNB exploiting the link with proper scoring rules optimization.
650 4 |a Journal Article 
700 1 |a Ali, Wafa Bel Haj  |e verfasserin  |4 aut 
700 1 |a D'Ambrosio, Roberto  |e verfasserin  |4 aut 
700 1 |a Nielsen, Frank  |e verfasserin  |4 aut 
700 1 |a Barlaud, Michel  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 37(2015), 1 vom: 01. Jan., Seite 80-93  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:37  |g year:2015  |g number:1  |g day:01  |g month:01  |g pages:80-93 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2014.2307877  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 37  |j 2015  |e 1  |b 01  |c 01  |h 80-93