On weighting clustering

Recent papers and patents in iterative unsupervised learning have emphasized a new trend in clustering. It basically consists of penalizing solutions via weights on the instance points, somehow making clustering move toward the hardest points to cluster. The motivations come principally from an analogy with powerful supervised classification methods known as boosting algorithms. However, interest in this analogy has so far been mainly borne out from experimental studies only. This paper is, to the best of our knowledge, the first attempt at its formalization. More precisely, we handle clustering as a constrained minimization of a Bregman divergence. Weight modifications rely on the local variations of the expected complete log-likelihoods. Theoretical results show benefits resembling those of boosting algorithms and bring modified (weighted) versions of clustering algorithms such as k-means, fuzzy c-means, Expectation Maximization (EM), and k-harmonic means. Experiments are provided for all these algorithms, with a readily available code. They display the advantages that subtle data reweighting may bring to clustering.
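The reweighting idea described in the abstract can be illustrated with a small sketch: a k-means loop in which centroids are weighted means of their members, and point weights are increased on the points with the largest distortion, so the clustering is nudged toward the hardest-to-cluster points. This is an illustrative boosting-style toy, not the authors' exact algorithm; the exponential reweighting rule and the function name `weighted_kmeans` are assumptions for the example.

```python
import numpy as np

def weighted_kmeans(X, k, n_iter=20, seed=0):
    """Toy weighted k-means sketch (not the paper's exact scheme):
    centroids are weighted means, and point weights grow with each
    point's distortion, emphasizing the hardest points to cluster."""
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)                       # uniform initial weights
    centers = X[rng.choice(n, size=k, replace=False)].astype(float)
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # assignment step: nearest centroid under squared Euclidean distance
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # update step: each centroid is the weighted mean of its members
        for j in range(k):
            mask = labels == j
            if mask.any():
                centers[j] = np.average(X[mask], axis=0, weights=w[mask])
        # reweighting step: larger distortion -> larger weight (boosting-style)
        dist = d2[np.arange(n), labels]
        w = np.exp(dist - dist.max())             # shift for numerical stability
        w /= w.sum()                              # keep weights a distribution
    return centers, labels, w
```

A plain k-means is recovered by keeping `w` uniform; the only change the weighting introduces is in the centroid update, which is the common pattern shared by the weighted k-means, fuzzy c-means, EM, and k-harmonic means variants the abstract mentions.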

Detailed Description

Bibliographic Details
Published in: IEEE transactions on pattern analysis and machine intelligence. - 1998. - 28(2006), no. 8, 15 Aug., pages 1223-35
Main Author: Nock, Richard (Author)
Other Authors: Nielsen, Frank
Format: Article
Language: English
Published: 2006
Access to parent work: IEEE transactions on pattern analysis and machine intelligence
Subjects: Comparative Study, Journal Article
LEADER 01000caa a22002652 4500
001 NLM164568301
003 DE-627
005 20250207124410.0
007 tu
008 231223s2006 xx ||||| 00| ||eng c
028 5 2 |a pubmed25n0549.xml 
035 |a (DE-627)NLM164568301 
035 |a (NLM)16886859 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Nock, Richard  |e verfasserin  |4 aut 
245 1 0 |a On weighting clustering 
264 1 |c 2006 
336 |a Text  |b txt  |2 rdacontent 
337 |a ohne Hilfsmittel zu benutzen  |b n  |2 rdamedia 
338 |a Band  |b nc  |2 rdacarrier 
500 |a Date Completed 05.09.2006 
500 |a Date Revised 15.11.2006 
500 |a published: Print 
500 |a Citation Status MEDLINE 
520 |a Recent papers and patents in iterative unsupervised learning have emphasized a new trend in clustering. It basically consists of penalizing solutions via weights on the instance points, somehow making clustering move toward the hardest points to cluster. The motivations come principally from an analogy with powerful supervised classification methods known as boosting algorithms. However, interest in this analogy has so far been mainly borne out from experimental studies only. This paper is, to the best of our knowledge, the first attempt at its formalization. More precisely, we handle clustering as a constrained minimization of a Bregman divergence. Weight modifications rely on the local variations of the expected complete log-likelihoods. Theoretical results show benefits resembling those of boosting algorithms and bring modified (weighted) versions of clustering algorithms such as k-means, fuzzy c-means, Expectation Maximization (EM), and k-harmonic means. Experiments are provided for all these algorithms, with a readily available code. They display the advantages that subtle data reweighting may bring to clustering 
650 4 |a Comparative Study 
650 4 |a Journal Article 
700 1 |a Nielsen, Frank  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1998  |g 28(2006), 8 vom: 15. Aug., Seite 1223-35  |w (DE-627)NLM098212257  |x 0162-8828  |7 nnns 
773 1 8 |g volume:28  |g year:2006  |g number:8  |g day:15  |g month:08  |g pages:1223-35 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 28  |j 2006  |e 8  |b 15  |c 08  |h 1223-35