Exploring Structural Sparsity of Deep Networks Via Inverse Scale Spaces

The great success of deep neural networks is built upon their over-parameterization, which smooths the optimization landscape without degrading the generalization ability. Despite the benefits of over-parameterization, a huge number of parameters makes deep networks cumbersome in daily life applications. On the other hand, training neural networks without over-parameterization faces many practical problems, e.g., being trapped in local optima. Though techniques such as pruning and distillation have been developed, as backward selection methods they require the expensive full training of a dense network; and there is still a void in systematically exploring forward selection methods for learning structural sparsity in deep networks. To fill this gap, this paper proposes a new approach based on differential inclusions of inverse scale spaces. Specifically, our method can generate a family of models from simple to complex ones along the dynamics by coupling a pair of parameters, such that over-parameterized deep models and their structural sparsity can be explored simultaneously. This differential inclusion scheme has a simple discretization, dubbed Deep structure splitting Linearized Bregman Iteration (DessiLBI), whose global convergence in learning deep networks can be established under the Kurdyka-Łojasiewicz framework. In particular, we explore several applications of DessiLBI, including finding sparse structures of networks directly via the coupled structural parameter and growing networks progressively from simple to complex ones. Experimental evidence shows that our method achieves comparable or even better performance than competitive optimizers in exploring the sparse structure of several widely used backbones on benchmark datasets. Remarkably, with early stopping, our method unveils "winning tickets" in early epochs: effective sparse network structures with test accuracy comparable to fully trained over-parameterized models, which are further transferable to similar alternative tasks. Furthermore, our method is able to grow networks efficiently with adaptive filter configurations, demonstrating good performance at much lower computational cost. Code and models can be downloaded at https://github.com/DessiLBI2020/DessiLBI.
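
To make the coupled dynamics concrete: the discretization named in the abstract follows the linearized Bregman iteration pattern from the inverse-scale-space literature. The following is a hedged sketch under assumed notation (coupling strength \nu, step size \alpha_k, damping \kappa, sparsity penalty \Omega), not a verbatim transcription of the paper's algorithm:

    \bar{L}(W, \Gamma) = L(W) + \frac{1}{2\nu} \|W - \Gamma\|_F^2
    W_{k+1}      = W_k - \kappa\, \alpha_k\, \nabla_W \bar{L}(W_k, \Gamma_k)
    Z_{k+1}      = Z_k - \alpha_k\, \nabla_\Gamma \bar{L}(W_k, \Gamma_k)
    \Gamma_{k+1} = \kappa\, \mathrm{prox}_{\Omega}(Z_{k+1})

Here W holds the dense network weights, \Gamma is the coupled structural parameter whose support marks the selected filters, and \mathrm{prox}_{\Omega} reduces to soft-thresholding when \Omega is an \ell_1 or group-Lasso penalty; \Gamma therefore traces a sparse-to-dense path (the inverse scale space) while W follows the training loss.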


Bibliographic Details

Published in: IEEE transactions on pattern analysis and machine intelligence. - 1979. - 45(2023), issue 2, 15 Feb., pages 1749-1765
First author: Fu, Yanwei (author)
Other authors: Liu, Chen, Li, Donghao, Zhong, Zuyuan, Sun, Xinwei, Zeng, Jinshan, Yao, Yuan
Format: Online article
Language: English
Published: 2023
Access to the parent work: IEEE transactions on pattern analysis and machine intelligence
Keywords: Journal Article
LEADER 01000naa a22002652 4500
001 NLM33981568X
003 DE-627
005 20231226003735.0
007 cr uuu---uuuuu
008 231226s2023 xx |||||o 00| ||eng c
024 7 |a 10.1109/TPAMI.2022.3168881  |2 doi 
028 5 2 |a pubmed24n1132.xml 
035 |a (DE-627)NLM33981568X 
035 |a (NLM)35452384 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Fu, Yanwei  |e verfasserin  |4 aut 
245 1 0 |a Exploring Structural Sparsity of Deep Networks Via Inverse Scale Spaces 
264 1 |c 2023 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Completed 06.04.2023 
500 |a Date Revised 06.04.2023 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a The great success of deep neural networks is built upon their over-parameterization, which smooths the optimization landscape without degrading the generalization ability. Despite the benefits of over-parameterization, a huge number of parameters makes deep networks cumbersome in daily life applications. On the other hand, training neural networks without over-parameterization faces many practical problems, e.g., being trapped in local optima. Though techniques such as pruning and distillation have been developed, as backward selection methods they require the expensive full training of a dense network; and there is still a void in systematically exploring forward selection methods for learning structural sparsity in deep networks. To fill this gap, this paper proposes a new approach based on differential inclusions of inverse scale spaces. Specifically, our method can generate a family of models from simple to complex ones along the dynamics by coupling a pair of parameters, such that over-parameterized deep models and their structural sparsity can be explored simultaneously. This differential inclusion scheme has a simple discretization, dubbed Deep structure splitting Linearized Bregman Iteration (DessiLBI), whose global convergence in learning deep networks can be established under the Kurdyka-Łojasiewicz framework. In particular, we explore several applications of DessiLBI, including finding sparse structures of networks directly via the coupled structural parameter and growing networks progressively from simple to complex ones. Experimental evidence shows that our method achieves comparable or even better performance than competitive optimizers in exploring the sparse structure of several widely used backbones on benchmark datasets. Remarkably, with early stopping, our method unveils "winning tickets" in early epochs: effective sparse network structures with test accuracy comparable to fully trained over-parameterized models, which are further transferable to similar alternative tasks. Furthermore, our method is able to grow networks efficiently with adaptive filter configurations, demonstrating good performance at much lower computational cost. Code and models can be downloaded at https://github.com/DessiLBI2020/DessiLBI 
650 4 |a Journal Article 
700 1 |a Liu, Chen  |e verfasserin  |4 aut 
700 1 |a Li, Donghao  |e verfasserin  |4 aut 
700 1 |a Zhong, Zuyuan  |e verfasserin  |4 aut 
700 1 |a Sun, Xinwei  |e verfasserin  |4 aut 
700 1 |a Zeng, Jinshan  |e verfasserin  |4 aut 
700 1 |a Yao, Yuan  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on pattern analysis and machine intelligence  |d 1979  |g 45(2023), 2 vom: 15. Feb., Seite 1749-1765  |w (DE-627)NLM098212257  |x 1939-3539  |7 nnns 
773 1 8 |g volume:45  |g year:2023  |g number:2  |g day:15  |g month:02  |g pages:1749-1765 
856 4 0 |u http://dx.doi.org/10.1109/TPAMI.2022.3168881  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 45  |j 2023  |e 2  |b 15  |c 02  |h 1749-1765
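
For readers who want to experiment before opening the repository, below is a minimal, self-contained NumPy sketch of one update of the kind the abstract describes. The function names (soft_threshold, dessilbi_step), step sizes, and the l1 proximal step are illustrative assumptions rather than the authors' implementation; the real code is at https://github.com/DessiLBI2020/DessiLBI.

import numpy as np

def soft_threshold(z, lam):
    # Proximal map of lam * ||.||_1: shrink each entry toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def dessilbi_step(W, Gamma, Z, grad_L, alpha=0.05, kappa=1.0, nu=1.0):
    # One linearized-Bregman-style step on the augmented loss
    # L(W) + ||W - Gamma||^2 / (2*nu), coupling dense weights W with a
    # sparse structural parameter Gamma (hypothetical sketch, not the
    # authors' algorithm).
    gW = grad_L(W) + (W - Gamma) / nu       # gradient w.r.t. W
    gGamma = (Gamma - W) / nu               # gradient w.r.t. Gamma
    W = W - kappa * alpha * gW              # descent step on weights
    Z = Z - alpha * gGamma                  # Bregman variable driving Gamma
    Gamma = kappa * soft_threshold(Z, 1.0)  # sparsity via the proximal map
    return W, Gamma, Z

# Toy usage: a quadratic loss whose minimizer is sparse.
target = np.array([3.0, 0.0])
grad_L = lambda W: W - target               # gradient of 0.5*||W - target||^2
W = np.zeros(2); Gamma = np.zeros(2); Z = np.zeros(2)
for _ in range(500):
    W, Gamma, Z = dessilbi_step(W, Gamma, Z, grad_L)
print(W, Gamma)  # Gamma picks up the large coordinate first (inverse scale space)

In this toy run the support of Gamma grows from empty toward the important coordinate before the small one, which is the sparse-to-dense ordering the inverse-scale-space dynamics are designed to produce.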