LEADER 01000naa a22002652c 4500
001    NLM39346153X
003    DE-627
005    20251002232606.0
007    cr uuu---uuuuu
008    251002s2025 xx |||||o 00| ||eng c
024 7_ |a 10.1109/TPAMI.2025.3616249 |2 doi
028 52 |a pubmed25n1587.xml
035 __ |a (DE-627)NLM39346153X
035 __ |a (NLM)41032540
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
100 1_ |a Ye, Peng |e verfasserin |4 aut
245 10 |a $\beta$-DARTS++ |b Bi-level Regularization for Proxy-robust Differentiable Architecture Search
264 _1 |c 2025
336 __ |a Text |b txt |2 rdacontent
337 __ |a Computermedien |b c |2 rdamedia
338 __ |a Online-Ressource |b cr |2 rdacarrier
500 __ |a Date Revised 01.10.2025
500 __ |a published: Print-Electronic
500 __ |a Citation Status Publisher
520 __ |a Neural Architecture Search (NAS) has attracted increasing attention in recent years because of its ability to design neural networks automatically. Differentiable NAS approaches such as DARTS have gained particular popularity for their search efficiency, but they still suffer from three main issues: weak stability due to performance collapse, poor generalization of the searched architectures, and inferior robustness to different kinds of proxies (i.e., computationally reduced search configurations). To address the stability and generalization problems, a simple but effective regularization method, termed Beta-Decay, is proposed to regularize the DARTS-based search process (referred to as $\beta$-DARTS). Specifically, Beta-Decay regularization keeps the value and variance of the activated architecture parameters from growing too large, thereby ensuring fair competition among architecture parameters and making the supernet less sensitive to the influence of the input on the operation set. In-depth theoretical analyses of how and why it works are provided, and comprehensive experiments on a variety of search spaces and datasets validate that Beta-Decay regularization stabilizes the search process and makes the searched networks more transferable across datasets. To address the proxy-robustness problem, we first benchmark differentiable NAS methods under a wide range of proxy data, proxy channels, proxy layers, and proxy epochs, since the robustness of NAS under different kinds of proxies had not been explored before. From this benchmark we draw several interesting findings, among them that $\beta$-DARTS achieves the best result among all compared NAS methods under almost all proxy settings. We further introduce a novel flooding regularization into the weight optimization of $\beta$-DARTS (termed Bi-level regularization) and verify, both experimentally and theoretically, its effectiveness in improving the proxy robustness of differentiable NAS. In summary, our search scheme exhibits many properties that are attractive for practical applications.
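Note: the abstract above describes its two regularizers only in words. The following minimal sketch illustrates one plausible reading, assuming the Beta-Decay term takes the smoothmax (logsumexp) form over each edge's architecture parameters reported for $\beta$-DARTS, and that flooding is the standard |L - b| + b flooding loss; the helper names beta_decay_loss and flooding and the flood level b are illustrative, not taken from this record.

    import torch

    def beta_decay_loss(betas: torch.Tensor) -> torch.Tensor:
        # Assumed smoothmax form: penalizing logsumexp of each edge's
        # operation logits keeps the value and variance of the softmaxed
        # (activated) architecture parameters from growing too large.
        # betas: (num_edges, num_ops) supernet architecture parameters.
        return torch.logsumexp(betas, dim=-1).mean()

    def flooding(loss: torch.Tensor, b: float = 0.05) -> torch.Tensor:
        # Standard flooding loss |L - b| + b: once the weight-level
        # training loss drops below the flood level b, its gradient flips
        # sign, so the loss hovers around b instead of collapsing to zero.
        # The value of b here is an illustrative placeholder.
        return (loss - b).abs() + b

    # Bi-level usage, mirroring "Bi-level regularization" in the title:
    #   architecture step: arch_loss   = val_loss + lam * beta_decay_loss(betas)
    #   weight step:       weight_loss = flooding(train_loss)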
650 _4 |a Journal Article
700 1_ |a He, Tong |e verfasserin |4 aut
700 1_ |a Li, Baopu |e verfasserin |4 aut
700 1_ |a Chen, Tao |e verfasserin |4 aut
700 1_ |a Bai, Lei |e verfasserin |4 aut
700 1_ |a Ouyang, Wanli |e verfasserin |4 aut
773 08 |i Enthalten in |t IEEE transactions on pattern analysis and machine intelligence |d 1979 |g PP(2025) vom: 01. Okt. |w (DE-627)NLM098212257 |x 1939-3539 |7 nnas
773 18 |g volume:PP |g year:2025 |g day:01 |g month:10
856 40 |u http://dx.doi.org/10.1109/TPAMI.2025.3616249 |3 Volltext
912 __ |a GBV_USEFLAG_A
912 __ |a SYSFLAG_A
912 __ |a GBV_NLM
912 __ |a GBV_ILN_350
951 __ |a AR
952 __ |d PP |j 2025 |b 01 |c 10