PWLU: Learning Specialized Activation Functions With the Piecewise Linear Unit

The choice of activation function is crucial to deep neural networks. ReLU is a popular hand-designed activation function. Swish, an automatically searched activation function, outperforms ReLU on many challenging datasets. However, the search method has two main drawbacks. First, the tree-based s...
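The abstract names the Piecewise Linear Unit only in passing, so as orientation, here is a minimal NumPy sketch of a generic learnable piecewise linear activation: y-values at uniform breakpoints are the trainable parameters, with linear interpolation between them and linear extrapolation beyond the boundaries. All function and parameter names here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def piecewise_linear_unit(x, y_points, x_min=-2.0, x_max=2.0):
    """Hypothetical sketch of a piecewise linear activation.

    y_points are learnable outputs at uniform breakpoints in
    [x_min, x_max]; inputs in range are linearly interpolated,
    inputs out of range are extrapolated with the boundary slopes.
    """
    xs = np.linspace(x_min, x_max, len(y_points))  # uniform breakpoints
    y = np.interp(x, xs, y_points)                 # clamps outside range
    # Replace the clamped ends with linear extrapolation.
    step = xs[1] - xs[0]
    left_slope = (y_points[1] - y_points[0]) / step
    right_slope = (y_points[-1] - y_points[-2]) / step
    y = np.where(x < x_min, y_points[0] + left_slope * (x - x_min), y)
    y = np.where(x > x_max, y_points[-1] + right_slope * (x - x_max), y)
    return y

# Initializing the breakpoint outputs to max(0, x) recovers ReLU exactly.
ys = np.maximum(np.linspace(-2.0, 2.0, 9), 0.0)
out = piecewise_linear_unit(np.array([-3.0, -1.0, 0.0, 1.0, 3.0]), ys)
# → [0. 0. 0. 1. 3.]
```

Training would then update `ys` by gradient descent alongside the network weights, letting each layer specialize its activation shape.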


Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - Vol. 45 (2023), No. 10, 01 Oct., pp. 12269-12286
First author: Zhu, Zezhou (Author)
Other authors: Zhou, Yucong, Dong, Yuan, Zhong, Zhao
Format: Online article
Language: English
Published: 2023
Access to parent work: IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords: Journal Article