Relu-based activations: Analysis and experimental study for deep learning
V. Vargas, D. Guijo-Rubio, P. Gutiérrez, C. Hervás-Martínez
Conference of the Spanish Association for Artificial Intelligence, pp. 33-43, 2021

Abstract
Activation functions are used in neural networks as a tool to introduce non-linear transformations into the model and, thus, enhance its representation capabilities. They also determine the output range of the hidden layers and of the final output. Traditionally, artificial neural networks mainly used the sigmoid activation function, as the depth of the networks was limited. Nevertheless, this function tends to saturate the gradients when the number of hidden layers increases. For that reason, in recent years, most of the works published on deep learning and convolutional networks use the Rectified Linear Unit (ReLU), given that it provides good convergence properties and speeds up the training process thanks to the simplicity of its derivative. However, this function has some known drawbacks that gave rise to new proposals of alternative activation functions based on ReLU. In this work, we describe, analyse and compare different recently proposed alternatives, to test whether these functions improve the performance of deep learning models with respect to the standard ReLU.
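The ReLU-based alternatives commonly studied in this line of work include variants such as Leaky ReLU and ELU; as a minimal illustration (not the paper's own experimental code, and with purely illustrative parameter values), a NumPy sketch of ReLU and two of these variants:

```python
import numpy as np

def relu(x):
    # Standard ReLU: zero for negative inputs, identity otherwise.
    # Its derivative is a simple step function, which keeps training cheap.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs, so units
    # never have an exactly-zero gradient ("dying ReLU" problem).
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation towards -alpha for negative inputs.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))
print(leaky_relu(x))
print(elu(x))
```

Note how all three functions coincide with the identity for positive inputs and differ only in how they treat the negative part, which is precisely where the drawbacks of the standard ReLU arise.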
Cite this publication
BibTex
@inproceedings{vargas2021relu-based,
  author    = {Víctor Manuel Vargas and David Guijo-Rubio and Pedro Antonio Gutiérrez and César Hervás-Martínez},
  title     = {Relu-based activations: Analysis and experimental study for deep learning},
  booktitle = {Conference of the Spanish Association for Artificial Intelligence},
  year      = {2021},
  pages     = {33--43},
  doi       = {10.1007/978-3-030-85713-4_4}
}
APA
Vargas, V., Guijo-Rubio, D., Gutiérrez, P., & Hervás-Martínez, C. (2021). Relu-based activations: Analysis and experimental study for deep learning. In Conference of the Spanish Association for Artificial Intelligence (pp. 33-43).
CV
V.M. Vargas, D. Guijo-Rubio (CA), P.A. Gutiérrez, C. Hervás-Martínez, (1/4) "Relu-based activations: Analysis and experimental study for deep learning". Conference of the Spanish Association for Artificial Intelligence, pp. 33-43, 2021.
RIS
TY  - CONF
T1  - Relu-based activations: Analysis and experimental study for deep learning
T2  - Conference of the Spanish Association for Artificial Intelligence
AU  - Vargas, Víctor Manuel
AU  - Guijo-Rubio, David
AU  - Gutiérrez, Pedro Antonio
AU  - Hervás-Martínez, César
JO  - Conference of the Spanish Association for Artificial Intelligence
JA  - Conference of the Spanish Association for Artificial Intelligence
Y1  - 2021
PY  - 2021
SP  - 33
EP  - 43
DO  - 10.1007/978-3-030-85713-4_4
ER  -