Publication Date

7-1-2023

Comments

Technical Report: UTEP-CS-23-32

Abstract

In recent decades, deep learning has led to spectacular successes. One reason for these successes is that deep neural networks use a special Rectified Linear Unit (ReLU) activation function s(x) = max(0, x). Why this activation function is so effective has remained largely a mystery. In this paper, we show that common-sense ideas, as formalized by fuzzy logic, can explain this mysterious effectiveness.
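For concreteness, here is a minimal Python sketch (not taken from the report itself) of the ReLU activation defined in the abstract, shown next to the standard max-based fuzzy-logic "or" operation; the pairing only illustrates the two definitions involved, not the paper's actual argument.

    # Minimal sketch: the ReLU activation from the abstract, s(x) = max(0, x),
    # alongside the standard max-based fuzzy "or" (t-conorm) for comparison.

    def relu(x: float) -> float:
        """Rectified Linear Unit: s(x) = max(0, x)."""
        return max(0.0, x)

    def fuzzy_or(a: float, b: float) -> float:
        """Standard fuzzy-logic "or": max(a, b), for truth degrees in [0, 1]."""
        return max(a, b)

    if __name__ == "__main__":
        print(relu(-1.5), relu(0.0), relu(2.3))  # 0.0 0.0 2.3
        print(fuzzy_or(0.2, 0.7))                # 0.7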
