Publication Date

5-2020

Comments

Technical Report: UTEP-CS-20-42

Abstract

We want computations to be fast, and we want them to be understandable. As we show, the need for computations to be fast naturally leads to neural networks, with 1-layer networks being the fastest, and the need to be understandable naturally leads to fuzzy logic and to the corresponding "and"- and "or"-operations. Since we want our computations to be both fast and understandable, a natural question is: which "and"- and "or"-operations of fuzzy logic can be represented by the fastest (i.e., 1-layer) neural network? A related question is: which activation functions allow such a representation? In this paper, we provide an answer to both questions: the only "and"- and "or"-operations that can be thus represented are max(0, a + b − 1) and min(a + b, 1), and the only activation functions allowing such a representation are equivalent to the rectified linear function used in deep learning. This result provides an additional explanation of why rectified linear neurons are so successful. We also show that with full 2-layer networks, we can compute practically any "and"- and "or"-operation.
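As a quick illustration (a sketch added here, not taken from the report itself), the two operations named in the abstract can be checked against a single rectified linear neuron: max(0, a + b − 1) is one ReLU unit with weights (1, 1) and bias −1, and min(a + b, 1) equals 1 − max(0, 1 − a − b), i.e., one ReLU unit followed by a fixed affine output map. The function names below are hypothetical and chosen only for this example.

```python
def relu(x):
    """Rectified linear activation."""
    return max(0.0, x)

def fuzzy_and(a, b):
    """'and'-operation as one ReLU neuron: weights (1, 1), bias -1."""
    return relu(a + b - 1.0)

def fuzzy_or(a, b):
    """'or'-operation via one ReLU neuron and the fixed map x -> 1 - x."""
    return 1.0 - relu(1.0 - a - b)

if __name__ == "__main__":
    grid = [i / 10 for i in range(11)]
    for a in grid:
        for b in grid:
            assert abs(fuzzy_and(a, b) - max(0.0, a + b - 1.0)) < 1e-12
            assert abs(fuzzy_or(a, b) - min(a + b, 1.0)) < 1e-12
    print("Both operations match on a [0, 1] grid.")
```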
