Publication Date
4-1-2024
Abstract
In this paper, we show that there is a somewhat unexpected common trend behind several seemingly unrelated historic transitions: from Aristotelian physics to the modern (Newtonian) approach, from crisp sets (such as intervals) to fuzzy sets, and from traditional neural networks, with their close-to-step-function sigmoid activation functions, to modern successful deep neural networks that use a completely different ReLU activation function. In all these cases, the main idea of the corresponding transition can be explained, in mathematical terms, as going from first-order to second-order differential equations.
Comments
Technical Report: UTEP-CS-24-13a
To appear in Proceedings of the NAFIPS International Conference on Fuzzy Systems, Soft Computing, and Explainable AI, NAFIPS'2024, South Padre Island, Texas, May 27-29, 2024