Single Hidden Layer Neural Networks With Random Weights Based on Nondifferentiable Functions.
Journal:
IEEE Transactions on Neural Networks and Learning Systems
Published Date:
May 14, 2025
Abstract
Computational algorithms that utilize nondifferentiable functions have proven highly effective in machine learning applications. This study introduces a novel framework for incorporating nondifferentiable functions into the objective functions of random-weight neural networks, specifically random vector functional-link (RVFL) networks and extreme learning machines (ELMs). The framework explores six nondifferentiable functions: the norms $L_{1,1}$, $L_{1,2}$, and $L_{2,2}$; the functions AbsMin and AbsMax; and the seminorm MaxMin. To enhance robustness, random Fourier features are applied as activation functions within these networks. Integrating these nondifferentiable functions into the objective functions of RVFL and ELM aims to reduce computational time in both the training and testing stages without compromising accuracy. We conducted extensive experiments on 12 benchmark datasets, spanning small, medium, and large datasets, to evaluate the proposed algorithms against the $L_{2,1}$-regularized random Fourier feature ELM ($L_{2,1}$-RF-ELM), which uses joint-norm ($L_{r,p}$) regularization as documented in previous studies. Our findings indicate that the algorithms based on nondifferentiable functions not only achieve high accuracy but also significantly reduce computation time compared with the $L_{2,1}$-based algorithm and other standard machine learning approaches.
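As a rough illustration of the family of models the abstract describes, the following is a minimal Python sketch of an ELM-style single-hidden-layer network whose hidden weights are random Fourier features and whose output weights are solved in closed form with a standard ridge ($L_{2,2}$-squared) penalty. This is not the paper's algorithm: the nondifferentiable objectives ($L_{1,1}$, $L_{1,2}$, AbsMin, AbsMax, MaxMin) it studies are not shown here, and all function names, parameters, and data are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
    """Map inputs to random Fourier features (cosines of random projections).

    The projection weights are drawn once at random and never trained,
    as in random-weight networks such as ELM and RVFL.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.cos(X @ W + b)

def fit_output_weights_ridge(H, Y, lam=1e-2):
    """Closed-form output weights for min_beta ||H beta - Y||_F^2 + lam ||beta||_F^2.

    This differentiable ridge objective stands in for the paper's
    nondifferentiable objective functions, which are not reproduced here.
    """
    n_hidden = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ Y)

# Toy usage on synthetic data (illustrative only).
rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 5))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
Y_train = np.eye(2)[y_train]                              # one-hot targets

H_train = random_fourier_features(X_train, n_features=50)
beta = fit_output_weights_ridge(H_train, Y_train)         # trained output weights

X_test = rng.normal(size=(20, 5))
H_test = random_fourier_features(X_test, n_features=50)   # same seed -> same random map
pred = np.argmax(H_test @ beta, axis=1)                   # predicted class labels
```

The design point this sketch reflects is the one the abstract relies on: only the output layer is trained, so learning reduces to a single linear solve over the random feature matrix, which is why changing the regularizer in the objective directly affects training and testing time.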