Convergence analysis of sparse TSK fuzzy systems based on spectral Dai-Yuan conjugate gradient and application to high-dimensional feature selection.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Aug 6, 2024
Abstract
Dealing with high-dimensional problems has long been a key and challenging issue in the field of fuzzy systems. Traditional Takagi-Sugeno-Kang (TSK) fuzzy systems suffer from the curse of dimensionality and high computational complexity when applied to high-dimensional data. To overcome these challenges, this paper proposes a novel approach for optimizing TSK fuzzy systems that integrates the spectral Dai-Yuan conjugate gradient (SDYCG) algorithm with a smoothing group L regularization technique. The smoothing group L regularization introduces sparsity, selects relevant features, and improves the generalization ability of the model, while the SDYCG algorithm accelerates convergence and enhances the learning performance of the network. Furthermore, we prove both the weak convergence and the strong convergence of the new algorithm under the strong Wolfe criterion: the gradient norm of the error function with respect to the weight vector converges to zero, and the weight sequence approaches a fixed point.
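The abstract names the SDYCG update and the strong Wolfe line search but does not spell out their formulas, so the sketch below is only a minimal, hypothetical Python illustration of a spectral Dai-Yuan conjugate gradient loop: the standard Dai-Yuan coefficient combined with an assumed Barzilai-Borwein-style spectral scaling, using SciPy's strong Wolfe line search on a generic differentiable objective rather than the paper's TSK error function. The function name sdycg_minimize and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import line_search

def sdycg_minimize(f, grad_f, x0, max_iter=200, tol=1e-6):
    """Hypothetical sketch of a spectral Dai-Yuan conjugate gradient loop.

    The spectral parameter theta below is an assumed Barzilai-Borwein-style
    ratio; the paper's exact SDYCG formula may differ.
    """
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g                          # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step size satisfying the strong Wolfe conditions (SciPy line search)
        alpha = line_search(f, grad_f, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:           # line search failed; fall back to a tiny step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        # Dai-Yuan coefficient: beta = ||g_new||^2 / (d^T y)
        beta = (g_new @ g_new) / max(d @ y, 1e-12)
        # Spectral scaling of the gradient term (assumed BB-like choice)
        theta = max((s @ s) / max(s @ y, 1e-12), 1e-12)
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple ill-conditioned quadratic test objective
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda w: 0.5 * w @ A @ w
    grad_f = lambda w: A @ w
    w_star = sdycg_minimize(f, grad_f, np.ones(3))
    print(w_star)  # should be close to the zero vector
```

In this sketch the gradient-norm stopping test mirrors the weak-convergence statement in the abstract (the gradient norm driven toward zero), while the iterate sequence x approaching a minimizer corresponds to the strong-convergence claim.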