Complex quantized minimum error entropy with fiducial points: theory and application in model regression.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
PMID:
40068497
Abstract
Minimum error entropy with fiducial points (MEEF) has gained significant attention due to its excellent performance in mitigating the adverse effects of non-Gaussian noise in machine learning and signal processing. However, the original MEEF algorithm suffers from high computational complexity due to the double summation over error samples. The quantized MEEF (QMEEF), proposed by Zheng et al., alleviates this computational burden through strategic quantization, providing a more efficient solution. In this paper, we extend these techniques to the complex domain, introducing the complex QMEEF (CQMEEF). We theoretically establish its fundamental properties and prove its convergence. Furthermore, we apply the method to the training of a range of linear-in-parameters (LIP) models, demonstrating its broad applicability. Experimental results show that CQMEEF achieves high precision in regression tasks on various noise-corrupted datasets, remains effective under unfavorable noise conditions, and surpasses existing methods on key performance metrics. Consequently, CQMEEF not only offers an efficient computational alternative but also opens new avenues for handling complex-valued data in regression tasks.
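To illustrate the computational issue the abstract describes, the sketch below contrasts a direct O(N²) MEEF-style information potential (errors augmented with zero-valued fiducial points) with a quantized approximation that replaces the inner sum with a weighted sum over a small codebook. This is a minimal illustration of the general quantization idea, not the paper's CQMEEF algorithm: the kernel form, fiducial-point weighting, and quantization rule here are assumptions chosen for brevity, and `np.abs` is used so the same code accepts complex-valued errors.

```python
import numpy as np

def gauss(x, sigma):
    # Gaussian kernel G_sigma(x) = exp(-|x|^2 / (2 sigma^2));
    # |.| makes this work for complex-valued arguments too.
    return np.exp(-np.abs(x) ** 2 / (2.0 * sigma ** 2))

def meef_potential(e, sigma=1.0, n_fid=None):
    # Direct information potential over errors augmented with zero
    # fiducial points: a full double sum, hence O(N^2) kernel evaluations.
    # (One common MEEF formulation; details may differ from the paper.)
    e = np.asarray(e)
    if n_fid is None:
        n_fid = len(e)
    aug = np.concatenate([e, np.zeros(n_fid, dtype=e.dtype)])
    diff = aug[:, None] - aug[None, :]
    return gauss(diff, sigma).mean()

def quantize(e, eps):
    # Greedy online quantization: map each sample to the nearest existing
    # codeword within radius eps, otherwise open a new codeword.
    codebook, counts = [], []
    for x in e:
        if codebook:
            d = np.abs(np.asarray(codebook) - x)
            k = int(np.argmin(d))
            if d[k] <= eps:
                counts[k] += 1
                continue
        codebook.append(x)
        counts.append(1)
    return np.asarray(codebook), np.asarray(counts)

def qmeef_potential(e, sigma=1.0, eps=0.1, n_fid=None):
    # Quantized approximation: the inner sum runs over M << N codewords,
    # weighted by how many samples each codeword absorbed -> O(N*M).
    e = np.asarray(e)
    if n_fid is None:
        n_fid = len(e)
    aug = np.concatenate([e, np.zeros(n_fid, dtype=e.dtype)])
    c, w = quantize(aug, eps)
    ker = gauss(aug[:, None] - c[None, :], sigma)
    return (w[None, :] * ker).sum() / (len(aug) * w.sum())
```

With `eps = 0` the quantized potential reproduces the direct double sum exactly (only duplicate samples merge); increasing `eps` shrinks the codebook and trades a small approximation error for a large reduction in kernel evaluations.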