Multi-Nyström Method Based on Multiple Kernel Learning for Large Scale Imbalanced Classification.

Journal: Computational intelligence and neuroscience
Published Date:

Abstract

Extensions of kernel methods for class imbalance problems have been extensively studied. Although they cope well with nonlinear problems, their high computation and memory costs severely limit their application to real-world imbalanced tasks. The Nyström method is an effective technique for scaling kernel methods. However, the standard Nyström method needs to sample a sufficiently large number of landmark points to ensure an accurate approximation, which seriously affects its efficiency. In this study, we propose a multi-Nyström method based on mixtures of Nyström approximations to avoid the explosion of the subkernel matrix, while the optimization of the mixture weights is embedded into the model training process via multiple kernel learning (MKL) algorithms to yield a more accurate low-rank approximation. Moreover, we select subsets of landmark points according to the imbalanced distribution to reduce the model's sensitivity to skewness. We also provide a kernel stability analysis of our method and show that the model solution error is bounded by the weighted approximation errors, which can help us improve the learning process. Extensive experiments on several large-scale datasets show that our method achieves higher classification accuracy and a dramatic speedup of MKL algorithms.
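To make the idea concrete, the sketch below illustrates a weighted mixture of Nyström approximations of the kind the abstract describes: each component builds a low-rank approximation K ≈ C W⁺ Cᵀ from its own subset of landmark points, the subsets are drawn with class-aware sampling so the minority class is represented, and the mixture weights stand in for the values the paper learns jointly via MKL. This is a minimal illustration under those assumptions, not the authors' implementation; the names `rbf_kernel`, `nystrom_component`, `stratified_landmarks`, and `mu` are illustrative.

```python
# Minimal sketch of a mixture of Nystrom approximations with
# imbalance-aware landmark sampling; uniform mu replaces the MKL-learned weights.
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    """Gaussian RBF kernel matrix between rows of X and Y."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_component(X, landmark_idx, gamma=0.1):
    """One Nystrom approximation K ~= C W^+ C^T built from m landmark points."""
    C = rbf_kernel(X, X[landmark_idx], gamma)   # n x m cross-kernel block
    W = C[landmark_idx, :]                      # m x m landmark kernel block
    return C @ np.linalg.pinv(W) @ C.T          # n x n low-rank approximation

def stratified_landmarks(y, m, rng):
    """Sample m landmarks with both classes represented (class-aware sampling)."""
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    m_pos = max(1, m // 2)
    return np.concatenate([rng.choice(pos, m_pos, replace=False),
                           rng.choice(neg, m - m_pos, replace=False)])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                  # toy feature matrix
y = (rng.random(500) < 0.1).astype(int)         # imbalanced labels (~10% positives)

subsets = [stratified_landmarks(y, 20, rng) for _ in range(3)]
mu = np.ones(len(subsets)) / len(subsets)       # placeholder for MKL-learned weights

# Weighted mixture of the component approximations.
K_approx = sum(w * nystrom_component(X, idx) for w, idx in zip(mu, subsets))
```

Using several small landmark subsets instead of one large one keeps every subkernel matrix cheap to invert, while the learned weights let the mixture compensate for the lower rank of each individual component.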

Authors

  • Ling Wang
    The State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University, #7 Jinsui Road, Guangzhou, Guangdong 510230, China.
  • Hongqiao Wang
    Department of Information Engineering, Rocket Force University of Engineering, Xi'an, 710025, China.
  • Guangyuan Fu
    College of Computer and Information Science, Southwest University, Chongqing 400715, China.