Robust one-class support vector machine.

Journal: Neural networks : the official journal of the International Neural Network Society
Published Date:

Abstract

One-Class Support Vector Machine (OCSVM) is an effective algorithm for one-class classification tasks. However, it is sensitive to noise and outliers. Existing remedies often employ bounded loss functions that impose finite but still relatively large penalties on noise or outliers, and these loss functions suffer from discontinuity and non-differentiability. To address these issues, this paper introduces a novel continuous, smooth, and differentiable loss function, the Quadratic Type Squared Error Loss Function (QTSELF), and proposes a more robust OCSVM (Q-OCSVM). Q-OCSVM not only distinguishes samples by their positions and treats them accordingly but also enhances robustness by imposing minimal penalties on noise and outliers. Moreover, the favorable mathematical properties of the loss function facilitate model optimization. A theoretical analysis based on Rademacher complexity establishes a generalization error bound for the model. The momentum method is used to optimize Q-OCSVM. Extensive experiments demonstrate that Q-OCSVM outperforms the benchmark techniques.
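The abstract describes replacing the hinge loss in the OCSVM primal objective with a bounded, smooth loss and training by the momentum method, but it does not give the exact form of QTSELF. The sketch below is therefore only an illustration of that general recipe: `bounded_loss`, `train_q_ocsvm`, and the saturating exponential penalty are hypothetical placeholders, not the authors' definitions.

```python
# Minimal sketch: a linear OCSVM-style model with a bounded, smooth loss,
# trained by heavy-ball momentum. The loss form is an assumption standing
# in for QTSELF, which is not specified in the abstract.
import numpy as np

def bounded_loss_grad(z, c=1.0):
    # Derivative of a smooth, bounded penalty c * (1 - exp(-z^2 / c)) on
    # margin violations z = rho - w.x (zero for z <= 0); the penalty
    # saturates at c, so outliers receive at most a finite penalty.
    v = np.maximum(z, 0.0)
    return 2.0 * v * np.exp(-(v ** 2) / c)

def train_q_ocsvm(X, nu=0.1, lr=1e-2, beta=0.9, epochs=200, c=1.0):
    # Objective (sketch): 0.5*||w||^2 + (1/(nu*n)) * sum L(rho - w.x_i) - rho
    n, d = X.shape
    w, rho = np.zeros(d), 0.0
    vw, vr = np.zeros(d), 0.0                  # momentum buffers
    for _ in range(epochs):
        z = rho - X @ w                        # margin violations
        g = bounded_loss_grad(z, c)            # per-sample loss derivatives
        grad_w = w - (X * g[:, None]).sum(axis=0) / (nu * n)
        grad_rho = g.sum() / (nu * n) - 1.0
        vw = beta * vw + grad_w                # heavy-ball momentum updates
        vr = beta * vr + grad_rho
        w -= lr * vw
        rho -= lr * vr
    return w, rho

# Usage: scores = X_test @ w - rho; negative scores flag potential outliers.
```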

Authors

  • Xiaoxi Zhao
    College of Bioinformatics Science and Technology, Harbin Medical University, Harbin, 150081, China.
  • Yingjie Tian
    Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing 100190, China; Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing 100190, China. Electronic address: tyj@ucas.ac.cn.
  • Chonghua Zheng
    School of Management, Hangzhou Dianzi University, Hangzhou 310018, China.