RDE: A novel approach to improve the classification performance and expressivity of KDB.

Journal: PLOS ONE
Published Date:

Abstract

Bayesian network classifiers (BNCs) have demonstrated competitive classification performance in a variety of real-world applications. A highly scalable BNC with strong expressivity is therefore extremely desirable. This paper proposes Redundant Dependence Elimination (RDE) to improve the classification performance and expressivity of the k-dependence Bayesian classifier (KDB). To capture the unique characteristics of each case, RDE identifies redundant conditional dependencies and then substitutes or removes them. The learned personalized k-dependence Bayesian classifier (PKDB) achieves high-confidence conditional probabilities and graphically interprets the dependency relationships among attributes. Two thyroid cancer datasets and four other cancer datasets from the UCI machine learning repository are selected for the experimental study. The experimental results demonstrate the effectiveness of the proposed algorithm in terms of zero-one loss, bias, variance and AUC.
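
The abstract does not spell out the RDE criterion itself, so the following is only a minimal sketch of the standard KDB pipeline that RDE builds on: attributes are ranked by mutual information with the class, each attribute receives the class plus up to k higher-ranked attributes as parents chosen by conditional mutual information, and a final threshold test (cmi_threshold) stands in here for the paper's redundant-dependence check. All function names and parameters below are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from collections import defaultdict
    from itertools import combinations

    def mutual_information(x, c):
        # Empirical I(X; C) from joint frequencies of discrete arrays.
        mi = 0.0
        for xv in np.unique(x):
            for cv in np.unique(c):
                p_xc = np.mean((x == xv) & (c == cv))
                if p_xc > 0:
                    mi += p_xc * np.log(p_xc / (np.mean(x == xv) * np.mean(c == cv)))
        return mi

    def conditional_mutual_information(x, y, c):
        # Empirical I(X; Y | C) from joint frequencies of discrete arrays.
        cmi = 0.0
        for cv in np.unique(c):
            mask = (c == cv)
            p_c = np.mean(mask)
            xs, ys = x[mask], y[mask]
            for xv in np.unique(xs):
                for yv in np.unique(ys):
                    p_xyc = np.mean((xs == xv) & (ys == yv)) * p_c
                    if p_xyc > 0:
                        p_xc = np.mean(xs == xv) * p_c
                        p_yc = np.mean(ys == yv) * p_c
                        cmi += p_xyc * np.log(p_xyc * p_c / (p_xc * p_yc))
        return cmi

    def kdb_structure(X, y, k=2, cmi_threshold=0.01):
        # Standard KDB parent selection: rank attributes by I(X_i; C), then
        # give each attribute the class plus up to k earlier attributes with
        # the highest I(X_i; X_j | C).  The filter against cmi_threshold is a
        # hypothetical stand-in for the paper's redundant-dependence test.
        n_attrs = X.shape[1]
        mi = [mutual_information(X[:, i], y) for i in range(n_attrs)]
        order = list(np.argsort(mi)[::-1])
        cmi = {}
        for i, j in combinations(range(n_attrs), 2):
            cmi[(i, j)] = cmi[(j, i)] = conditional_mutual_information(X[:, i], X[:, j], y)

        parents = defaultdict(list)
        for pos, attr in enumerate(order):
            candidates = order[:pos]  # attributes already added to the network
            ranked = sorted(candidates, key=lambda p: cmi[(attr, p)], reverse=True)
            parents[int(attr)] = [int(p) for p in ranked[:k]
                                  if cmi[(attr, p)] >= cmi_threshold]
        return dict(parents)

The personalized (per-case) substitution step of PKDB is not reconstructable from the abstract and is therefore omitted from the sketch.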

Authors

  • Hua Lou
    Changzhou College of Information Technology, Changzhou, China.
  • Limin Wang
    Frontiers Science Center for Flexible Electronics, Xi'an Institute of Flexible Electronics (IFE) and Xi'an Institute of Biomedical Materials & Engineering, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, P. R. China.
  • DingBo Duan
    Changzhou College of Information Technology, Changzhou, China.
  • Cheng Yang
    State Key Laboratory of Medicinal Chemical Biology and College of Pharmacy, Nankai University, Tianjin 300071, China.
  • Musa Mammadov
    Faculty of Science and Technology, Federation University, Ballarat, Australia.