Neurodynamical classifiers with low model complexity.

Journal: Neural Networks: the official journal of the International Neural Network Society

Abstract

The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an upper bound on the Vapnik-Chervonenkis (VC) dimension, which measures the capacity, or model complexity, of a learning machine. Vapnik's risk bound indicates that models with a smaller VC dimension are expected to generalize better. On many benchmark datasets, the MCM generalizes better than SVMs while using far fewer support vectors. In this paper, we describe a neural network that converges to the MCM solution. We employ the MCM neurodynamical system as the final layer of a neural network architecture, and optimize the weights of all layers to minimize an objective that combines a bound on the VC dimension with the classification error. We illustrate the use of this model for robust binary and multi-class classification. Numerical experiments on benchmark datasets from the UCI repository show that the proposed approach is scalable and learns accurate models with fewer support vectors.

Authors

  • Himanshu Pant
    Department of Electrical Engineering, Indian Institute of Technology, Delhi, India. Electronic address: eez138524@ee.iitd.ac.in.
  • Sumit Soman
    Department of Electrical Engineering, Indian Institute of Technology, Delhi, India. Electronic address: eez127509@ee.iitd.ac.in.
  • Jayadeva
    Department of Electrical Engineering, Indian Institute of Technology, Delhi, India. Electronic address: jayadeva@ee.iitd.ac.in.
  • Amit Bhaya
    Department of Electrical Engineering (PEE), Graduate School of Engineering (COPPE), Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil. Electronic address: amit@nacad.ufrj.br.