Non-monotonic convergence of online learning algorithms for perceptrons with noisy teacher.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

Learning curves of the simple perceptron are derived here. The learning curve of perceptron learning with a noisy teacher is shown to be non-monotonic, a phenomenon that has not been reported even though learning curves have been analyzed for half a century. In this paper, we show how this phenomenon occurs by analyzing the asymptotic behavior of perceptron learning with a method from systems science, namely, computing the eigenvalues of the system matrix and the corresponding eigenvectors. We analyze AdaTron learning and Hebbian learning in the same way and find that the learning curve of AdaTron learning is non-monotonic, whereas that of Hebbian learning is monotonic.
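
For intuition only, the following minimal Python sketch simulates online perceptron learning from a noisy teacher and tracks the student's generalization error against the noise-free teacher, the quantity whose non-monotonic behavior the paper analyzes. This is not the paper's analytical eigenvalue method; the dimension N, label-flip rate p, learning rate eta, and step count are illustrative assumptions.

    import numpy as np

    # Illustrative simulation sketch (assumed parameters, not from the paper):
    # online perceptron learning when the teacher's labels are flipped with
    # probability p, measuring error against the noise-free teacher.
    rng = np.random.default_rng(0)
    N = 500          # input dimension (assumption)
    p = 0.2          # probability that the teacher's label is flipped (assumption)
    eta = 1.0        # learning rate (assumption)
    steps = 20000    # number of online examples (assumption)

    B = rng.standard_normal(N)
    B /= np.linalg.norm(B)          # teacher weight vector, unit norm
    J = np.zeros(N)
    J[0] = 1e-3                     # student starts near the origin

    for t in range(1, steps + 1):
        x = rng.standard_normal(N) / np.sqrt(N)          # input example
        y_clean = np.sign(B @ x)                         # noise-free teacher label
        y = -y_clean if rng.random() < p else y_clean    # noisy teacher label
        if np.sign(J @ x) != y:                          # perceptron rule: update only on error
            J += eta * y * x
        if t % 1000 == 0:
            R = (J @ B) / np.linalg.norm(J)              # overlap (direction cosine) with teacher
            eps_g = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi   # generalization error
            print(f"t={t:6d}  overlap R={R:+.3f}  eps_g={eps_g:.3f}")

A single run is noisy; averaging over many teachers and inputs would be needed to see the smooth learning curve whose asymptotic, possibly non-monotonic shape the paper characterizes analytically via the eigenvalues of the system matrix.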

Authors

  • Kazushi Ikeda
  • Arata Honda
    Nara Institute of Science and Technology, Ikoma, Nara, Japan. Electronic address: arata.honda@excite.jp.
  • Hiroaki Hanzawa
    Nara Institute of Science and Technology, Ikoma, Nara, Japan. Electronic address: h.hanzawax68@gmail.com.
  • Seiji Miyoshi
    Kansai University, Suita, Osaka, Japan. Electronic address: miyoshi@kansai-u.ac.jp.