Gated Orthogonal Recurrent Units: On Learning to Forget.

Journal: Neural computation
PMID:

Abstract

We present a novel recurrent neural network (RNN)-based model that combines the remembering ability of unitary evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information from their memory. We achieve this by extending restricted orthogonal evolution RNNs with a gating mechanism similar to gated recurrent unit RNNs, with a reset gate and an update gate. Our model outperforms long short-term memory, gated recurrent units, and vanilla unitary or orthogonal RNNs on several long-term-dependency benchmark tasks. We empirically show that both orthogonal and unitary RNNs lack the ability to forget, which plays an important role in RNNs. We provide competitive results along with an analysis of our model on many natural sequential tasks, including question answering, speech spectrum prediction, character-level language modeling, and synthetic tasks that involve long-term dependencies such as algorithmic, denoising, and copying tasks.
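The mechanism the abstract describes — an orthogonal recurrent transition combined with GRU-style reset and update gates — can be sketched as follows. This is an illustrative reading, not the authors' reference implementation: the class name, initialization scales, and the use of `tanh` as the candidate nonlinearity (the paper's variants use other activations) are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def orthogonal(n):
    # Random orthogonal matrix via QR decomposition (Q satisfies Q.T @ Q = I).
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GORUCell:
    """Hypothetical sketch of a gated orthogonal recurrent cell.

    The recurrent candidate transform U is orthogonal, so repeated
    application preserves the hidden-state norm (the "remembering"
    property of unitary/orthogonal RNNs); GRU-style reset (r) and
    update (z) gates give the cell the ability to forget.
    """
    def __init__(self, n_in, n_hid):
        self.U = orthogonal(n_hid)                        # orthogonal recurrence
        self.W = rng.standard_normal((n_hid, n_in)) * 0.1
        self.Wr = rng.standard_normal((n_hid, n_in)) * 0.1
        self.Ur = rng.standard_normal((n_hid, n_hid)) * 0.1
        self.Wz = rng.standard_normal((n_hid, n_in)) * 0.1
        self.Uz = rng.standard_normal((n_hid, n_hid)) * 0.1
        self.b = np.zeros(n_hid)
        self.br = np.zeros(n_hid)
        self.bz = np.zeros(n_hid)

    def step(self, x, h):
        r = sigmoid(self.Wr @ x + self.Ur @ h + self.br)  # reset gate
        z = sigmoid(self.Wz @ x + self.Uz @ h + self.bz)  # update gate
        h_tilde = np.tanh(self.U @ (r * h) + self.W @ x + self.b)
        return z * h + (1.0 - z) * h_tilde                # gated blend

cell = GORUCell(n_in=4, n_hid=8)
h = np.zeros(8)
for t in range(20):
    h = cell.step(rng.standard_normal(4), h)

# Sanity check: U preserves vector norms, as an orthogonal matrix must.
v = rng.standard_normal(8)
assert np.allclose(np.linalg.norm(cell.U @ v), np.linalg.norm(v))
```

The key design point is that the norm-preserving orthogonal matrix acts only inside the candidate update, while the update gate `z` interpolates between the old state and the candidate, so gradients can flow without vanishing yet stale memory can still be overwritten.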

Authors

  • Li Jing
    Massachusetts Institute of Technology, Cambridge, MA 02139, U.S.A. ljing@mit.edu.
  • Caglar Gulcehre
    University of Montreal, Montreal H3T 1J4, Quebec, Canada caglar@gmail.com.
  • John Peurifoy
    Massachusetts Institute of Technology, Cambridge, MA 02139, U.S.A. jpeurifo@mit.edu.
  • Yichen Shen
    Massachusetts Institute of Technology, Cambridge, MA 02139, U.S.A. ycshen@mit.edu.
  • Max Tegmark
    Institute for Artificial Intelligence and Fundamental Interactions, Massachusetts Institute of Technology, Cambridge, MA 02139, U.S.A.
  • Marin Soljacic
    Massachusetts Institute of Technology, Cambridge, MA 02139, U.S.A. soljacic@mit.edu.
  • Yoshua Bengio
    Université de Montréal, Montréal QC H3T 1N8, Canada.