Self-supervised pre-trained neural network for quantum natural language processing.

Journal: Neural Networks: The Official Journal of the International Neural Network Society
PMID:

Abstract

Quantum computing models have propelled advances in many application domains. However, in natural language processing (NLP), quantum computing models are limited in representation capacity due to the inherent linearity of the underlying quantum computing architecture. This work addresses this limitation by leveraging self-supervised pre-training, the paradigm that has driven the rapid progress of NLP, to strengthen quantum NLP models at the representation level. Specifically, we present a self-supervised pre-training approach that learns quantum encodings of sentences, and fine-tunes quantum circuits for downstream tasks on top of them. Experiments show that the pre-training mechanism brings remarkable improvements over end-to-end pure quantum models, yielding meaningful predictions on a variety of downstream text classification datasets.
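
The following is a minimal sketch of the pre-train-then-fine-tune idea described in the abstract, written with PennyLane as the quantum machine learning framework. The abstract does not specify a toolchain, so the framework choice, circuit templates, objectives, and hyperparameters below are illustrative assumptions, not the authors' actual method: a parameterized circuit encodes a sentence vector, is first trained on a self-supervised proxy objective, and its parameters are then reused to initialize a circuit fine-tuned for classification.

    # Minimal sketch, assuming PennyLane; all names and objectives are illustrative.
    import pennylane as qml
    from pennylane import numpy as np

    n_qubits = 4
    dev = qml.device("default.qubit", wires=n_qubits)

    def encoder(features, weights):
        # Angle-encode a (classically pre-embedded) sentence vector, then apply
        # trainable entangling layers acting as the quantum sentence encoder.
        qml.AngleEmbedding(features, wires=range(n_qubits))
        qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))

    @qml.qnode(dev)
    def pretrain_circuit(features, weights):
        # Self-supervised stage: the measurement distribution is regressed onto a
        # target derived from the sentence itself (an assumed proxy objective).
        encoder(features, weights)
        return qml.probs(wires=range(n_qubits))

    @qml.qnode(dev)
    def classifier_circuit(features, weights):
        # Fine-tuning stage: the same encoder, initialized with pre-trained
        # weights, is read out on one qubit for binary text classification.
        encoder(features, weights)
        return qml.expval(qml.PauliZ(0))

    def pretrain_loss(weights, x, target):
        return np.sum((pretrain_circuit(x, weights) - target) ** 2)

    def finetune_loss(weights, x, y):
        return (classifier_circuit(x, weights) - y) ** 2

    # Toy usage with a single stand-in example.
    shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
    weights = np.array(np.random.uniform(0, np.pi, size=shape), requires_grad=True)
    opt = qml.GradientDescentOptimizer(stepsize=0.1)

    x = np.array([0.1, 0.5, 0.3, 0.9], requires_grad=False)   # stand-in sentence embedding
    target = np.array(np.ones(2 ** n_qubits) / 2 ** n_qubits,  # stand-in self-supervised target
                      requires_grad=False)
    for _ in range(20):                                         # pre-training
        weights = opt.step(lambda w: pretrain_loss(w, x, target), weights)

    y = 1.0                                                     # class label in [-1, +1]
    for _ in range(20):                                         # fine-tuning
        weights = opt.step(lambda w: finetune_loss(w, x, y), weights)

The key design point mirrored here is that the fine-tuning circuit reuses the pre-trained encoder parameters rather than starting from random initialization, which is the mechanism the abstract credits for the improvement over end-to-end pure quantum models.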

Authors

  • Ben Yao
    Department of Computer Science, University of Copenhagen, Copenhagen, Denmark. Electronic address: ben.yao@di.ku.dk.
  • Prayag Tiwari
    Department of Information Engineering, University of Padova, Italy. Electronic address: prayag.tiwari@dei.unipd.it.
  • Qiuchi Li
    Department of Computer Science, University of Copenhagen, Copenhagen, 2100, Denmark. Electronic address: qiuchi.li@di.ku.dk.