Self-supervised pre-trained neural network for quantum natural language processing.
Journal:
Neural Networks: the official journal of the International Neural Network Society
PMID:
39671984
Abstract
Quantum computing models have propelled advances in many application domains. However, in the field of natural language processing (NLP), quantum computing models are limited in representation capacity due to the high linearity of the underlying quantum computing architecture. This work attempts to address this limitation by leveraging self-supervised pre-training, a paradigm that has driven rapid progress in NLP, to strengthen the representational power of quantum NLP models. Specifically, we present a self-supervised pre-training approach that learns quantum encodings of sentences, on top of which quantum circuits are fine-tuned for downstream tasks. Experiments show that the pre-training mechanism brings substantial improvements over end-to-end purely quantum models, yielding meaningful prediction results on a variety of downstream text classification datasets.
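To make the two-stage idea concrete, the following is a minimal sketch, not the paper's implementation, of pre-training a parameterized quantum sentence encoding and then fine-tuning a circuit head for classification. It uses PennyLane; the circuit shapes, the toy self-supervised objective (pulling encodings of two augmented "views" of the same sentence together), and all variable names are illustrative assumptions rather than details from the article.

```python
# Illustrative sketch only: a two-stage quantum pipeline (pre-train encoder,
# then fine-tune a classifier head) in PennyLane. Objectives and shapes are
# toy assumptions, not the authors' method.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def encoder_circuit(features, enc_weights):
    # Encode a (toy) sentence feature vector into qubit rotations, then apply
    # trainable entangling layers: the "quantum encoding" of the sentence.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(enc_weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

@qml.qnode(dev)
def classifier_circuit(features, enc_weights, head_weights):
    # Downstream circuit: reuse the pre-trained encoder, add a trainable head.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(enc_weights, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(head_weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

enc_shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
head_shape = qml.StronglyEntanglingLayers.shape(n_layers=1, n_wires=n_qubits)
enc_weights = np.random.uniform(low=0, high=np.pi, size=enc_shape, requires_grad=True)
head_weights = np.random.uniform(low=0, high=np.pi, size=head_shape, requires_grad=True)

def pretrain_loss(enc_w, view_a, view_b):
    # Stage 1 (self-supervised stand-in): make encodings of two augmented
    # views of the same sentence agree.
    za = encoder_circuit(view_a, enc_w)
    zb = encoder_circuit(view_b, enc_w)
    return sum((a - b) ** 2 for a, b in zip(za, zb))

def finetune_loss(head_w, features, label):
    # Stage 2: train only the head on a labelled example (squared-error surrogate).
    pred = classifier_circuit(features, enc_weights, head_w)
    return (pred - label) ** 2

opt = qml.GradientDescentOptimizer(stepsize=0.2)
view_a = np.array([0.1, 0.4, 0.3, 0.9], requires_grad=False)
view_b = view_a + 0.05  # crude augmentation of the same "sentence"

for _ in range(20):  # pre-training loop
    enc_weights = opt.step(lambda w: pretrain_loss(w, view_a, view_b), enc_weights)

enc_weights.requires_grad = False  # freeze the pre-trained encoding
for _ in range(20):  # fine-tuning loop
    head_weights = opt.step(lambda w: finetune_loss(w, view_a, 1.0), head_weights)

print("prediction:", classifier_circuit(view_a, enc_weights, head_weights))
```

In this sketch the pre-trained encoder weights are frozen during fine-tuning; whether the article freezes or further adapts the encoding during fine-tuning is not stated in the abstract.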