A prompt tuning method based on relation graphs for few-shot relation extraction.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Jan 31, 2025
Abstract
Prompt-tuning has recently proven effective for few-shot tasks. In the specific domain of few-shot relation extraction, however, task resources are severely limited, and prompt-tuning struggles to distinguish between similar relations, occasionally producing prediction errors. It is therefore critical to extract as much information as possible from these scarce resources. This paper integrates global relation graphs and local relation subgraphs into the prompt-tuning framework to address this issue and fully exploit the available resources for differentiating between relations. A global relation graph is first constructed, based on label consistency, to enhance the feature representations of samples across different relations. This global graph is then partitioned into local relation subgraphs, one per relation type, which refine the feature representations of samples within the same relation. This dual approach makes effective use of the limited supervised information and improves tuning efficiency. In addition, recognizing the substantial semantic knowledge embedded in relation labels, the study injects this knowledge into the prompt-tuning process. Extensive experiments on four low-resource datasets validate the efficacy of the proposed method, demonstrating significant performance improvements; notably, the model also remains robust when discerning similar relations.
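The abstract describes the method only at a high level. Below is a minimal, illustrative Python sketch, not the authors' implementation, of the two graph constructions and the label-aware prompt it mentions. The toy support set, the build_prompt template, and the use of networkx are all assumptions made for illustration.

```python
# A minimal sketch (assumed, not the paper's code) of a label-consistency
# global relation graph, per-relation local subgraphs, and a label-aware
# prompt template, as outlined in the abstract.
import networkx as nx

# Toy few-shot support set: (sentence, relation label) pairs. Purely illustrative.
support_set = [
    ("Paris is the capital of France.",   "capital_of"),
    ("Berlin is the capital of Germany.", "capital_of"),
    ("Marie Curie was born in Warsaw.",   "place_of_birth"),
    ("Alan Turing was born in London.",   "place_of_birth"),
]

# Global relation graph: one node per sample; edges link samples whose labels
# agree ("label consistency"), so propagation over this graph can pull
# same-relation representations together across the whole support set.
global_graph = nx.Graph()
for i, (sent, label) in enumerate(support_set):
    global_graph.add_node(i, text=sent, label=label)
for i, (_, li) in enumerate(support_set):
    for j, (_, lj) in enumerate(support_set):
        if i < j and li == lj:
            global_graph.add_edge(i, j)

# Local relation subgraphs: partition the global graph by relation type,
# giving one subgraph per relation to refine within-relation features.
local_subgraphs = {}
for label in {lbl for _, lbl in support_set}:
    nodes = [n for n, d in global_graph.nodes(data=True) if d["label"] == label]
    local_subgraphs[label] = global_graph.subgraph(nodes).copy()

# Label-aware prompt: inject the entity pair into a cloze template so a
# verbalizer can map [MASK] onto relation label words, exploiting the
# semantics carried by the labels themselves. Template wording is assumed.
def build_prompt(sentence: str, head: str, tail: str) -> str:
    return f"{sentence} The relation between {head} and {tail} is [MASK]."

print(build_prompt("Paris is the capital of France.", "Paris", "France"))
for label, g in local_subgraphs.items():
    print(label, g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")
```

In a real pipeline the node features would be encoder embeddings of each sample and the two graphs would feed a message-passing step before prompt-based classification; the sketch above only shows the graph construction itself.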