Span-aware pre-trained network with deep information bottleneck for scientific entity relation extraction.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Publication Date:
Feb 11, 2025
Abstract
Scientific entity relation extraction aims to improve the performance of each subtask by exploiting contextual representations with rich scientific semantics. However, most existing models encounter the dilemma of scientific semantic dilution, where task-irrelevant information entangles with task-relevant information, making science-friendly representation learning challenging. In addition, existing models isolate task-relevant information among subtasks, undermining the coherence of scientific semantics and consequently impairing the performance of each subtask. To address these challenges, we propose a novel and effective Span-aware Pre-trained network with deep Information Bottleneck (SpIB), which performs scientific entity and relation extraction by minimizing task-irrelevant information while maximizing the relatedness of task-relevant information. Specifically, the SpIB model includes a minimum span-based representation learning (SRL) module and a relatedness-oriented task-relevant representation learning (TRL) module to disentangle task-irrelevant information and discover the relatedness hidden in task-relevant information across subtasks. Then, an information minimum-maximum strategy is designed to minimize the mutual information of span-based representations and maximize the multivariate information of task-relevant representations. Finally, we design a unified loss function to simultaneously optimize the learned span-based and task-relevant representations. Experimental results on several scientific datasets (SciERC, ADE, and BioRelEx) demonstrate the superiority of the proposed SpIB model over various state-of-the-art models. The source code is publicly available at https://github.com/SWT-AITeam/SpIB.
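To make the abstract's information minimum-maximum strategy concrete, the sketch below illustrates one plausible shape such a unified loss could take. It is a minimal, hypothetical illustration, not the paper's actual objective: the variational KL term (a standard surrogate for minimizing the mutual information a stochastic span representation retains about its input) and the InfoNCE-style estimator (a common lower bound used to maximize shared information between the entity and relation subtasks' representations) are assumptions, as are the names unified_loss, beta, and gamma. The authors' real implementation is in the linked repository.

```python
import torch
import torch.nn.functional as F

def kl_to_standard_normal(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL(q(z|x) || N(0, I)): a common variational upper-bound surrogate for
    the information a stochastic span representation keeps about its input.
    Minimizing it plays the role of the bottleneck's compression term."""
    return 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(dim=-1).mean()

def infonce_lower_bound(z_a: torch.Tensor, z_b: torch.Tensor,
                        temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style lower bound on the information shared by two batches of
    paired representations; maximizing it pulls corresponding task-relevant
    representations from different subtasks toward each other."""
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    logits = z_a @ z_b.t() / temperature              # (B, B) similarity matrix
    targets = torch.arange(z_a.size(0), device=z_a.device)  # matched pairs on the diagonal
    return -F.cross_entropy(logits, targets)          # higher = more shared information

def unified_loss(task_loss: torch.Tensor,
                 mu: torch.Tensor, logvar: torch.Tensor,
                 z_entity: torch.Tensor, z_relation: torch.Tensor,
                 beta: float = 1e-3, gamma: float = 0.1) -> torch.Tensor:
    """Hypothetical unified objective: subtask loss, plus an information term
    to minimize, minus a relatedness term to maximize (beta and gamma are
    assumed trade-off weights, not values from the paper)."""
    compress = kl_to_standard_normal(mu, logvar)        # information to minimize
    relate = infonce_lower_bound(z_entity, z_relation)  # relatedness to maximize
    return task_loss + beta * compress - gamma * relate
```

Under this reading, a single backward pass through unified_loss would jointly optimize the span-based representations (via the KL term) and the task-relevant representations (via the InfoNCE term), matching the abstract's claim that both are optimized simultaneously; the specific estimators SpIB uses may differ.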