Effects of data and entity ablation on multitask learning models for biomedical entity recognition.
Journal:
Journal of Biomedical Informatics
Published Date:
Apr 9, 2022
Abstract
MOTIVATION: Training domain-specific named entity recognition (NER) models requires high-quality, hand-curated gold-standard datasets, which are time-consuming and expensive to create. Furthermore, the storage and memory required to deploy NLP models can be prohibitive when the number of tasks is large. In this work, we explore multi-task learning as a way to reduce the amount of training data needed to train new domain-specific models. We evaluate our system across 22 distinct biomedical NER datasets and use two forms of ablation to measure the extent to which transfer learning improves task performance.
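
Below is a minimal sketch of the kind of multi-task NER setup the abstract describes: a shared encoder with one token-classification head per dataset. This is an illustrative assumption, not the authors' implementation; the architecture, dimensions, and task names (e.g. "BC5CDR-chem", "NCBI-disease") are hypothetical placeholders.

```python
# Minimal multi-task NER sketch (illustrative; not the paper's actual model).
# A single encoder is shared across all tasks, with a small per-task head,
# so adding a new dataset reuses most parameters.
import torch
import torch.nn as nn

class MultiTaskNER(nn.Module):
    def __init__(self, vocab_size, hidden_dim, task_label_counts):
        super().__init__()
        # Shared parameters reused across every NER task.
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.encoder = nn.LSTM(hidden_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # One lightweight classification head per task; only these are task-specific.
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n_labels)
            for task, n_labels in task_label_counts.items()
        })

    def forward(self, token_ids, task):
        hidden, _ = self.encoder(self.embed(token_ids))
        return self.heads[task](hidden)  # per-token label logits for this task

# Usage: route each batch through the head matching its source dataset.
model = MultiTaskNER(vocab_size=30000, hidden_dim=128,
                     task_label_counts={"BC5CDR-chem": 3, "NCBI-disease": 3})
logits = model(torch.randint(0, 30000, (2, 16)), task="NCBI-disease")
```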