ActTRANS: Functional classification in active transport proteins based on transfer learning and contextual representations.

Journal: Computational biology and chemistry
Published Date:

Abstract

MOTIVATION: Primary and secondary active transport are the two types of active transport, both of which use energy to move substances across a cell membrane against the concentration gradient. Active transport relies on proteins that assist in transport and play essential roles in regulating the traffic of ions and small molecules. In this study, the two main classes of proteins involved in such transport are classified from among transmembrane transport proteins. We propose a Support Vector Machine (SVM) with contextualized word embeddings from Bidirectional Encoder Representations from Transformers (BERT) to represent protein sequences. BERT is a powerful transfer learning model: a deep learning language representation model developed by Google and one of the highest-performing pre-trained models for Natural Language Processing (NLP) tasks. Transfer learning with the pre-trained BERT model is applied to extract fixed feature vectors from its hidden layers and to learn contextual relations between amino acids in a protein sequence. The resulting contextualized representations of proteins effectively model the complex structures of amino acids in the sequence and the variation of these amino acids with context. By generating context information, we capture multiple meanings of the same amino acid, revealing the importance of specific residues in the protein sequence.
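The pipeline the abstract describes — pooling hidden-layer states of a pre-trained BERT-style encoder into one fixed feature vector per protein sequence, then training an SVM on those vectors — can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the encoder is mocked with deterministic random vectors, and names such as `embed_sequence` and the toy sequences/labels are hypothetical.

```python
# Hedged sketch of a BERT-features + SVM pipeline (not the authors' code).
# In the real system, `embed_sequence` would tokenize the amino-acid
# sequence, run it through a pre-trained BERT encoder, and pool the
# hidden states; here it returns a deterministic pseudo-random vector.
import zlib

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def embed_sequence(sequence: str, dim: int = 768) -> np.ndarray:
    """Stand-in for a BERT encoder producing a fixed feature vector."""
    # Seed from the sequence so the mock embedding is reproducible.
    seed = zlib.crc32(sequence.encode("ascii"))
    rng = np.random.default_rng(seed)
    return rng.normal(size=dim)


# Toy dataset: sequences labeled primary (1) vs. secondary (0) transporter.
sequences = ["MKT" * 10, "GAV" * 10, "LLS" * 10, "PQR" * 10] * 10
labels = [1, 0, 1, 0] * 10

X = np.stack([embed_sequence(s) for s in sequences])
y = np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = SVC(kernel="rbf")  # SVM classifier on the pooled BERT features
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In practice the embedding dimension and pooling strategy (e.g., averaging token-level hidden states versus taking a single summary vector) depend on the chosen pre-trained model; the SVM then only sees the fixed-length vectors.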

Authors

  • Semmy Wellem Taju
    Department of Computer Science and Engineering, Yuan Ze University, Chung-Li, 32003, Taiwan.
  • Syed Muazzam Ali Shah
    Department of Computer Science and Engineering, Yuan Ze University, Chung-Li, 32003, Taiwan.
  • Yu-Yen Ou
    Department of Computer Science and Engineering, Yuan Ze University, Chung-Li, 32003, Taiwan. Electronic address: yien@saturn.yzu.edu.tw.