AIP-TranLAC: A Transformer-Based Method Integrating LSTM and Attention Mechanism for Predicting Anti-inflammatory Peptides.

Journal: Interdisciplinary Sciences: Computational Life Sciences
Published Date:

Abstract

Anti-inflammatory peptides (AIPs) have emerged as potential therapeutic candidates for managing various inflammatory disorders, but their computational identification remains challenging. We propose AIP-TranLAC, a novel deep learning framework that integrates Transformer-based embeddings, a bidirectional long short-term memory (Bi-LSTM) network, multi-head attention, and a convolutional neural network (CNN) to classify AIPs accurately. Our model achieves superior performance on benchmark and independent test datasets, demonstrating significant improvements over existing methods. The hybrid architecture effectively captures both local and global sequence patterns, while interpretability analyses reveal critical amino acid residues. With robust performance on imbalanced data and open-source availability, AIP-TranLAC provides a powerful tool for accelerating therapeutic peptide discovery and inflammation research. For reproducibility, we have released the codebase, trained models, and all supporting data on GitHub ( https://github.com/Renjingyi123/AIP-TranLAC ).
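To illustrate how the four components named in the abstract could be chained together, below is a minimal PyTorch sketch of a Transformer-embedding → Bi-LSTM → multi-head-attention → CNN classifier. All layer sizes, the class name `AIPHybridClassifier`, and the exact ordering of stages are assumptions for illustration; they do not reproduce the authors' released implementation (see the GitHub repository for that).

```python
# Hypothetical sketch of an AIP-TranLAC-style hybrid architecture; all
# hyperparameters and layer ordering are illustrative assumptions.
import torch
import torch.nn as nn


class AIPHybridClassifier(nn.Module):
    def __init__(self, vocab_size=25, d_model=64, num_heads=4):
        super().__init__()
        # Token embedding + Transformer encoder: contextual residue features
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=num_heads,
                                               batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Bi-LSTM: long-range dependencies in both sequence directions
        self.bilstm = nn.LSTM(d_model, d_model // 2, batch_first=True,
                              bidirectional=True)
        # Multi-head attention: re-weights informative residue positions
        self.attn = nn.MultiheadAttention(d_model, num_heads=num_heads,
                                          batch_first=True)
        # 1-D CNN: local sequence motifs
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        self.classifier = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                                        nn.Linear(d_model, 1))

    def forward(self, tokens):  # tokens: (batch, seq_len) integer residue codes
        x = self.transformer(self.embed(tokens))
        x, _ = self.bilstm(x)
        x, _ = self.attn(x, x, x)
        x = self.conv(x.transpose(1, 2))  # Conv1d expects (batch, channels, seq)
        return torch.sigmoid(self.classifier(x))  # probability of being an AIP


model = AIPHybridClassifier()
probs = model(torch.randint(1, 25, (2, 50)))  # two dummy 50-residue peptides
print(probs.shape)  # torch.Size([2, 1])
```

The design choice worth noting is the division of labor: the Transformer and attention stages model global, position-to-position interactions, while the Bi-LSTM and CNN stages capture sequential order and local motifs, matching the abstract's claim that the hybrid captures both local and global sequence patterns.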

Authors

  • Shengli Zhang
    Key Laboratory of Animal Genetics, Breeding and Reproduction, Ministry of Agriculture & National Engineering Laboratory for Animal Breeding, College of Animal Science and Technology, China Agricultural University Beijing, China.
  • Jingyi Ren
    School of Mathematics and Statistics, Xidian University, Xi'an, 710071, P. R. China.
