CL-GNN: Contrastive Learning and Graph Neural Network for Protein-Ligand Binding Affinity Prediction.
Journal: Journal of Chemical Information and Modeling
PMID: 39913849
Abstract
In drug discovery and design, accurate prediction of protein-ligand binding affinity is of paramount importance, as it underpins the functional interactions within biological systems. This study introduces a novel self-supervised learning (SSL) framework that combines contrastive learning and graph neural networks (CL-GNN) for predicting protein-ligand binding affinities. Traditional experimental methods for affinity measurement are expensive and time-consuming, prompting the development of more efficient computational approaches. CL-GNN uses a contrastive learning strategy, a form of SSL, to learn from a large data set of 371 458 unique unlabeled protein-ligand complexes. By employing graph neural networks and molecular graph augmentation techniques, the model captures protein-ligand interactions in a self-supervised manner. The fine-tuned model demonstrates competitive performance, achieving high Pearson's correlation coefficients and low root-mean-square errors on benchmark data sets, and outperforms existing machine learning models, showcasing its potential for accelerating the drug development process. The method quantifies the similarity between protein-ligand complex representations learned during pretraining and those from the downstream testing phase through cosine similarity assessment. This approach not only reveals potential connections between complexes in their binding properties but also provides new insights into drug mechanisms of action. In addition, model transparency is significantly improved by visualizing the importance of key protein residues and ligand atoms; this visualization exposes the model's predictive decision-making process and offers biological insights for drug design and optimization.
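The abstract describes two computations that are easy to make concrete: the cosine similarity used to compare learned complex representations, and a contrastive (NT-Xent-style) objective commonly used in SSL pretraining of this kind. The sketch below is illustrative only; the function names, temperature value, and the plain-Python vectors are assumptions, not details taken from the paper.

```python
import math


def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors (lists of floats)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def nt_xent_pair_loss(anchor, positive, negatives, tau=0.5):
    """NT-Xent-style contrastive loss for one anchor embedding.

    The anchor (e.g. one augmented view of a complex graph) is pulled
    toward its positive (another view of the same complex) and pushed
    away from negatives (embeddings of other complexes). `tau` is a
    hypothetical temperature hyperparameter.
    """
    sims = [cosine_similarity(anchor, positive)]
    sims += [cosine_similarity(anchor, n) for n in negatives]
    exps = [math.exp(s / tau) for s in sims]
    # Cross-entropy of the positive pair against all candidates.
    return -math.log(exps[0] / sum(exps))
```

As expected for a contrastive objective, the loss is small when the anchor is close to its positive view and large when it is closer to a negative, which is what drives the pretraining representations apart by complex.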