Adaptive gradient scaling: integrating Adam and landscape modification for protein structure prediction

Bibliographic Details
Main Authors: Vitalii Kapitan, Michael Choi
Format: Article
Language: English
Published: BMC, 2025-07-01
Series: BMC Bioinformatics
Online Access: https://doi.org/10.1186/s12859-025-06185-2

Description
Summary:
Abstract

Background: Protein structure prediction is one of the most important scientific problems: it is NP-hard, yet it has a wide range of applications, including drug discovery and biotechnology. Because experimental methods for structure determination remain expensive and time-consuming, computational structure prediction offers a scalable and cost-effective alternative, and the application of machine learning in structural biology has revolutionized the field. Despite their success, machine learning methods face fundamental limitations in optimizing complex, high-dimensional energy landscapes, which motivates research into new methods that improve the robustness and performance of optimization algorithms.

Results: This study presents a novel approach to protein structure prediction that integrates the Landscape Modification (LM) method with the Adam optimizer for OpenFold. The main idea is to change the optimization dynamics by introducing a gradient scaling mechanism based on energy landscape transformations. LM dynamically adjusts gradients using a threshold parameter and a transformation function, thereby improving the optimizer's ability to escape local minima, traverse flat or rough landscape regions more efficiently, and potentially converge faster to global or high-quality local optima. By integrating simulated annealing into the LM approach, we propose LM SA, a variant designed to improve convergence stability while enabling more efficient exploration of complex landscapes.

Conclusion: We compare the performance of standard Adam, LM, and LM SA across different datasets and computational conditions. Performance was evaluated using loss function values, the predicted Local Distance Difference Test (pLDDT), distance-based Root Mean Square Deviation (dRMSD), and Template Modeling (TM) scores. Our results show that LM and LM SA outperform standard Adam on all metrics, with faster convergence and better generalization, particularly on proteins not included in the training set. These results demonstrate that integrating landscape-aware gradient scaling into first-order optimizers advances computational optimization research and improves prediction performance for complex problems such as protein folding.
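To make the gradient-scaling mechanism concrete, here is a minimal PyTorch sketch of landscape-modified Adam in the spirit of the abstract. It assumes a scale of the form 1/(eps + psi((f - c)_+)) with a linear transformation function psi, and tracks the threshold c as the running minimum loss; the function names (lm_scale, lm_adam_step), the value of eps, and the threshold rule are illustrative assumptions, not the authors' released implementation.

```python
import torch

def lm_scale(loss_value, c, eps=1.0, psi=lambda u: u):
    """Landscape-modification scale factor 1/(eps + psi((f - c)_+)).

    Gradients are damped when the current loss f exceeds the threshold c,
    flattening high-energy regions of the landscape; below c the step is
    essentially a plain Adam step. psi is assumed linear here.
    """
    excess = max(loss_value - c, 0.0)
    return 1.0 / (eps + psi(excess))

def lm_adam_step(optimizer, loss, c):
    """One Adam step with LM gradient scaling applied before the update."""
    optimizer.zero_grad()
    loss.backward()
    scale = lm_scale(loss.item(), c)
    for group in optimizer.param_groups:
        for p in group["params"]:
            if p.grad is not None:
                p.grad.mul_(scale)  # rescale gradients on the modified landscape
    optimizer.step()

# Toy usage: minimize a rough 1-D landscape. The threshold c is set to the
# best loss seen so far (an assumed adaptive rule). For an annealed variant
# in the spirit of LM SA, eps could instead decay with a cooling schedule,
# e.g. eps_t = eps0 / log(2 + t); that schedule is likewise an assumption.
x = torch.tensor([3.0], requires_grad=True)
opt = torch.optim.Adam([x], lr=0.05)
c = float("inf")
for step in range(200):
    loss = x.pow(2).sum() + 0.5 * torch.sin(5 * x).sum()
    c = min(c, loss.item())  # running-minimum threshold
    lm_adam_step(opt, loss, c)
```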
ISSN: 1471-2105