1. Analyzing effect of quadruple multiple sequence alignments on deep learning based protein inter-residue distance prediction
- Author
- Charles Christoffer, Yuki Kagaya, Sai Raghavendra Maddhuri Venkata Subramaniya, Genki Terashi, Aashish Jain, and Daisuke Kihara
- Subjects
- Models, Molecular; Deep Learning; Sequence Analysis, Protein; Protein Interaction Domains and Motifs; Proteins; Protein Structure, Tertiary; Neural Networks, Computer; Sequence Alignment; Computer science; Artificial neural network; Pattern recognition; Artificial intelligence; Computational biology and bioinformatics; Structural biology; Protein tertiary structure
- Abstract
- Protein 3D structure prediction has advanced significantly in recent years, largely owing to improved accuracy of inter-residue contact prediction. This improvement has been driven by deep learning approaches that predict inter-residue contacts and, more recently, distances from multiple sequence alignments (MSAs). In this work we present AttentiveDist, a novel approach that uses MSAs generated with different E-value cutoffs in a single model to increase the co-evolutionary information provided to the model. To determine the importance of each MSA's features at the inter-residue level, we added an attention layer to the deep neural network. We show that combining four MSAs of different E-value cutoffs improved prediction performance compared with features from a single-E-value MSA. A further improvement was observed when the attention layer was used, and more still when bond angle prediction was added as an auxiliary task. The improvement in distance prediction successfully transferred to better protein tertiary structure modeling.
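- Note: As an illustration of the MSA-attention idea described in the abstract, the sketch below shows one way per-MSA pairwise feature maps from four E-value cutoffs could be combined with a learned attention over the MSA axis before a downstream distance-prediction network. This is a minimal, hypothetical sketch, not the authors' code; the module name, parameter names, and feature shapes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MSAAttentionCombiner(nn.Module):
    """Hypothetical sketch: weight per-MSA 2D feature maps with a learned
    per-residue-pair attention over the MSA axis (not the paper's code)."""

    def __init__(self, num_msas: int = 4, channels: int = 64):
        super().__init__()
        # One scalar attention logit per MSA, computed from that MSA's
        # features at each residue pair (i, j) with a 1x1 convolution.
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, num_msas, channels, L, L), one feature map per MSA.
        b, m, c, h, w = feats.shape
        logits = self.score(feats.reshape(b * m, c, h, w)).reshape(b, m, 1, h, w)
        weights = F.softmax(logits, dim=1)   # attention over the MSA axis
        return (weights * feats).sum(dim=1)  # combined map: (batch, channels, L, L)

if __name__ == "__main__":
    L = 50  # toy sequence length
    per_msa_feats = torch.randn(2, 4, 64, L, L)  # 4 MSAs from 4 E-value cutoffs
    combined = MSAAttentionCombiner()(per_msa_feats)
    print(combined.shape)  # torch.Size([2, 64, 50, 50])
```

- The softmax over the MSA axis lets the network choose, separately for each residue pair, which alignment depth contributes the most informative co-evolutionary signal, which is the motivation the abstract gives for attending over MSAs built with different E-value cutoffs.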
- Published
- 2021