
Scaling Up ESM2 Architectures for Long Protein Sequences Analysis: Long and Quantized Approaches

Authors:
de Oliveira, Gabriel Bianchin
Pedrini, Helio
Dias, Zanoni
Publication Year:
2025

Abstract

Various approaches utilizing Transformer architectures have achieved state-of-the-art results in Natural Language Processing (NLP). Building on this success, numerous architectures have been proposed for other types of data, such as in biology, particularly for protein sequences. Notable among these are the ESM2 architectures, pre-trained on billions of proteins, which form the basis of various state-of-the-art approaches in the field. However, the ESM2 architectures have a limitation on input size, restricting inputs to 1,022 amino acids and necessitating preprocessing techniques to handle sequences longer than this limit. In this paper, we present long and quantized versions of the ESM2 architectures, doubling the input size limit to 2,048 amino acids.
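The abstract does not specify which preprocessing technique is used to handle sequences beyond the 1,022-residue limit; a common workaround in practice is an overlapping sliding window. The following minimal Python sketch illustrates that idea under this assumption; the function name and the max_len and overlap parameters are illustrative, not taken from the paper.

    from typing import List

    def window_sequence(sequence: str, max_len: int = 1022, overlap: int = 256) -> List[str]:
        # Split a protein sequence into overlapping windows that each fit
        # within the assumed 1,022-amino-acid input limit of the original
        # ESM2 models. Short sequences pass through unchanged.
        if len(sequence) <= max_len:
            return [sequence]
        step = max_len - overlap  # stride between consecutive windows
        windows = []
        for start in range(0, len(sequence), step):
            windows.append(sequence[start:start + max_len])
            if start + max_len >= len(sequence):
                break  # last window already reaches the end of the sequence
        return windows

    # Example: a hypothetical 3,000-residue sequence becomes four windows
    # of lengths 1022, 1022, 1022, and 702.
    long_seq = "M" + "A" * 2999
    for w in window_sequence(long_seq):
        print(len(w))

Each window could then be embedded independently and the per-window representations pooled (e.g., averaged) into a whole-sequence embedding; the long 2,048-residue variants presented in the paper reduce the need for this kind of workaround.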

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2501.07747
Document Type:
Working Paper
Full Text:
https://doi.org/10.5753/bsb.2024.244804