
Quantum-Enhanced Attention Mechanism in NLP: A Hybrid Classical-Quantum Approach

Authors:
Tomal, S. M. Yousuf Iqbal
Shafin, Abdullah Al
Bhattacharjee, Debojit
Amin, MD. Khairul
Shahir, Rafiad Sadat
Publication Year:
2025

Abstract

Transformer-based models have achieved remarkable results in natural language processing (NLP) tasks such as text classification and machine translation. However, their computational complexity and resource demands pose challenges for scalability and accessibility. This research proposes a hybrid quantum-classical transformer model that integrates a quantum-enhanced attention mechanism to address these limitations. By leveraging quantum kernel similarity and variational quantum circuits (VQC), the model captures intricate token dependencies while improving computational efficiency. Experimental results on the IMDb dataset demonstrate that the quantum-enhanced model outperforms the classical baseline across all key metrics, achieving a 1.5-percentage-point improvement in accuracy (65.5% vs. 64%), along with gains in precision, recall, and F1 score. Statistical significance tests validate these improvements, highlighting the robustness of the quantum approach. These findings illustrate the potential of quantum-enhanced attention mechanisms in optimizing NLP architectures for real-world applications.

Comment: 23 pages, 9 figures, 5 tables
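The core idea summarized above, replacing dot-product attention scores with a quantum kernel similarity between encoded tokens, can be illustrated with a minimal classical simulation. The sketch below is not the authors' implementation: it assumes a simple angle-encoding feature map (each feature becomes a single-qubit rotation) and uses the state-fidelity kernel |⟨ψ(x)|ψ(y)⟩|² as the pairwise attention score, simulated directly with NumPy rather than on quantum hardware; the function names and the row-normalization scheme are illustrative choices.

```python
import numpy as np

def angle_encode(x):
    # Angle-encoding feature map: each feature x_j becomes a single-qubit
    # rotation, giving the product state ⊗_j (cos(x_j/2)|0> + sin(x_j/2)|1>).
    state = np.array([1.0])
    for xj in x:
        qubit = np.array([np.cos(xj / 2.0), np.sin(xj / 2.0)])
        state = np.kron(state, qubit)
    return state

def quantum_kernel(x, y):
    # Fidelity kernel |<psi(x)|psi(y)>|^2 between two encoded tokens.
    return float(np.abs(angle_encode(x) @ angle_encode(y)) ** 2)

def kernel_attention(tokens, values):
    # tokens: (n, d) token embeddings; values: (n, d_v) value vectors.
    # Kernel similarities play the role of attention scores; rows are
    # normalized to sum to 1 (a softmax could be used instead).
    n = len(tokens)
    scores = np.array([[quantum_kernel(tokens[i], tokens[j])
                        for j in range(n)] for i in range(n)])
    weights = scores / scores.sum(axis=1, keepdims=True)
    return weights @ values

# Example: three 2-feature tokens attending over identity values,
# so the output rows are exactly the attention weight distributions.
tokens = np.array([[0.1, 0.2], [1.0, 0.5], [0.4, 0.9]])
attended = kernel_attention(tokens, np.eye(3))
```

In the paper's hybrid setup these kernel evaluations would come from a parameterized quantum circuit (the VQC) rather than a fixed encoding; the simulation above only shows where the quantum similarity slots into the attention computation.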

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2501.15630
Document Type:
Working Paper