Communication-efficient Quantum Algorithm for Distributed Machine Learning
- Publication Year : 2022
Abstract
- The growing demands of remote detection and the increasing amount of training data make distributed machine learning under communication constraints a critical issue. This work provides a communication-efficient quantum algorithm that tackles two traditional machine learning problems, least-squares fitting and softmax regression, in the scenario where the data set is distributed across two parties. Our quantum algorithm finds the model parameters with a communication complexity of $O(\frac{\log_2(N)}{\epsilon})$, where $N$ is the number of data points and $\epsilon$ is the bound on parameter errors. Compared with classical algorithms and other quantum algorithms that accomplish the same task, our algorithm offers a communication advantage in its scaling with the data volume. The building blocks of our algorithm, the quantum-accelerated estimation of the distributed inner product and Hamming distance, can be further applied to various tasks in distributed machine learning to accelerate communication.
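- For a concrete picture of the distributed inner-product primitive the abstract mentions, below is a minimal classical sketch, not the paper's quantum protocol: party A simply ships its vector to party B, which costs N communicated values and serves as the naive baseline that the quantum routine improves to $O(\frac{\log_2(N)}{\epsilon})$. The function name and two-party setup are illustrative assumptions.

    # Illustrative classical baseline for the distributed inner-product task
    # described in the abstract (NOT the paper's quantum protocol).
    # Party A holds vector x, party B holds vector y; estimating <x, y>
    # classically by sending x in full costs O(N) communication, which is the
    # scaling the quantum algorithm improves upon.

    import numpy as np

    def naive_distributed_inner_product(x: np.ndarray, y: np.ndarray) -> tuple[float, int]:
        """Party A sends x verbatim to party B, who computes the inner product.

        Returns the inner product and the number of real values communicated.
        """
        communicated_values = x.size          # every entry of x crosses the channel
        inner_product = float(np.dot(x, y))   # computed locally by party B
        return inner_product, communicated_values

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        N = 1024
        x = rng.normal(size=N)   # data held by party A
        y = rng.normal(size=N)   # data held by party B
        value, cost = naive_distributed_inner_product(x, y)
        print(f"inner product = {value:.4f}, classical communication = {cost} values")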
- Subjects : Quantum Physics
Details
- Database : arXiv
- Publication Type : Report
- Accession number : edsarx.2209.04888
- Document Type : Working Paper
- Full Text : https://doi.org/10.1103/PhysRevLett.130.150602