Byzantine Fault Tolerant Distributed Stochastic Gradient Descent Based on Over-the-Air Computation.
- Source :
- IEEE Transactions on Communications; May2022, Vol. 70 Issue 5, p3204-3219, 16p
- Publication Year :
- 2022
Abstract
- Wireless distributed machine learning is envisaged to facilitate advanced learning services and applications in wireless networks consisting of devices with limited computing capability. Distributed machine learning algorithms are more vulnerable in wireless systems since the information exchange required for learning is limited by wireless resources and channel conditions. Moreover, their performance can be significantly degraded by attacks from Byzantine devices, and information distorted by channel fading can itself act like a Byzantine attack. Consequently, protecting wireless distributed machine learning from Byzantine devices is paramount. Leveraging over-the-air computation, we put forth a novel wireless distributed stochastic gradient descent system that is resilient to Byzantine attacks. The proposed learning system is underpinned by two novel and distinct features that enable more accurate and faster distributed machine learning resilient to Byzantine attacks: collecting training data at the parameter server (PS) so that it can obtain its own training results, and grouping the distributed devices. We derive upper bounds on the mean square error of the global parameter when the proposed algorithms are used with and without Byzantine devices, and prove the convergence of the proposed algorithms with the derived bounds. The effectiveness of the algorithms is validated in terms of accuracy and convergence speed. [ABSTRACT FROM AUTHOR]
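To make the abstract's two key ideas concrete, the following is a minimal simulation sketch, not the paper's actual algorithm: device gradients are aggregated group-wise (mimicking over-the-air analog superposition, where the PS receives only each group's sum), and the PS uses a gradient from its own local training data to screen out group averages contaminated by Byzantine devices. All names, group sizes, thresholds, and the nearest-groups screening rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression task: learn w_true from noisy local datasets.
d = 5
w_true = np.arange(1.0, 6.0)

def make_data(n):
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    return X, y

def grad(w, X, y):
    # Full-batch least-squares gradient on one device's local data.
    return 2.0 * X.T @ (X @ w - y) / len(y)

n_devices, n_groups, n_byz = 12, 4, 2
device_data = [make_data(20) for _ in range(n_devices)]
byzantine = set(range(n_byz))   # hypothetical: the first 2 devices attack
ps_data = make_data(20)         # the PS collects its own small training set

w = np.zeros(d)
for step in range(200):
    # Each device computes a gradient; Byzantine devices send garbage.
    g = [rng.normal(scale=50.0, size=d) if i in byzantine
         else grad(w, *device_data[i]) for i in range(n_devices)]
    # Over-the-air computation: devices in a group transmit simultaneously,
    # so the PS observes only each group's superposed (averaged) gradient.
    groups = np.array_split(np.arange(n_devices), n_groups)
    group_avg = [np.mean([g[i] for i in idx], axis=0) for idx in groups]
    # The PS's own gradient serves as a trusted reference to screen groups:
    # drop the group average farthest from it (assumed screening rule).
    g_ps = grad(w, *ps_data)
    dists = [np.linalg.norm(ga - g_ps) for ga in group_avg]
    keep = np.argsort(dists)[: n_groups - 1]
    update = np.mean([group_avg[k] for k in keep] + [g_ps], axis=0)
    w -= 0.05 * update

print(np.round(w, 2))  # close to w_true despite 2 Byzantine devices
```

Grouping limits each attacker's influence to one received (superposed) signal, and the PS-side reference gradient gives a ground truth to screen against, which is why the two features together improve robustness in the abstract's setting.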
Details
- Language :
- English
- ISSN :
- 0090-6778
- Volume :
- 70
- Issue :
- 5
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Communications
- Publication Type :
- Academic Journal
- Accession number :
- 156931652
- Full Text :
- https://doi.org/10.1109/TCOMM.2022.3162576