Distributed optimization for deep learning with gossip exchange
- Source :
- Neurocomputing, Elsevier, 2019, 330, pp. 287-296. ⟨10.1016/j.neucom.2018.11.002⟩
- Publication Year :
- 2019
- Publisher :
- Elsevier BV, 2019.
Abstract
- We address the issue of speeding up the training of convolutional neural networks by studying a distributed method adapted to stochastic gradient descent. Our parallel optimization setup uses several threads, each applying individual gradient descents on a local variable. We propose a new way of sharing information between the threads, based on gossip algorithms that exhibit good consensus convergence properties. Our method, called GoSGD, has the advantage of being fully asynchronous and decentralized.
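- The abstract outlines the algorithmic idea: each thread runs its own SGD on a local copy of the parameters and occasionally shares them with a randomly chosen peer through a gossip exchange, with no central server and no synchronization barrier. The sketch below illustrates that pattern on a toy quadratic loss; the loss function, the exchange probability `P_EXCHANGE`, the equal-weight (0.5/0.5) mixing rule, and the thread/queue layout are assumptions made for illustration only, not the exact GoSGD protocol from the paper.

```python
# Illustrative sketch of asynchronous, decentralized gossip-averaged SGD.
# NOT the paper's exact GoSGD protocol: the loss, mixing weights, and
# exchange probability below are assumptions chosen for a runnable demo.
import numpy as np
import queue
import threading

N_WORKERS = 4
STEPS = 500
LR = 0.05
P_EXCHANGE = 0.1                  # assumed probability of gossiping after a local step
TARGET = np.array([3.0, -1.0])    # toy quadratic loss: ||x - TARGET||^2

inboxes = [queue.Queue() for _ in range(N_WORKERS)]   # one mailbox per worker
results = [None] * N_WORKERS

def worker(rank, rng):
    x = rng.normal(size=2)        # local variable (the worker's own model copy)
    for _ in range(STEPS):
        # Asynchronously mix in any parameters pushed by peers since the last step.
        while not inboxes[rank].empty():
            peer_x = inboxes[rank].get_nowait()
            x = 0.5 * (x + peer_x)            # assumed equal-weight mixing rule
        # Local stochastic gradient step on a noisy quadratic loss.
        grad = 2.0 * (x - TARGET) + rng.normal(scale=0.1, size=2)
        x = x - LR * grad
        # Gossip exchange: with some probability, push the current weights
        # to one randomly chosen peer (no central server, no barrier).
        if rng.random() < P_EXCHANGE:
            peer = rng.choice([i for i in range(N_WORKERS) if i != rank])
            inboxes[peer].put(x.copy())
    results[rank] = x

threads = [
    threading.Thread(target=worker, args=(r, np.random.default_rng(r)))
    for r in range(N_WORKERS)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("per-worker estimates after gossip SGD:")
for r, x in enumerate(results):
    print(f"  worker {r}: {np.round(x, 3)}")
```

- Running the script prints each worker's final estimate; the occasional gossip exchanges keep the per-thread local variables close to consensus while every thread keeps taking gradient steps asynchronously, which is the behaviour the abstract attributes to the method.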
- Subjects :
- Optimization, Stochastic gradient descent, Distributed gradient descent, Gossip, Asynchronous communication, Convolutional neural network, Artificial neural network, Neural networks, Deep learning, Artificial intelligence, Local variable, Computer science, Theoretical computer science, Cognitive Neuroscience, Computer Science Applications, [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE], [INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing
Details
- ISSN :
- 0925-2312
- Volume :
- 330
- Database :
- OpenAIRE
- Journal :
- Neurocomputing
- Accession number :
- edsair.doi.dedup.....ff2638e51048eb8b20abfcaf40bf3d18
- Full Text :
- https://doi.org/10.1016/j.neucom.2018.11.002