Speeding Up GDL-Based Message Passing Algorithms for Large-Scale DCOPs
- Author
Khan, Md Mosaddek; Tran-Thanh, Long; Ramchurn, Sarvapali D.; Jennings, Nicholas R.
- Subjects
MESSAGE passing (Computer science); MULTIAGENT systems; GREEDY algorithms; COST functions; WIRELESS localization
- Abstract
This paper develops a new approach to speeding up Generalized Distributive Law (GDL) based message passing algorithms used to solve large-scale Distributed Constraint Optimization Problems (DCOPs) in multi-agent systems. In particular, we significantly reduce the computation and communication costs, and hence the convergence time, of algorithms such as Max-Sum, Bounded Max-Sum, Fast Max-Sum, Bounded Fast Max-Sum, BnB Max-Sum, BnB Fast Max-Sum and Generalized Fast Belief Propagation. This matters because the outcome of such algorithms often becomes outdated or unusable if the optimization process takes too long. The problem is especially severe and commonplace when the algorithm has to deal with a large number of agents, tasks and resources, which limits its practical scalability; in other words, an optimization algorithm can only be deployed in larger systems if its completion time can be reduced. However, it is challenging to minimize the completion time while maintaining solution quality. Considering this trade-off, we propose a generic message passing protocol for GDL-based algorithms that combines clustering with domain pruning, together with a regression method that determines the appropriate number of clusters for a given scenario. We empirically evaluate our method in a number of settings and find that it reduces the completion time by around 37-85% (1.6-6.5 times faster) for 100-900 nodes, and by around 47-91% (1.9-11 times faster) for 3,000-10,000 nodes, compared to the current state of the art.
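The protocol summarized above partitions the agents into clusters and uses a regression step to choose a suitable number of clusters; the Python sketch below illustrates only that generic idea, not the paper's actual implementation. All function names, features and numbers are illustrative assumptions: agents are grouped by position with k-means, and a polynomial fit over hypothetical tuning runs maps problem size to a cluster count.

    # Illustrative sketch only (assumed names and data), not the authors' protocol.
    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_agents(positions, k):
        """Group agent positions (n x 2 array) into k clusters; returns a label per agent."""
        return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)

    def fit_cluster_count_model(problem_sizes, best_ks, degree=2):
        """Fit a polynomial regression mapping problem size -> empirically best cluster count."""
        return np.polyfit(problem_sizes, best_ks, deg=degree)

    def predict_cluster_count(model, n_agents):
        """Predict a suitable number of clusters for a new problem with n_agents agents."""
        return max(1, int(round(np.polyval(model, n_agents))))

    # Hypothetical results from earlier tuning runs: problem size vs. best cluster count.
    sizes  = np.array([100, 300, 500, 700, 900])
    best_k = np.array([4, 8, 12, 15, 18])
    model  = fit_cluster_count_model(sizes, best_k)

    # New problem instance: 600 agents placed uniformly at random in a 100x100 area.
    rng = np.random.default_rng(0)
    positions = rng.uniform(0, 100, size=(600, 2))
    k = predict_cluster_count(model, len(positions))   # regression picks k for this size
    labels = cluster_agents(positions, k)              # each agent gets a cluster id
    print(f"k = {k}, cluster sizes = {np.bincount(labels)}")

In this sketch, message passing would then run within each cluster in parallel, which is one plausible way the completion-time savings reported in the abstract could arise; the domain-pruning component is not shown.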
- Published
- 2018