
Decentralized Control of Partially Observable Markov Decision Processes using Belief Space Macro-actions

Authors:
Omidshafiei, Shayegan
Agha-mohammadi, Ali-akbar
Amato, Christopher
How, Jonathan P.
Publication Year:
2015

Abstract

The focus of this paper is on solving multi-robot planning problems in continuous spaces with partial observability. Decentralized partially observable Markov decision processes (Dec-POMDPs) are general models for multi-robot coordination problems, but representing and solving Dec-POMDPs is often intractable for large problems. To allow for a high-level representation that is natural for multi-robot problems and scalable to large discrete and continuous problems, this paper extends the Dec-POMDP model to the decentralized partially observable semi-Markov decision process (Dec-POSMDP). The Dec-POSMDP formulation allows asynchronous decision-making by the robots, which is crucial in multi-robot domains. We also present an algorithm for solving this Dec-POSMDP which is much more scalable than previous methods since it can incorporate closed-loop belief space macro-actions in planning. These macro-actions are automatically constructed to produce robust solutions. The proposed method's performance is evaluated on a complex multi-robot package delivery problem under uncertainty, showing that our approach can naturally represent multi-robot problems and provide high-quality solutions for large-scale problems.
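The belief-space macro-actions described above are built on top of belief tracking: each robot maintains a probability distribution over states and updates it as it acts and observes. As a minimal illustration of that core operation (not the paper's algorithm), the following is a sketch of a discrete Bayes-filter belief update; all model names and values are illustrative assumptions.

```python
import numpy as np

def belief_update(b, T, O, a, o):
    """Update belief b after taking action a and receiving observation o.

    b: (S,) prior belief over states
    T: (A, S, S) transition model, T[a, s, s'] = P(s' | s, a)
    O: (A, S, Obs) observation model, O[a, s', o] = P(o | s', a)
    """
    predicted = b @ T[a]              # predict: sum_s P(s'|s,a) b(s)
    unnorm = predicted * O[a, :, o]   # correct: weight by P(o|s',a)
    return unnorm / unnorm.sum()      # normalize to a distribution

# Toy 2-state, 1-action example: a noisy sensor sharpens the belief.
T = np.array([[[0.9, 0.1],
               [0.1, 0.9]]])         # states tend to persist
O = np.array([[[0.8, 0.2],
               [0.2, 0.8]]])         # observation matches state 80% of the time
b0 = np.array([0.5, 0.5])            # uniform prior
b1 = belief_update(b0, T, O, a=0, o=0)  # belief shifts toward state 0
```

A closed-loop macro-action in this setting would repeatedly apply such an update and choose its next primitive action from the resulting belief, terminating when a belief-space goal condition is met.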

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1502.06030
Document Type:
Working Paper