Improving I/O Performance for Exascale Applications Through Online Data Layout Reorganization
- Author
Wan, Lipeng, Huebl, Axel, Gu, Junmin, Poeschel, Franz, Gainaru, Ana, Wang, Ruonan, Chen, Jieyang, Liang, Xin, Ganyushin, Dmitry, Munson, Todd, Foster, Ian, Vay, Jean-Luc, Podhorszki, Norbert, Wu, Kesheng, and Klasky, Scott
- Subjects
Information and Computing Sciences, Applied Computing, Layout, Arrays, Heuristic algorithms, Computational modeling, Performance evaluation, Optimization, Distributed databases, Parallel IO, data layout, IO performance, WarpX, data access optimization, Computer Software, Distributed Computing, Communications Technologies, Distributed computing and systems software
- Abstract
The applications being developed within the U.S. Exascale Computing Project (ECP) to run on imminent Exascale computers will generate scientific results with unprecedented fidelity and record turn-around time. Many of these codes are based on particle-mesh methods and use advanced algorithms, especially dynamic load-balancing and mesh-refinement, to achieve high performance on Exascale machines. Yet, as such algorithms improve parallel application efficiency, they raise new challenges for I/O logic due to their irregular and dynamic data distributions. Thus, while the enormous data rates of Exascale simulations already challenge existing file system write strategies, the need for efficient reading and processing of the generated data introduces additional constraints on the data layout strategies that can be used when writing data to secondary storage. We review these I/O challenges and introduce two online data layout reorganization approaches for achieving good tradeoffs between read and write performance. We demonstrate the benefits of these two approaches for the ECP particle-in-cell simulation WarpX, which serves as a motif for a large class of important Exascale applications. We show that by understanding application I/O patterns and carefully designing data layouts, we can increase read performance by more than 80 percent.
- Published
2022
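The abstract's central claim, that writing chunks in a layout matched to expected read patterns can sharply reduce read cost, can be illustrated with a small sketch. This is not the paper's implementation: the function names, the chunk-index scheme, and the use of chunk id as a stand-in for a spatial ordering key are all hypothetical. It only models how a writer might reorder irregularly sized per-rank chunks online, so that a later read of a logically contiguous subset touches one contiguous file region instead of several scattered ones.

```python
# Hypothetical sketch (not the paper's method): compare how many contiguous
# file regions a reader must touch under an as-produced layout versus an
# online-reorganized layout.

def write_unordered(chunks):
    """As-produced layout: chunks land in arbitrary rank order.
    Returns (flat_data, index) mapping chunk id -> (offset, length)."""
    flat, index, offset = [], {}, 0
    for cid, data in chunks:
        index[cid] = (offset, len(data))
        flat.extend(data)
        offset += len(data)
    return flat, index

def write_reorganized(chunks):
    """Online reorganization: order chunks by a read-friendly key
    (here, chunk id stands in for a spatial position key) before writing."""
    return write_unordered(sorted(chunks, key=lambda c: c[0]))

def regions_to_read(index, wanted_ids):
    """Count the contiguous file regions needed to read the wanted chunks."""
    regions, prev_end = 0, None
    for off, length in sorted(index[c] for c in wanted_ids):
        if off != prev_end:      # a gap forces a new seek/read region
            regions += 1
        prev_end = off + length
    return regions

# Chunks arrive out of order with irregular sizes, as happens under
# dynamic load balancing and mesh refinement.
produced = [(0, [0] * 2), (3, [3] * 5), (1, [1] * 3), (2, [2] * 7)]
wanted = [0, 1, 2]  # reader requests a logically contiguous subset

_, idx_raw = write_unordered(produced)
_, idx_org = write_reorganized(produced)
print(regions_to_read(idx_raw, wanted))  # 2 scattered regions
print(regions_to_read(idx_org, wanted))  # 1 contiguous region
```

The write-side cost of the reorganization is the sort (or, more realistically, a shuffle of chunks between writers), which is the read/write tradeoff the abstract refers to.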