
From MPI to MPI+OpenACC: Conversion of a legacy FORTRAN PCG solver for the spherical Laplace equation

Ronald M. Caplan, Zoran Mikic, Jon A. Linker
Predictive Science Inc., 9990 Mesa Rim Road Suite 170, San Diego, CA 92121
arXiv:1709.01126 [cs.MS] (4 Sep 2017)

@article{caplan2017from,
   title={From MPI to MPI+OpenACC: Conversion of a legacy FORTRAN PCG solver for the spherical Laplace equation},
   author={Caplan, Ronald M. and Mikic, Zoran and Linker, Jon A.},
   year={2017},
   month={sep},
   eprint={1709.01126},
   archivePrefix={arXiv},
   primaryClass={cs.MS}
}

A real-world example of adding OpenACC to a legacy MPI FORTRAN Preconditioned Conjugate Gradient code is described, and timing results for multi-node multi-GPU runs are shown. The code is used to obtain three-dimensional spherical solutions to the Laplace equation. Its application is finding potential field solutions of the solar corona, a useful tool in space weather modeling. We highlight key tips, strategies, and challenges faced when adding OpenACC, including linking FORTRAN code to the cuSparse library, using CUDA-aware MPI, maintaining portability, and dealing with multi-node, multi-GPU run-time environments. Timing results are shown for the code running with MPI-only (up to 1728 CPU cores) and with MPI+OpenACC (up to 64 NVIDIA P100 GPUs). Performance portability is also addressed, including results using MPI+OpenACC for multi-core x86 CPUs.
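To make the approach concrete, here is a minimal, hypothetical sketch (not taken from the paper's solver) of two ingredients the abstract mentions: an OpenACC-directive-parallelized loop and reduction of the kind used inside a PCG iteration, and a CUDA-aware MPI exchange in which !$acc host_data use_device hands device addresses directly to MPI_Sendrecv. The program, variable, and buffer names (pcg_openacc_sketch, x, p, r, sbuf, rbuf) are illustrative assumptions, not identifiers from the actual code.

! Minimal sketch (assumed names, not the paper's solver): an OpenACC-accelerated
! AXPY update and dot-product reduction as they might appear in a PCG iteration,
! plus a CUDA-aware MPI exchange that passes device pointers via host_data.
program pcg_openacc_sketch
  use mpi
  implicit none
  integer, parameter :: n = 1024
  integer :: ierr, rank, nprocs, left, right, i
  real(8) :: alpha, rdotr
  real(8), allocatable :: x(:), p(:), r(:), sbuf(:), rbuf(:)

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
  left  = mod(rank - 1 + nprocs, nprocs)
  right = mod(rank + 1, nprocs)

  allocate(x(n), p(n), r(n), sbuf(n), rbuf(n))
  x = 0.0d0; p = 1.0d0; r = 1.0d0
  sbuf = real(rank, 8); rbuf = 0.0d0
  alpha = 0.5d0

  !$acc data copy(x, r) copyin(p, sbuf) copyout(rbuf)

  ! AXPY-style update, offloaded with a single directive.
  !$acc parallel loop present(x, p)
  do i = 1, n
     x(i) = x(i) + alpha*p(i)
  end do

  ! Dot-product reduction, as used for the PCG residual norm.
  rdotr = 0.0d0
  !$acc parallel loop present(r) reduction(+:rdotr)
  do i = 1, n
     rdotr = rdotr + r(i)*r(i)
  end do

  ! CUDA-aware MPI: host_data exposes the device addresses of sbuf/rbuf,
  ! so the exchange avoids explicit device-to-host staging (requires a
  ! CUDA-aware MPI library).
  !$acc host_data use_device(sbuf, rbuf)
  call MPI_Sendrecv(sbuf, n, MPI_REAL8, right, 0, &
                    rbuf, n, MPI_REAL8, left,  0, &
                    MPI_COMM_WORLD, MPI_STATUS_IGNORE, ierr)
  !$acc end host_data

  !$acc end data

  if (rank == 0) print *, 'local r.r =', rdotr
  call MPI_Finalize(ierr)
end program pcg_openacc_sketch

With the PGI/NVIDIA Fortran compilers, a sketch like this would typically be built through the MPI wrapper with the -acc flag enabled; the exact flags and run-time setup depend on the MPI installation and on whether it is CUDA-aware.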
