
HACC: Simulating Sky Surveys on State-of-the-Art Supercomputing Architectures

Salman Habib, Adrian Pope, Hal Finkel, Nicholas Frontiere, Katrin Heitmann, David Daniel, Patricia Fasel, Vitali Morozov, George Zagaris, Tom Peterka, Venkatram Vishwanath, Zarija Lukic, Saba Sehrish, Wei-keng Liao
High Energy Physics Division, Argonne National Laboratory, Lemont, IL 60439, USA
arXiv:1410.2805 [astro-ph.IM] (8 Oct 2014)

@article{2014arXiv1410.2805H,
   author = {{Habib}, S. and {Pope}, A. and {Finkel}, H. and {Frontiere}, N. and {Heitmann}, K. and {Daniel}, D. and {Fasel}, P. and {Morozov}, V. and {Zagaris}, G. and {Peterka}, T. and {Vishwanath}, V. and {Lukic}, Z. and {Sehrish}, S. and {Liao}, W.-k.},
   title = "{HACC: Simulating Sky Surveys on State-of-the-Art Supercomputing Architectures}",
   journal = {ArXiv e-prints},
   archivePrefix = "arXiv",
   eprint = {1410.2805},
   primaryClass = "astro-ph.IM",
   keywords = {Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - Cosmology and Nongalactic Astrophysics},
   year = 2014,
   month = oct,
   adsurl = {http://adsabs.harvard.edu/abs/2014arXiv1410.2805H},
   adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
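
The entry above can be dropped into a .bib file and cited directly; a minimal LaTeX sketch follows (the file name refs.bib is an assumption, not part of the listing):

% refs.bib contains the @article entry above, keyed 2014arXiv1410.2805H
\documentclass{article}
\begin{document}
HACC is described by Habib et al.~\cite{2014arXiv1410.2805H}.
\bibliographystyle{plain}
\bibliography{refs}
\end{document}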


Current and future surveys of large-scale cosmic structure are associated with a massive and complex datastream to study, characterize, and ultimately understand the physics behind the two major components of the ‘Dark Universe’, dark energy and dark matter. In addition, the surveys also probe primordial perturbations and carry out fundamental measurements, such as determining the sum of neutrino masses. Large-scale simulations of structure formation in the Universe play a critical role in the interpretation of the data and extraction of the physics of interest. Just as survey instruments continue to grow in size and complexity, so do the supercomputers that enable these simulations. Here we report on HACC (Hardware/Hybrid Accelerated Cosmology Code), a recently developed and evolving cosmology N-body code framework, designed to run efficiently on diverse computing architectures and to scale to millions of cores and beyond. HACC can run on all current supercomputer architectures and supports a variety of programming models and algorithms. It has been demonstrated at scale on Cell- and GPU-accelerated systems, standard multi-core node clusters, and Blue Gene systems. HACC’s design allows for ease of portability, and at the same time, high levels of sustained performance on the fastest supercomputers available. We present a description of the design philosophy of HACC, the underlying algorithms and code structure, and outline implementation details for several specific architectures. We show selected accuracy and performance results from some of the largest high resolution cosmological simulations so far performed, including benchmarks evolving more than 3.6 trillion particles.
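
The abstract's reference to the underlying algorithms points at HACC's central design choice: the gravitational force is split into a smooth long-range component, solved on a grid by a spectral particle-mesh method, and a short-range component computed by architecture-specific particle kernels (tree or particle-particle). Below is a minimal, self-contained C++ sketch of that splitting idea using the textbook P3M/Ewald erfc filter; HACC's production code instead fits the filtered short-range kernel with a tuned polynomial, and every name here (Particle, shortRangeForce, rSplit) is illustrative rather than taken from the code.

// Sketch of P3M-style force splitting: the erfc filter damps the
// Newtonian 1/r^2 force at large r, leaving the smooth remainder to a
// grid-based long-range (particle-mesh) solver, which is omitted here.
// Units: G = 1.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Particle {
    double x, y, z, mass;   // position and mass
    double fx, fy, fz;      // accumulated short-range force
};

// Accumulate the short-range part of gravity by direct summation.
// Differentiating the short-range potential erfc(a*r)/r gives the
// damping factor applied to the bare 1/r^2 force below.
void shortRangeForce(std::vector<Particle>& p, double rSplit, double rMax) {
    const double pi = std::acos(-1.0);
    const double a = 2.0 / rSplit;  // inverse splitting scale
    for (std::size_t i = 0; i < p.size(); ++i) {
        for (std::size_t j = i + 1; j < p.size(); ++j) {
            const double dx = p[j].x - p[i].x;
            const double dy = p[j].y - p[i].y;
            const double dz = p[j].z - p[i].z;
            const double r2 = dx * dx + dy * dy + dz * dz;
            if (r2 == 0.0 || r2 > rMax * rMax) continue;  // short-range cutoff
            const double r = std::sqrt(r2);
            const double damp = std::erfc(a * r)
                              + (2.0 * a * r / std::sqrt(pi)) * std::exp(-a * a * r2);
            const double fmag = p[i].mass * p[j].mass * damp / r2;
            // Equal and opposite forces along the separation vector.
            p[i].fx += fmag * dx / r;  p[j].fx -= fmag * dx / r;
            p[i].fy += fmag * dy / r;  p[j].fy -= fmag * dy / r;
            p[i].fz += fmag * dz / r;  p[j].fz -= fmag * dz / r;
        }
    }
}

int main() {
    std::vector<Particle> p = {{0, 0, 0, 1, 0, 0, 0},
                               {1, 0, 0, 1, 0, 0, 0},
                               {0, 2, 0, 1, 0, 0, 0}};
    shortRangeForce(p, /*rSplit=*/3.0, /*rMax=*/10.0);
    for (const auto& q : p)
        std::printf("f = (%g, %g, %g)\n", q.fx, q.fy, q.fz);
    return 0;
}

The same decomposition is what lets the framework swap short-range implementations per architecture (e.g., GPU particle-particle kernels versus CPU tree walks) while keeping the long-range FFT-based solver unchanged.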
