
The Astrophysical Multipurpose Software Environment

F.I. Pelupessy, A. van Elteren, N. de Vries, S.L.W. McMillan, N. Drost, S.F. Portegies Zwart
Leiden Observatory, Leiden University, PO Box 9513, 2300 RA, Leiden, The Netherlands
arXiv:1307.3016 [astro-ph.IM] (11 Jul 2013)

@article{2013arXiv1307.3016P,
   author = {{Pelupessy}, F.~I. and {van Elteren}, A. and {de Vries}, N. and {McMillan}, S.~L.~W. and {Drost}, N. and {Portegies Zwart}, S.~F.},
   title = "{The Astrophysical Multipurpose Software Environment}",
   journal = {ArXiv e-prints},
   archivePrefix = "arXiv",
   eprint = {1307.3016},
   primaryClass = "astro-ph.IM",
   keywords = {Astrophysics - Instrumentation and Methods for Astrophysics},
   year = 2013,
   month = jul,
   adsurl = {http://adsabs.harvard.edu/abs/2013arXiv1307.3016P},
   adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

We present the open source Astrophysical Multipurpose Software Environment (AMUSE, www.amusecode.org), a component library for performing astrophysical simulations that span different physical domains and scales. It couples existing codes within a Python framework built on an MPI communication layer. The interfaces are standardized per domain, and their MPI-based implementation makes the whole framework well suited to distributed computation. The framework includes facilities for unit handling and data storage, and currently provides codes for gravitational dynamics, stellar evolution, hydrodynamics and radiative transfer; within each domain the interfaces to the codes are kept as similar as possible. We describe the design and implementation of AMUSE as well as the main components and community codes currently supported, discuss the code interactions the framework facilitates, and demonstrate how AMUSE can be used to solve complex astrophysical problems by presenting example applications.
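
As an illustration of the workflow the abstract describes, the following is a minimal sketch of an AMUSE script: it builds a Plummer-sphere star cluster with physical units attached, evolves it with the BHTree gravitational dynamics code (spawned as a separate MPI worker process), and copies the results back through a channel. It assumes the AMUSE Python API as documented at www.amusecode.org; it is not taken from the paper, and module paths may differ between AMUSE versions.

# Minimal AMUSE sketch; assumes a working AMUSE installation.
from amuse.units import units, nbody_system
from amuse.ic.plummer import new_plummer_model
from amuse.community.bhtree.interface import BHTree

# Unit converter: maps the code's internal N-body units onto physical units.
converter = nbody_system.nbody_to_si(100.0 | units.MSun, 1.0 | units.parsec)

# 100-particle Plummer model; the "|" operator attaches units to values.
stars = new_plummer_model(100, convert_nbody=converter)

# Start the community code (run as an MPI worker) and hand it the particles.
gravity = BHTree(converter)
gravity.particles.add_particles(stars)

# A channel synchronizes the in-code particle set with the framework-side copy.
channel = gravity.particles.new_channel_to(stars)

gravity.evolve_model(1.0 | units.Myr)  # advance the dynamics to t = 1 Myr
channel.copy()                         # pull updated positions and velocities
gravity.stop()                         # shut down the worker process

print(stars.center_of_mass().in_(units.parsec))

Because the interfaces are standardized per domain, swapping BHTree for another gravity module (for example the Hermite community code) would, in this sketch, change only the import and the constructor line.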