Exploring the Millennium Run – Scalable Rendering of Large-Scale Cosmological Datasets
Computer Graphics and Visualization Group, Technische Universität München, Munich, Germany
IEEE Transactions on Visualization and Computer Graphics, 2009
@article{fraedrich2009exploring,
title={Exploring the Millennium Run – Scalable Rendering of Large-Scale Cosmological Datasets},
author={Fraedrich, R. and Schneider, J. and Westermann, R.},
journal={IEEE Transactions on Visualization and Computer Graphics},
pages={1251--1258},
year={2009},
publisher={IEEE Computer Society}
}
In this paper we investigate scalability limitations in the visualization of large-scale particle-based cosmological simulations, and we present methods to reduce these limitations on current PC architectures. To minimize the amount of data to be streamed from disk to the graphics subsystem, we propose a visually continuous level-of-detail (LOD) particle representation based on a hierarchical quantization scheme for particle coordinates and rules for generating coarse particle distributions. Given the maximal world-space error per level, our LOD selection technique guarantees a sub-pixel screen-space error during rendering. A brick-based page tree further reduces the number of disk seek operations to be performed. Additional particle quantities such as density, velocity dispersion, and radius are compressed with no visible loss using vector quantization of logarithmically encoded floating-point values. Fine-grained view-frustum culling and presence acceleration in a geometry shader significantly reduce the required geometry throughput on the GPU. We validate the quality and scalability of our method by presenting visualizations of a particle-based cosmological dark-matter simulation exceeding 10 billion elements.
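The sub-pixel error guarantee mentioned in the abstract can be illustrated with a short sketch. This is not the authors' implementation; it only shows, under a standard perspective-projection assumption, how a per-level maximal world-space error might be projected to screen space and used to pick the coarsest level whose error stays below one pixel. All function and parameter names here are illustrative.

```python
import math

def projected_error_px(world_error, distance, fov_y_rad, viewport_height_px):
    """Project a world-space error at a given view distance to pixels.

    Uses the standard pinhole model: at distance d, the viewport spans
    2 * d * tan(fov_y / 2) world units vertically.
    """
    pixels_per_world_unit = viewport_height_px / (
        2.0 * distance * math.tan(fov_y_rad / 2.0))
    return world_error * pixels_per_world_unit

def select_lod(level_errors, distance,
               fov_y_rad=math.radians(60.0), viewport_height_px=1080):
    """Pick the coarsest LOD level with sub-pixel screen-space error.

    level_errors: maximal world-space error per level, ordered coarse
    to fine (a hypothetical stand-in for the paper's per-level bounds).
    """
    for level, err in enumerate(level_errors):
        if projected_error_px(err, distance, fov_y_rad,
                              viewport_height_px) < 1.0:
            return level
    # No level satisfies the bound: fall back to the finest one.
    return len(level_errors) - 1
```

For a distant view the coarse levels already satisfy the bound, while a closer camera forces finer levels, which is the behavior a visually continuous LOD scheme relies on.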
September 4, 2011 by hgpu