An Improved Monte Carlo Ray Tracing for Large-Scale Rendering in Hadoop
Key Laboratory for Embedded and Network Computing of Hunan Province, Hunan University, Changsha, China
International Conference on Computer Science and Service System (CSSS 2014), 2014
@inproceedings{li2014improved,
title={An Improved Monte Carlo Ray Tracing for Large-Scale Rendering in Hadoop},
author={Li, Rui},
booktitle={International Conference on Computer Science and Service System (CSSS)},
year={2014}
}
Improving the performance of large-scale rendering requires not only a well-organized data structure but also fewer disk and network accesses, especially when realistic visual effects are the goal. This paper presents an optimization method for global illumination rendering of large datasets. We improved a previous rendering algorithm based on Monte Carlo ray tracing and scheduling grids, and reduced remote reads by reorganizing the original data with locality and coherence in mind. We implemented the rendering system on a Hadoop cluster of commodity PCs without high-end hardware. The large scene data are processed in splits by the MapReduce framework, which increases scalability and reliability. The results show that our algorithm, which schedules rays for each data split, scales to large scenes and requires fewer reads and less rendering time than previous work.
June 17, 2014 by hgpu
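The abstract describes scheduling rays against each scene data split inside a MapReduce job. As a rough illustration of that idea only (not the authors' implementation), the sketch below assumes a hypothetical text input format in which each record carries a pixel ID together with a partial radiance value obtained by tracing a ray batch against the local scene split; the mapper emits per-pixel partial contributions and the reducer sums them. All class names, the record layout, and the placeholder tracing step are assumptions made for illustration.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/**
 * Illustrative sketch (not the paper's code): rays are scheduled per scene
 * data split, each map task handles only the rays assigned to its local
 * split, and the reduce phase accumulates partial radiance per pixel.
 */
public class SplitRayTracer {

    /** Mapper: input value = "pixelId partialRadiance" (hypothetical format). */
    public static class TraceMapper
            extends Mapper<LongWritable, Text, LongWritable, DoubleWritable> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().trim().split("\\s+");
            if (fields.length < 2) {
                return; // skip malformed records
            }
            long pixelId = Long.parseLong(fields[0]);
            // Placeholder for Monte Carlo tracing against the local scene split:
            // here the partial radiance is simply read from the record.
            double partialRadiance = Double.parseDouble(fields[1]);
            context.write(new LongWritable(pixelId), new DoubleWritable(partialRadiance));
        }
    }

    /** Reducer: sums the partial contributions gathered from all splits. */
    public static class AccumulateReducer
            extends Reducer<LongWritable, DoubleWritable, LongWritable, DoubleWritable> {

        @Override
        protected void reduce(LongWritable pixelId, Iterable<DoubleWritable> values,
                              Context context) throws IOException, InterruptedException {
            double sum = 0.0;
            for (DoubleWritable v : values) {
                sum += v.get();
            }
            context.write(pixelId, new DoubleWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "split ray tracing sketch");
        job.setJarByClass(SplitRayTracer.class);
        job.setMapperClass(TraceMapper.class);
        job.setCombinerClass(AccumulateReducer.class); // summation is associative
        job.setReducerClass(AccumulateReducer.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Because radiance accumulation is a simple associative sum, the reducer can also serve as a combiner, which mirrors the abstract's goal of reducing network traffic between rendering stages.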