Rapid Texture-based Volume Rendering

Chen Shihao, He Guiqing, Hao Chongyang
Inst. of Electron. & Inf., Northwest Poly-Tech. Univ., Xi’an, China
International Conference on Environmental Science and Information Application Technology, 2009. ESIAT 2009

@inproceedings{shihao2009rapid,
  title={Rapid Texture-based Volume Rendering},
  author={Shihao, C. and Guiqing, H. and Chongyang, H.},
  booktitle={Environmental Science and Information Application Technology, 2009. ESIAT 2009. International Conference on},
  volume={2},
  pages={575--578},
  year={2009},
  organization={IEEE}
}

Nowadays it is common, for example in medical diagnosis, to obtain large numbers of 3D data sets from different sources, but exploring the information content of these data sets remains a problem. One effective approach is computer-aided rendering of the volume. Because 3D data sets are usually large, a single CPU is not powerful enough to render them interactively. Direct volume rendering via 3D textures has established itself as an efficient tool for the display and visual analysis of volumetric scalar fields. In this paper a rapid texture-based volume visualization method is proposed. The method exploits hardware-assisted texture mapping, resampling the volume data, represented as a 3D texture, onto a stack of sampling surfaces, the so-called proxy geometry. For each texel of a slice, it performs a fetch from the 3D texture and carries out fusion and shading in a fragment shader. To speed up rendering, the integration of acceleration techniques that reduce per-fragment operations in texture-based volume rendering is also addressed. Finally, we demonstrate the effectiveness of our method on several data sets using a GeForce 8800 GTX graphics card. The results show that the proposed method can generate high-quality visual representations of 3D data sets at interactive rates.
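The slice-based pipeline sketched in the abstract — resample the volume onto proxy slices, fetch a scalar per texel, then blend the slices — can be illustrated with a minimal CPU sketch. This is a hedged illustration only: the paper performs these steps on the GPU with 3D textures and a fragment shader, and all names below (`transfer_function`, `render_pixel`) are hypothetical, not taken from the paper.

```python
def transfer_function(scalar):
    """Map a scalar sample to (color, opacity); a simple grayscale ramp."""
    return scalar, scalar * 0.5

def render_pixel(volume, x, y, num_slices):
    """Composite one pixel back to front through axis-aligned slices.

    volume: nested lists indexed as volume[z][y][x], scalars in [0, 1].
    Each slice plays the role of a textured proxy polygon: the lookup
    volume[z][y][x] stands in for the per-texel 3D-texture fetch, and the
    slices are blended with the standard over operator.
    """
    color = 0.0
    for z in reversed(range(num_slices)):      # back-to-front slice order
        sample = volume[z][y][x]               # "3D texture fetch"
        src_color, src_alpha = transfer_function(sample)
        # Over operator: new slice composited over what is behind it.
        color = src_color * src_alpha + color * (1.0 - src_alpha)
    return color

# Tiny 2x2x2 volume: one bright voxel in the back slice, empty front slice.
volume = [[[0.0, 0.0], [0.0, 0.0]],
          [[1.0, 0.0], [0.0, 0.0]]]
print(render_pixel(volume, 0, 0, 2))  # prints 0.5
```

On the GPU, the same loop is unrolled across the slice geometry: each rendered slice triggers one texture fetch plus one blend per covered fragment, which is why the acceleration techniques mentioned above target per-fragment cost.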

* * *

HGPU group © 2010-2017 hgpu.org

All rights belong to the respective authors