
Perceptual enhancement of two-level volume rendering

Andrew Corcoran, Niall Redmond, John Dingliana
Trinity College Dublin, 0.19 Lloyd Building, College Green, Dublin 2, Ireland
Computers & Graphics, Volume 34, Issue 4, August 2010, Pages 388-397 (08 April 2010)

@article{corcoran2010perceptual,
  title     = {Perceptual enhancement of two-level volume rendering},
  author    = {Corcoran, A. and Redmond, N. and Dingliana, J.},
  journal   = {Computers \& Graphics},
  volume    = {34},
  number    = {4},
  pages     = {388--397},
  issn      = {0097-8493},
  year      = {2010},
  publisher = {Elsevier}
}


We present a system for interactive visualisation of 3D volumetric medical datasets, combined with a perceptual evaluation of how such visualisations affect a user's attention and interpretation of the scene. Enhancements to traditional volume renderings are provided through a two-level volume rendering strategy that couples fast GPU-based Direct Volume Rendering (DVR) with an additional layer of perceptual cues derived from techniques in the non-photorealistic rendering (NPR) literature. The two-level approach allows us to separate the most relevant data from peripheral, extraneous detail, enabling the user to understand the visual information more effectively. Peripheral details are abstracted but sufficiently retained to provide spatial reference. We perform a number of perceptual user experiments that test how this approach affects a user's attention and ability to determine the shape of an object. Results indicate that our approach can significantly improve user perception of shape in complex visualisations, especially when the user has little or no prior knowledge of the data. Our approach would prove useful in technical, medical or scientific visualisation to improve understanding of detailed volumetric datasets.
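The abstract page does not include code; the following is a minimal CPU sketch, in Python/NumPy, of the focus-plus-context compositing idea described above: samples inside a focus mask are composited with a conventional DVR transfer function, while context samples are abstracted to a faint, view-dependent silhouette cue. All function names, parameters and constants (render_two_level, the 0.05 context opacity, the grey context colour) are illustrative assumptions, not taken from the authors' GPU implementation.

```python
import numpy as np

def central_gradient(vol, x, y, z):
    """Central-difference gradient of the scalar field at voxel (x, y, z)."""
    return 0.5 * np.array([
        vol[x + 1, y, z] - vol[x - 1, y, z],
        vol[x, y + 1, z] - vol[x, y - 1, z],
        vol[x, y, z + 1] - vol[x, y, z - 1],
    ])

def render_two_level(vol, focus_mask, tf_focus, view_dir=np.array([0.0, 0.0, 1.0])):
    """Orthographic front-to-back ray marching along +z for every (x, y) pixel.

    vol        : 3D float array of scalar values in [0, 1]
    focus_mask : boolean array, True where a voxel belongs to the focus object
    tf_focus   : callable scalar -> (rgb, alpha) used for the focus layer
    """
    nx, ny, nz = vol.shape
    image = np.zeros((nx, ny, 3))
    for ix in range(1, nx - 1):
        for iy in range(1, ny - 1):
            color = np.zeros(3)
            alpha = 0.0
            for iz in range(1, nz - 1):
                s = vol[ix, iy, iz]
                if focus_mask[ix, iy, iz]:
                    # Focus layer: conventional DVR transfer function.
                    rgb, a = tf_focus(s)
                else:
                    # Context layer: abstracted to a low-opacity grey silhouette.
                    g = central_gradient(vol, ix, iy, iz)
                    gn = np.linalg.norm(g)
                    # Silhouette cue: strongest where the local gradient is
                    # perpendicular to the viewing direction.
                    sil = 1.0 - abs(np.dot(g / gn, view_dir)) if gn > 1e-6 else 0.0
                    rgb, a = np.array([0.6, 0.6, 0.6]), 0.05 * sil
                # Front-to-back "over" compositing with early ray termination.
                color += (1.0 - alpha) * a * np.asarray(rgb)
                alpha += (1.0 - alpha) * a
                if alpha > 0.98:
                    break
            image[ix, iy] = color
    return image

# Toy usage with a random volume and a thresholded "segmentation" as focus mask.
vol = np.random.rand(32, 32, 32)
mask = vol > 0.7
img = render_two_level(vol, mask, lambda s: (np.array([1.0, 0.3, 0.2]), 0.2 * s))
```

In the paper's setting the per-voxel focus/context decision and both shading styles run in a single GPU ray-casting pass; the sketch keeps only the compositing logic to make the separation of the two layers explicit.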