Real-time Volumetric Haptic and Visual Burrhole Simulation
National Capital Area Medical Simulation Center, Uniformed Services University, Bethesda, MD
2007 IEEE Virtual Reality Conference (VR '07)
@inproceedings{acosta2007real,
title={Real-time volumetric haptic and visual burrhole simulation},
author={Acosta, E. and Liu, A.},
booktitle={2007 IEEE Virtual Reality Conference},
pages={247--250},
year={2007},
organization={IEEE}
}
This paper describes real-time volumetric haptic and visual algorithms developed to simulate burrhole creation for a virtual reality-based craniotomy surgical simulator. A modified Voxmap point-shell algorithm (McNeely et al., 1999; Renz et al., 2001) is created to simulate haptic interactions between bone-cutting tools and voxel-based bone. New surface boundary detection and force feedback calculation methods help reduce the "force discontinuities" of the original Voxmap point-shell algorithm. To maintain stable haptic update rates, new forces are calculated outside the haptics rendering loop. A multi-rate haptic solution (Cavusoglu and Tendick, 2000) is used to introduce the calculated forces into the haptics loop and to interpolate forces between updates. A bone erosion method is also created to simulate the bone drilling capabilities of different tools. 3D texture-based volume rendering is used to display the bone and to visually remove bone material due to drilling in real time. Volumetric shading is computed on the GPU. The algorithms described make it possible to simulate several tools typically used for a craniotomy. Realistic 3D models are also created from real surgical tools and are controlled by the haptic device.
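To make the haptic side of the abstract more concrete, the following C++ fragment sketches the general point-shell idea it builds on: each point of the tool's point shell is tested against a voxel map of the bone, and penetrating points contribute a penalty force along a stored surface normal scaled by penetration depth. This is only a sketch of the underlying Voxmap PointShell concept, not the paper's modified boundary detection or force calculation; all names here (`Voxmap`, `Voxel`, `pointShellForce`, `stiffness`) are assumptions made for illustration.

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Voxel {
    bool  occupied;  // bone material present at this cell
    Vec3  normal;    // surface normal, e.g. gradient of the density field
    float depth;     // approximate penetration depth at this voxel
};

struct Voxmap {
    int nx, ny, nz;               // grid dimensions
    float voxelSize;              // edge length of one voxel
    std::vector<Voxel> cells;     // nx * ny * nz voxels, x-fastest order

    const Voxel& at(Vec3 p) const {
        static const Voxel empty{false, {0.0f, 0.0f, 0.0f}, 0.0f};
        int i = static_cast<int>(p.x / voxelSize);
        int j = static_cast<int>(p.y / voxelSize);
        int k = static_cast<int>(p.z / voxelSize);
        if (i < 0 || j < 0 || k < 0 || i >= nx || j >= ny || k >= nz)
            return empty;
        return cells[(static_cast<std::size_t>(k) * ny + j) * nx + i];
    }
};

// Sum penalty forces over all point-shell points that lie inside bone:
// each penetrating point pushes back along the local surface normal,
// scaled by its penetration depth and a stiffness constant.
Vec3 pointShellForce(const std::vector<Vec3>& pointShell,
                     const Voxmap& bone, float stiffness)
{
    Vec3 force{0.0f, 0.0f, 0.0f};
    for (const Vec3& p : pointShell) {
        const Voxel& v = bone.at(p);
        if (v.occupied)
            force = add(force, scale(v.normal, stiffness * v.depth));
    }
    return force;
}
```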
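The multi-rate idea of computing forces outside the haptic loop and interpolating between updates can likewise be sketched. In the hypothetical `MultiRateForce` class below, a slower simulation thread publishes each newly computed force and the high-rate (e.g. 1 kHz) haptic callback blends from the last rendered force toward it, so the device never feels an abrupt jump. This is an illustrative reading of a multi-rate scheme under stated assumptions, not the authors' implementation.

```cpp
#include <algorithm>
#include <mutex>

struct Force { float x = 0.0f, y = 0.0f, z = 0.0f; };

class MultiRateForce {
public:
    // Called from the low-rate simulation thread when a new force is ready.
    void publish(const Force& f) {
        std::lock_guard<std::mutex> lock(mutex_);
        start_  = current_;   // begin blending from the last rendered force
        target_ = f;
        alpha_  = 0.0f;
    }

    // Called once per haptic frame. `step` is the fraction of the low-rate
    // update interval covered by one haptic frame (haptic dt / sim dt).
    Force nextForce(float step) {
        std::lock_guard<std::mutex> lock(mutex_);
        alpha_ = std::min(1.0f, alpha_ + step);
        current_.x = start_.x + (target_.x - start_.x) * alpha_;
        current_.y = start_.y + (target_.y - start_.y) * alpha_;
        current_.z = start_.z + (target_.z - start_.z) * alpha_;
        return current_;
    }

private:
    std::mutex mutex_;
    Force start_, target_, current_;
    float alpha_ = 1.0f;   // 1.0 means the target force is fully reached
};
```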
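The bone erosion step can be pictured as decrementing voxel densities under the cutting tip, with the removal rate standing in for a tool's drilling capability. The sketch below is hypothetical (the paper does not publish this code); `cutRate`, `cutRadius`, and `threshold` are illustrative parameters, and voxels that reach zero density would in practice also be flagged so the 3D texture used for volume rendering is updated in real time.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct DensityVoxmap {
    int nx, ny, nz;
    float voxelSize;
    std::vector<float> density;   // nx * ny * nz values in [0, 1]

    float& at(int i, int j, int k) {
        return density[(static_cast<std::size_t>(k) * ny + j) * nx + i];
    }
};

// Reduce the density of every voxel inside the tool tip's cutting sphere.
// Voxels whose density drops below `threshold` count as removed bone.
void erodeBone(DensityVoxmap& bone, float tipX, float tipY, float tipZ,
               float cutRadius, float cutRate, float threshold = 0.05f)
{
    int r  = static_cast<int>(std::ceil(cutRadius / bone.voxelSize));
    int ci = static_cast<int>(tipX / bone.voxelSize);
    int cj = static_cast<int>(tipY / bone.voxelSize);
    int ck = static_cast<int>(tipZ / bone.voxelSize);

    for (int k = ck - r; k <= ck + r; ++k)
        for (int j = cj - r; j <= cj + r; ++j)
            for (int i = ci - r; i <= ci + r; ++i) {
                if (i < 0 || j < 0 || k < 0 ||
                    i >= bone.nx || j >= bone.ny || k >= bone.nz)
                    continue;
                float dx = (i + 0.5f) * bone.voxelSize - tipX;
                float dy = (j + 0.5f) * bone.voxelSize - tipY;
                float dz = (k + 0.5f) * bone.voxelSize - tipZ;
                if (dx * dx + dy * dy + dz * dz > cutRadius * cutRadius)
                    continue;                    // outside the cutting sphere
                float& d = bone.at(i, j, k);
                d = std::max(0.0f, d - cutRate); // tool-specific removal rate
                if (d < threshold) d = 0.0f;     // voxel fully removed
            }
}
```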