
A Variational Model for Interactive Shape Prior Segmentation and Real-Time Tracking

Manuel Werlberger, Thomas Pock, Markus Unger, Horst Bischof
Institute for Computer Graphics and Vision, Graz University of Technology
Scale Space and Variational Methods in Computer Vision, Lecture Notes in Computer Science, Volume 5567, 2009, pp. 200-211

@article{werlberger2009variational,
   title     = {A variational model for interactive shape prior segmentation and real-time tracking},
   author    = {Werlberger, M. and Pock, T. and Unger, M. and Bischof, H.},
   journal   = {Scale Space and Variational Methods in Computer Vision},
   pages     = {200--211},
   year      = {2009},
   publisher = {Springer}
}


In this paper, we introduce a semi-automated segmentation method based on minimizing the Geodesic Active Contour energy with an incorporated shape prior. The additional shape information, which represents the desired structure, increases the robustness of the segmentation result. Furthermore, the user can take corrective actions during the segmentation and adapt the position of the shape prior. Such interaction is often desirable when processing difficult data, as in medical applications. To facilitate user interaction, we add a shape deformation that allows the shape position to be changed both manually by the user and automatically, driven by underlying image features. Using a variational formulation, the optimization can be performed in a globally optimal manner for a fixed shape representation. To obtain real-time behavior, which is especially important for an interactive tool, the whole method is implemented on the GPU. Experiments are conducted on medical data as well as on video data and camera streams processed in real time. For the medical data, we compare our method with a segmentation done by an expert. The GPU-based binaries will be available online on our homepage.
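
For readers who want a concrete picture of the underlying model, the following is a minimal sketch of a convex relaxed Geodesic Active Contour energy combined with a shape term; the symbols and the exact form of the coupling are illustrative assumptions, not necessarily the precise formulation used in the paper:

\min_{u \in [0,1]} \int_\Omega g(x)\,|\nabla u(x)|\,dx \;+\; \lambda \int_\Omega |u(x) - S(x)|\,dx

Here u is the relaxed binary labeling, g is an edge-indicator function computed from the image I (e.g. g(x) = \exp(-\alpha\,|\nabla I(x)|)), S is the registered shape prior, and \lambda weights the prior against the image edges. For a fixed shape S such an energy is convex in u, which is consistent with the abstract's statement that the optimization can be carried out in a globally optimal manner for a fixed shape representation; thresholding the minimizer u then yields a binary segmentation.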
