ANTS2 package: simulation and experimental data processing for Anger camera type detectors

A. Morozov, V. Solovov, R. Martins, F. Neves, V. Domingos, V. Chepel
LIP-Coimbra, Department of Physics, University of Coimbra, Coimbra, Portugal
arXiv:1602.07247 [physics.ins-det], (23 Feb 2016)


ANTS2 is a simulation and data processing package developed for position-sensitive detectors with Anger camera type readout. The simulation module of ANTS2 is based on the ROOT package from CERN, which is used to store the detector geometry and to perform 3D navigation. The module is capable of simulating particle sources, performing particle tracking, generating photons of primary and secondary scintillation, tracing optical photons and generating photosensor signals. The reconstruction module features several position reconstruction methods based on statistical reconstruction algorithms (including GPU-based implementations), artificial neural networks and k-NN searches. The module can process simulated as well as imported experimental data containing photosensor signals. A custom library for B-spline parameterization is implemented, which can be used to calculate and parameterize the spatial response of the photosensors. The package includes a graphical user interface with an extensive set of configuration, visualization and analysis tools. ANTS2 is being developed with a focus on the iterative (adaptive) reconstruction of the detector response using flood-field irradiation data. The package is implemented in the C++ programming language and is a multiplatform, open-source project.
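To illustrate the kind of position reconstruction the package addresses, the sketch below shows the simplest estimator for an Anger camera: the center-of-gravity (classic Anger logic), where the event position is the signal-weighted mean of the photosensor positions. This is a minimal hypothetical example, not ANTS2 code; the `Sensor` struct and `angerCentroid` function are illustrative names, and ANTS2 itself provides more advanced statistical, neural-network and k-NN methods.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative sketch only: center-of-gravity (Anger logic) position
// estimate. Each photosensor contributes its position weighted by the
// signal it recorded; the normalized sum approximates the event position.
struct Sensor {
    double x, y;  // photosensor center in detector coordinates
};

std::pair<double, double> angerCentroid(const std::vector<Sensor>& sensors,
                                        const std::vector<double>& signals) {
    double sum = 0.0, sx = 0.0, sy = 0.0;
    for (std::size_t i = 0; i < sensors.size(); ++i) {
        sum += signals[i];
        sx  += signals[i] * sensors[i].x;
        sy  += signals[i] * sensors[i].y;
    }
    return {sx / sum, sy / sum};
}
```

In practice such a centroid estimate is biased near the detector edges, which is one motivation for the statistical and adaptive reconstruction methods (and the flood-field calibration) described above.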
