Visual system design for excavator simulator with deformable terrain

Tao Ni, Dingxuan Zhao, Shui Ni
Coll. of Mech. Sci. & Eng., Jilin Univ., Changchun, China
International Conference on Mechatronics and Automation, 2009. ICMA 2009


@inproceedings{ni2009visual,
   title={Visual system design for excavator simulator with deformable terrain},
   author={Ni, T. and Zhao, D. and Ni, S.},
   booktitle={Mechatronics and Automation, 2009. ICMA 2009. International Conference on},
   year={2009}
}








The visual system of an excavator simulator is developed in this paper for training human operators and evaluating control strategies for heavy-duty hydraulic machines. In this system, the operator controls a virtual excavator by means of a joystick while experiencing realistic operating sensations through force feedback, graphical displays, and sound effects in a virtual operating environment. The Real-time Optimally Adapting Meshes (ROAM) algorithm is applied to generate and update the dynamic terrain mesh. Since the terrain mesh in the operating field needs a higher level of detail than the unexcavated region, dynamic resolution is used in the excavated area to extend the hierarchy of the terrain mesh as terrain deformation takes place. A GPU (graphics processing unit)-based terrain-rendering method is adopted to create a realistic visual appearance for the different operation regions of the terrain, such as excavated areas and dumped-soil areas, during the excavator simulation. The soil swept into the excavator's bucket is treated as a large number of particles, and a particle system manages these soil particles and produces the dynamic visual effect of the excavator dumping soil. The graphic simulation of the excavator runs on a PC at a satisfactory frame rate of 60 frames per second, which provides a low-cost solution not only for excavator simulators but also for simulators of other construction vehicles.
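The particle-system treatment of dumped soil described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class names, the two-second particle lifetime, the random emission velocities, and the z-up ground test at z = 0 (where a landed particle would be merged into the terrain heightmap) are all assumptions.

```python
import random
from dataclasses import dataclass

@dataclass
class Particle:
    pos: list    # [x, y, z] world position
    vel: list    # [vx, vy, vz] velocity
    life: float  # remaining lifetime in seconds

class SoilParticleSystem:
    """Manages soil particles emitted when the bucket dumps its load."""

    def __init__(self, gravity=-9.81):
        self.gravity = gravity
        self.particles = []

    def emit(self, origin, count, speed=1.0):
        """Spawn `count` particles at the bucket lip with randomized velocities."""
        for _ in range(count):
            vel = [random.uniform(-0.5, 0.5) * speed,
                   random.uniform(-0.5, 0.5) * speed,
                   random.uniform(0.0, 0.2) * speed]
            self.particles.append(Particle(list(origin), vel, life=2.0))

    def update(self, dt):
        """Advance all particles one time step under gravity."""
        for p in self.particles:
            p.vel[2] += self.gravity * dt          # gravity acts along z
            for i in range(3):
                p.pos[i] += p.vel[i] * dt
            p.life -= dt
            if p.pos[2] <= 0.0:
                # particle has landed; in the simulator this is where it
                # would be folded back into the deformable terrain mesh
                p.life = 0.0
        # discard dead particles so the pool stays small
        self.particles = [p for p in self.particles if p.life > 0.0]
```

In a per-frame loop, `emit` would be called while the bucket tips past its dump angle, and `update` once per rendered frame; the surviving particles are then drawn as small soil-textured billboards.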
