Fractal Based Method on Hardware Acceleration for Natural Environments

Divya Udayan J, HyungSeok Kim, Jun Lee, Jee-In Kim
Department of Internet & Multimedia Engineering, Konkuk University, Republic of Korea
Journal of Convergence, Vol.4, No.3, 2013

@article{kim2013fractal,
   title={Fractal Based Method on Hardware Acceleration for Natural Environments},
   author={Udayan J, Divya and Kim, HyungSeok and Lee, Jun and Kim, Jee-In},
   journal={Journal of Convergence},
   volume={4},
   number={3},
   year={2013}
}

Natural scenes from the real world are highly complex: modeling and rendering natural shapes such as mountains, trees, and clouds is difficult, time consuming, and memory intensive. A critical characteristic of natural scenes is their self-similarity. Motivated by the self-similarity of the natural scenes that surround us, we present a hardware-accelerated, fractal-based rendering method for natural environments. To illustrate the difficulty classical geometry has in dealing with natural objects, we consider a basic fractal example, the Mandelbrot set, which is a 2D structure. We examine the serial algorithm for this set and devise a parallel algorithm for implementation on a massively parallel graphics processing unit (GPU) using the Compute Unified Device Architecture (CUDA) programming model. We also consider the modeling of 3D fractals such as terrains. Performance is evaluated in terms of execution time: the parallel implementation of the method on a GeForce GTX 650 GPU is on average 2X faster than its sequential implementation. The running behavior of the system at various system states is also evaluated to support our approach.
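The abstract refers to a serial escape-time algorithm for the Mandelbrot set that is then parallelized with CUDA. The paper's own code is not reproduced here, but the standard escape-time computation it describes can be sketched as follows (a minimal Python illustration; function names and parameters are our own, not from the paper). In the CUDA version, the body of the per-pixel loop maps naturally to one GPU thread per pixel, which is what makes the algorithm a good fit for massively parallel hardware.

```python
def mandelbrot_escape(cx, cy, max_iter=100):
    # Escape-time test for one point c = cx + i*cy.
    # Iterate z <- z^2 + c; return the iteration at which |z| > 2
    # (|z|^2 > 4), or max_iter if the point never escapes and is
    # therefore treated as a member of the Mandelbrot set.
    zx, zy = 0.0, 0.0
    for i in range(max_iter):
        if zx * zx + zy * zy > 4.0:
            return i
        zx, zy = zx * zx - zy * zy + cx, 2.0 * zx * zy + cy
    return max_iter

def mandelbrot_grid(width, height, x0=-2.5, x1=1.0,
                    y0=-1.25, y1=1.25, max_iter=100):
    # Serial version: a nested loop over all pixels of the image.
    # A CUDA kernel would replace this double loop by launching one
    # thread per (px, py) pixel, each computing mandelbrot_escape
    # for its own coordinate independently of all other threads.
    return [[mandelbrot_escape(x0 + (x1 - x0) * px / (width - 1),
                               y0 + (y1 - y0) * py / (height - 1),
                               max_iter)
             for px in range(width)]
            for py in range(height)]
```

Because each pixel's iteration count depends only on its own coordinate, there are no data dependencies between pixels, which is why the parallel GPU implementation can outperform the sequential one.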
