Belief Propagation by Message Passing in Junction Trees: Computing Each Message Faster Using GPU Parallelization

Lu Zheng, Ole J. Mengshoel, Jike Chong
Carnegie Mellon University
27th Conference on Uncertainty in Artificial Intelligence (UAI-11), 2011

@inproceedings{zheng11belief,

   author={Zheng, L. and Mengshoel, O. J. and Chong, J.},

   title={Belief Propagation by Message Passing in Junction Trees: Computing Each Message Faster Using GPU Parallelization},

   booktitle={Proc. of the 27th Conference on Uncertainty in Artificial Intelligence (UAI-11)},

   address={Barcelona, Spain},

   year={2011}

}

Compiling Bayesian networks (BNs) to junction trees and performing belief propagation over them is among the most prominent approaches to computing posteriors in BNs. However, belief propagation over junction trees is known to be computationally intensive in the general case; its complexity may increase dramatically with the connectivity and state space cardinality of the Bayesian network nodes. In this paper, we address this computational challenge using GPU parallelization. We develop data structures and algorithms that extend existing junction tree techniques, and in particular a novel approach to computing each belief propagation message in parallel. We implement our approach on an NVIDIA GPU and test it using BNs from several applications. Experimentally, we study how junction tree parameters affect parallelization opportunities and hence the performance of our algorithm. We achieve speedups ranging from 0.68 to 9.18 for the BNs studied.
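The core operation the abstract refers to, forming a message by marginalizing a clique potential down to the variables of the separator shared with a neighboring clique, can be sketched in plain Python. This is a sequential illustration, not the authors' CUDA implementation; all function and variable names are illustrative. The point of the sketch is that each separator entry is an independent sum, which is the per-message parallelism the paper exploits on the GPU.

```python
from itertools import product

def marginalize_message(potential, clique_vars, sep_vars):
    """Sum a clique potential table down to the separator variables.

    potential   : dict mapping a tuple of clique-variable states -> float
    clique_vars : ordered list of variable names in the clique
    sep_vars    : subset of clique_vars shared with the neighboring clique

    Each separator assignment's sum is independent of the others, so on a
    GPU one thread (or thread block) can own one separator entry.
    """
    sep_idx = [clique_vars.index(v) for v in sep_vars]
    message = {}
    for assignment, value in potential.items():
        key = tuple(assignment[i] for i in sep_idx)
        message[key] = message.get(key, 0.0) + value
    return message

# Toy example: a clique over two binary variables (A, B), separator {B}.
phi = {(a, b): 0.1 * (1 + a + 2 * b) for a, b in product(range(2), repeat=2)}
msg = marginalize_message(phi, ["A", "B"], ["B"])
# msg[(b,)] sums phi over A for each state b of B.
```

In the paper's setting the potential tables are flat arrays indexed by mixed-radix codes rather than dictionaries, which is what makes the independent per-entry sums map cleanly onto GPU threads.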
