Learning and logical reasoning are two notable functionalities of our intelligence, and PRISM has been developed as a programming language for implementing such intelligent functionalities. It is a logic programming language augmented with an EM learning capability for statistical parameters embedded in programs. It not only encompasses known symbolic-statistical formalisms, from Bayesian networks to hidden Markov models to probabilistic CFGs, but also subsumes the specific EM learning algorithms proposed in each area. In this paper, we raise the issue of approximate probability computation in the hope of mitigating some limitations of the PRISM formalism. We pay particular attention to "loopy BP," which stands at the crossroads of Bayesian networks, statistical physics, and error-correction decoding. It approximates intractable probability computations by asynchronous message passing. We re-examine its theoretical background, the Kikuchi approximation, and propose "cluster BP," a mild generalization of loopy BP that realizes better yet simple Kikuchi approximations. We also propose "cluster CCCP," a convergent variant of cluster BP that obtains a local minimum of the Kikuchi free energy.
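To make the message-passing idea concrete, the following is a minimal sketch of loopy BP (sum-product) on a small pairwise Markov random field. The 3-node cycle, the potentials, and all variable names are illustrative assumptions for this sketch, not taken from the paper; on a graph with cycles, the resulting beliefs are only approximations to the true marginals.

```python
import numpy as np

# Illustrative 3-node cycle of binary variables (an assumed toy model).
edges = [(0, 1), (1, 2), (2, 0)]
n = 3
# Symmetric pairwise potential favoring agreement between neighbors.
psi = np.array([[2.0, 1.0],
                [1.0, 2.0]])
# Unary potentials (arbitrary illustrative values).
phi = [np.array([1.5, 0.5]), np.array([1.0, 1.0]), np.array([0.8, 1.2])]

adj = {i: [] for i in range(n)}
for i, j in edges:
    adj[i].append(j)
    adj[j].append(i)

# msgs[(i, j)] is the message from node i to node j, initialized uniform.
msgs = {}
for i, j in edges:
    msgs[(i, j)] = np.ones(2) / 2
    msgs[(j, i)] = np.ones(2) / 2

# Asynchronous sweeps: each message is updated in place using the
# current values of the others, as in loopy BP.
for _ in range(50):
    for i, j in list(msgs):
        # Product of unary potential and incoming messages, except from j.
        prod = phi[i].copy()
        for k in adj[i]:
            if k != j:
                prod = prod * msgs[(k, i)]
        # Marginalize out x_i (psi is symmetric, so psi @ prod suffices).
        m = psi @ prod
        msgs[(i, j)] = m / m.sum()

def belief(i):
    """Approximate marginal of node i from its incoming messages."""
    b = phi[i].copy()
    for k in adj[i]:
        b = b * msgs[(k, i)]
    return b / b.sum()
```

After the sweeps converge, `belief(i)` gives a normalized approximate marginal for node `i`; on a tree this would be exact, while on the cycle above it is the kind of approximation that the Kikuchi analysis and cluster BP aim to improve.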