Bayesian Hypothesis Test using Nonparametric Belief Propagation (BHT-BP) algorithm (final update: Dec. 2014)

* Related publications

[1] Jaewook Kang, Heung-No Lee, and Kiseon Kim, "Bayesian Hypothesis Test using Nonparametric Belief Propagation for Noisy Sparse Recovery," IEEE Trans. Signal Process., vol. 63, no. 4, pp. 935-948, Feb. 2015. (preprint PDF, IEEE Xplore)

[2] Jaewook Kang, Heung-No Lee, and Kiseon Kim, "Bayesian Hypothesis Test for Sparse Support Recovery using Belief Propagation," in Proc. IEEE Statistical Signal Processing Workshop (SSP), Ann Arbor, MI, pp. 45-48, Aug. 2012. (IEEE Xplore or preprint)

* Short Introduction to BHT-BP algorithm

To solve the noisy sparse recovery problem described above, we proposed an algorithm called Bayesian hypothesis test using nonparametric belief propagation (BHT-BP).

As shown in Fig. 1, CS recovery via BHT-BP takes a joint detection-and-estimation structure.

Fig. 1: Diagrammatic representation of CS recovery via BHT-BP

Fig. 2: nBP iteration for obtaining the signal posterior distribution

The novelty of the BHT-BP algorithm mainly lies in the following four properties (a simplified sketch of the joint structure is given after this list):

1) Providing robust support detection against additive noise, based on the criterion of minimum detection error probability,

2) Removing the MSE degradation caused by message sampling in uniform-sampling-based nBP, by means of the joint detection-and-estimation structure,

3) Handling sparse signals whose minimum nonzero magnitude is regulated by the parameter xmin, and proposing a signal prior PDF for such signals,

4) Providing fast sparse reconstruction with recovery complexity O(N log N + KM), where N is the signal length, M is the number of measurements, and K is the signal sparsity.
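To make the joint structure concrete, below is a minimal Python/NumPy sketch of the detection-then-estimation flow. It is NOT the released BHT-BP source code: the nBP stage of Fig. 2 is replaced by a crude matched-filter statistic, and the priors and parameters (q, sigma_n, sigma_x, x_min) are illustrative assumptions. It only shows how a per-element Bayesian hypothesis test (the minimum-detection-error, i.e. MAP, rule) selects the support, after which the nonzero values are estimated on the detected support so that no message-sampling grid limits them.

# Minimal conceptual sketch (Python/NumPy); not the authors' implementation.
# Stage 1 stands in for the nBP iterations with a simple matched-filter
# statistic, Stage 2 runs a per-element binary hypothesis test (MAP rule,
# i.e., minimum detection error probability), and Stage 3 estimates the
# nonzero values on the detected support only.
import numpy as np

def joint_detect_estimate(Phi, y, q=0.03, sigma_n=0.05, sigma_x=1.5):
    """Toy joint detection-and-estimation recovery (illustrative only)."""
    N = Phi.shape[1]

    # Stage 1 (stand-in for nBP): normalized matched-filter statistic per element.
    col_norm = np.linalg.norm(Phi, axis=0) + 1e-12
    z = (Phi.T @ y) / col_norm

    # Stage 2: Bayesian hypothesis test on each support element.
    # Crude Gaussian models (assumptions): under H0, z_i carries noise plus
    # residual interference (variance roughly mean(y^2) for unit-norm columns,
    # floored at the noise level); under H1 the element's own value adds
    # roughly sigma_x^2 of variance.
    v0 = max(np.mean(y ** 2), sigma_n ** 2)
    v1 = v0 + sigma_x ** 2
    llr = 0.5 * np.log(v0 / v1) + 0.5 * (z ** 2) * (1.0 / v0 - 1.0 / v1)
    support = (llr + np.log(q / (1.0 - q))) > 0.0  # decide H1: minimum-error rule

    # Stage 3: value estimation restricted to the detected support, so the
    # final values are not limited by any message-sampling grid.
    x_hat = np.zeros(N)
    if support.any():
        x_hat[support], *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    return x_hat, support

# Toy usage: a K-sparse signal whose nonzero magnitudes are at least x_min.
rng = np.random.default_rng(0)
M, N, K, x_min = 96, 128, 4, 1.0
Phi = rng.standard_normal((M, N)) / np.sqrt(M)      # roughly unit-norm columns
x = np.zeros(N)
idx = rng.choice(N, K, replace=False)
x[idx] = (x_min + rng.random(K)) * rng.choice([-1.0, 1.0], K)
y = Phi @ x + 0.05 * rng.standard_normal(M)         # noisy CS measurements
x_hat, support = joint_detect_estimate(Phi, y)
print("true support    :", np.sort(idx))
print("detected support:", np.flatnonzero(support))

In the actual algorithm, the per-element statistics come from the nBP iterations of Fig. 2, and the hypothesis test and signal prior are those derived in [1].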

The algorithms compared in the experimental validation of Fig. 3 and Fig. 4 are those listed under "Source codes download" below.

* Support

This work was supported by the Do-Yak Research Program (No. 2013-035295) and the Leading Foreign Research Institute Recruitment Program (2009-00422) through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT and Future Planning (MSIP).

* Source codes download

1) BHT-BP and CS-BP (BHT-BP Copyright (C) 2012 Jaewook Kang, Heung-No Lee, and Kiseon Kim)

- Version 1 @ Jan. 2015 (initial version)

- Version 2 @ May 2015 (adds the experimental setups for the figures in the corresponding paper)

- Feedback on the source codes is very welcome. Please send your comments to jwkkang(at)gist.ac.kr.

2) BCS (obtained from http://people.ee.duke.edu/~lcarin/BCS.html)

3) SuPrEM (obtained from http://people.fas.harvard.edu/~akcakaya/suprem.html)

4) L1-DS (the Dantzig selector) (obtained from the L1-magic package)

* Selected References

[Baron'10] D. Baron, S. Sarvotham, and R. Baraniuk, “Bayesian compressive sensing via belief propagation,” IEEE Trans. Signal Process., vol. 58, no. 1, pp. 269-280, Jan. 2010.

[Candes'07] E. Candes and T. Tao, “The Dantzig selector: statistical estimation when p is much larger than n,” Ann. Statist., vol. 35, no. 6, pp. 2313-2351, 2007.

[Ji'08] S. Ji, Y. Xue, and L. Carin, "Bayesian compressive sensing," IEEE Trans. Signal Process., vol. 56, no. 6, pp. 2346-2356, June 2008.

[Akcakaya'11] M. Akcakaya, J. Park, and V. Tarokh, "A coding theory approach to noisy compressive sensing using low density frames," IEEE Trans. Signal Process., vol. 59, no. 12, pp. 5369-5379, Nov. 2011.